
N-best Reranking by Multitask Learning

Kevin Duh, Katsuhito Sudoh, Hajime Tsukada, Hideki Isozaki, Masaaki Nagata
http://aclweb.org/anthology-new/P/P10/P10-1160.pdf
Kevin's slides
5th ACL Workshop on Statistical Machine Translation (WMT) 2010

Suggestions for the presenter

It would be great to have an illustrative but simple example of an N-best list, as well as examples of features and of labels (to pin down the terminology).
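To make the terminology concrete, here is a minimal Python sketch of what an N-best list with features and labels could look like. The hypotheses, feature names, and scores are invented for illustration, and the BLEU-based labeling is just one common way to derive labels for reranking, not necessarily the paper's exact scheme.

```python
# A toy N-best list for one source sentence: each hypothesis carries
# a sparse feature vector and a sentence-level quality score.
# All hypotheses, feature names, and values are invented.

nbest = [
    # (hypothesis, sparse features, sentence-level BLEU)
    ("the cat sat on the mat", {"lm_score": -4.2, "tm_score": -2.1, "word_penalty": 6}, 0.58),
    ("a cat sat on the mat",   {"lm_score": -4.9, "tm_score": -2.0, "word_penalty": 6}, 0.51),
    ("the cat is on the mat",  {"lm_score": -5.3, "tm_score": -2.8, "word_penalty": 6}, 0.37),
]

# Labels for reranking: mark the best hypothesis (by BLEU) as positive,
# the rest as negative -- one way to turn an N-best list into a
# binary classification / ranking problem.
best_bleu = max(bleu for _, _, bleu in nbest)
labeled = [(feats, +1 if bleu == best_bleu else -1) for _, feats, bleu in nbest]

for feats, label in labeled:
    print(label, feats)
```

With labels in place, reranking reduces to learning a weight vector over the features and picking the highest-scoring hypothesis from each list.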

Comments

Opinions on the paper

TODO: suggestions to resolve / comment on

The research group suggested that they extract only those features that have a nonzero weight in any of the weight vectors W.
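A minimal sketch of that suggestion, assuming the per-task weight vectors W are stored as sparse dicts; the feature names and weight values are invented.

```python
# Keep a feature if it receives a nonzero weight in *any* of the
# per-task weight vectors W (i.e., take the union of the tasks'
# supports). Weight vectors below are invented for illustration.

W = [
    {"lm_score": 0.9, "tm_score": 0.0,  "rare_word": 0.3},  # task 1
    {"lm_score": 1.1, "tm_score": -0.2, "rare_word": 0.0},  # task 2
]

selected = {feat for w in W for feat, weight in w.items() if weight != 0.0}
print(sorted(selected))  # -> ['lm_score', 'rare_word', 'tm_score']
```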

Comments by M. Popel:
Feature pruning using a threshold: when data are limited, according to this work it is better to select good features than to prune with a threshold.
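For contrast, the conventional threshold pruning being argued against could look like this sketch; the counts and the cutoff are invented.

```python
# Baseline pruning: drop any feature whose training-corpus count falls
# below a fixed threshold. The point above is that a rare feature can
# still be useful, and choosing the cutoff well is hard.

feature_counts = {"lm_score": 500, "tm_score": 480, "rare_word": 3}
THRESHOLD = 5  # an arbitrary cutoff, invented for illustration

kept = {f for f, c in feature_counts.items() if c >= THRESHOLD}
print(kept)  # 'rare_word' is pruned even though it might be useful
```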

We were arguing about the number of features used in the feature sets. It is unlikely that they could somehow arrive at a fixed number of selected features.
(I suppose it is just the number of input features; whether those features were really all used is not clear.)

Every feature fires only on the sentences where its conditions are met.
Example: 500 sentences, each with just one N-best list; that means 500 weight vectors (one per sentence-level task).
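A sketch of that bookkeeping, treating each sentence's N-best list as one task with its own weight vector. The feature-space size is invented, and the multitask learning step itself (which couples the vectors through a shared regularizer) is omitted.

```python
# One task per sentence: 500 sentences, each with its own N-best list,
# gives 500 weight vectors. Only the structure is set up here; the
# multitask training that ties the vectors together is not shown.

NUM_SENTENCES = 500
NUM_FEATURES = 10  # invented feature-space size for illustration

# One weight vector per task, all initialized to zero.
W = [[0.0] * NUM_FEATURES for _ in range(NUM_SENTENCES)]
print(len(W))  # -> 500 weight vectors, one per sentence / N-best list
```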

We argued about hashing the features together: in what way exactly are they hashed?
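One plausible reading, not confirmed by the paper itself, is the standard hashing trick, where feature names are mapped into a fixed number of buckets so that several features can end up sharing ("hashed together" into) one weight. A sketch under that assumption; the bucket count and feature names are invented.

```python
import hashlib

NUM_BUCKETS = 2 ** 10  # fixed hash space; collisions merge features

def hash_feature(name: str) -> int:
    """Map a feature name to a bucket index deterministically."""
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_BUCKETS

# Two distinct features may land in the same bucket and thus share a weight.
for feat in ("lm_score", "tm_score", "rare_word"):
    print(feat, "->", hash_feature(feat))
```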

