1. Of the three main types of word representations described in the paper, to which type does each of the following two samples belong? (A short parsing sketch follows the samples.)
a)
   dog -0.087099201783 -0.136966257697 0.106813367913 [47 more numbers]
   cat -0.103287428163 -0.0066971301398 -0.0346911076188 [47 more numbers]
b)
   dog 11010111010
   cat 11010111010
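
The two samples above differ visibly in format: (a) pairs each word with a list of real numbers, while (b) pairs each word with a single binary string. Below is a minimal parsing sketch in Python; the file names and the helper function are illustrative assumptions, not anything taken from the paper.

   import numpy as np

   def load_representations(path):
       """Parse a word-representation file: one word per line, followed by
       either a real-valued vector or a single bit string."""
       table = {}
       with open(path, encoding="utf-8") as f:
           for line in f:
               if not line.strip():
                   continue
               word, *fields = line.split()
               if len(fields) == 1 and set(fields[0]) <= {"0", "1"}:
                   table[word] = fields[0]                       # bit string, as in sample (b)
               else:
                   table[word] = np.array(fields, dtype=float)   # dense vector, as in sample (a)
       return table

   # Hypothetical file names, just for illustration:
   # reps_a = load_representations("sample_a.txt")   # 50-dimensional real vectors
   # reps_b = load_representations("sample_b.txt")   # binary strings such as 11010111010
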
2. Section 4.1 defines a corrupted (or noise) n-gram, but there is a tiny error/typo in the definition. Nitpick and point it out.
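
For orientation only (this is not a quote of the paper's definition, which is exactly what the question asks you to scrutinize): in this style of training, a noise n-gram is typically produced by taking an observed n-gram and replacing one of its words with a word drawn at random from the vocabulary. The sketch below illustrates that idea; the function name and the choice of which position to replace are assumptions for the example, not taken from Section 4.1.

   import random

   def corrupt_ngram(ngram, vocabulary, rng=random):
       """Return a noise n-gram: copy the observed n-gram and replace its
       last word with a word sampled uniformly from the vocabulary
       (the replaced position here is an illustrative choice)."""
       noise = list(ngram)
       noise[-1] = rng.choice(vocabulary)
       return tuple(noise)

   # Hypothetical example:
   # corrupt_ngram(("the", "cat", "sat"), ["dog", "mat", "ran", "blue"])
   # might return ("the", "cat", "mat")
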
  
3. Section 7.4 states that "word representations in NER brought larger gains on the out-of-domain data than on the in-domain data." Try to guess what the reason is.
