
Institute of Formal and Applied Linguistics Wiki



You should focus on the first paper (skip section 2.3).
The second paper, an extension of the first one, is optional reading.
  
Q1.
a) Recall the paper about word representations presented by Tam on November 10.
Read http://www.quora.com/Whats-the-difference-between-distributed-and-distributional-semantic-representations

(M_{w,d} is a matrix with w rows and d columns.)
What do w, d, and k mean?
What are the values of w, d, and k used in the experiments in this paper?

b) What is the maximum dimension of a word vector in the distributional representation approach?
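
A toy sketch (not from either paper) of how a distributional representation is built by counting contexts; the corpus, window size, and counts below are invented for illustration. It also hints at b): each word vector has one dimension per distinct context word, so its dimension is bounded by the size of the context vocabulary.

    # Toy corpus; every value here is invented for illustration only.
    from collections import Counter, defaultdict

    corpus = [
        "the moon shines at night",
        "mars is a planet",
        "the full moon casts a shadow",
    ]

    window = 2  # symmetric context window
    cooc = defaultdict(Counter)
    for sentence in corpus:
        tokens = sentence.split()
        for i, word in enumerate(tokens):
            lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    cooc[word][tokens[j]] += 1

    contexts = sorted({t for s in corpus for t in s.split()})
    # The distributional vector for "moon": one count per context word,
    # so its dimension can never exceed the number of distinct contexts.
    moon_vector = [cooc["moon"][c] for c in contexts]
    print(len(contexts), moon_vector)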
Q2.
a) Compute the similarity between the two words "Moon" and "Mars" from the co-occurrence matrix below.
Use the raw counts (no Local Mutual Information, no normalization) and cosine similarity.
  
         | planet | night | full | shadow | shine
    Mars |   44   |   23  |  17  |        |   9
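
A minimal sketch of the computation asked for in a): cosine similarity over raw count vectors. The Moon row is not shown in this excerpt, so its counts below are placeholders, and the empty Mars shadow cell is taken as 0; substitute the real values from the table.

    import math

    def cosine(u, v):
        # cosine(u, v) = dot(u, v) / (||u|| * ||v||)
        dot = sum(a * b for a, b in zip(u, v))
        return dot / (math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v)))

    # Context order: planet, night, full, shadow, shine
    moon = [10, 22, 43, 61, 24]  # placeholder counts -- the Moon row is not shown above
    mars = [44, 23, 17, 0, 9]    # Mars row; the missing shadow cell is assumed to be 0
    print(cosine(moon, mars))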
          
b) How do they deal with the high dimensionality of the vectors in these papers?
Can you suggest some other techniques to manage vector dimensionality?
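
One common technique is a truncated SVD, which projects the w x d count matrix onto its top k singular directions; whether this matches the papers' exact choice is part of the question. The matrix below is random and the sizes are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)
    M = rng.poisson(1.0, size=(500, 2000)).astype(float)  # fake w x d count matrix

    k = 50  # target dimensionality (illustrative)
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    M_k = U[:, :k] * s[:k]  # each row is now a k-dimensional word vector
    print(M.shape, "->", M_k.shape)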
            
Q3.
a) What are Bag of Words (BOW) and Bag of Visual Words (BOVW)? Are they synonyms?
b) How do they apply BOVW to compute the representation of a word (concept) from a large set of images?
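
A rough sketch of a typical bag-of-visual-words pipeline for b), assuming local descriptors (e.g., 128-dimensional SIFT) have already been extracted from each image; here they are random arrays, and the vocabulary size is far smaller than in practice.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Hypothetical input: one (n_descriptors, 128) array of local descriptors per image.
    descriptors_per_image = [rng.normal(size=(40, 128)) for _ in range(20)]

    # 1) Visual vocabulary: cluster all descriptors; each centroid is one "visual word".
    k = 16  # illustrative; real pipelines use thousands of visual words
    kmeans = KMeans(n_clusters=k, n_init=10, random_state=0)
    kmeans.fit(np.vstack(descriptors_per_image))

    # 2) Each image becomes a histogram of visual-word counts.
    def bovw_histogram(descriptors):
        return np.bincount(kmeans.predict(descriptors), minlength=k)

    # 3) A word/concept vector: pool (sum or average) the histograms of the images
    #    associated with that word, e.g., all images tagged "moon".
    word_vector = sum(bovw_histogram(d) for d in descriptors_per_image)
    print(word_vector)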
Q4.
When they construct the text-based vectors of words from the DM model,
they mention the Local Mutual Information score (section 3.2; also section 2.1 in the 2nd paper).
What is that score? Why did they use it?
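
A minimal numeric sketch of Local Mutual Information as it is usually defined: the raw co-occurrence count times the pointwise mutual information of the pair. All counts below are invented.

    import math

    count_wc = 50        # co-occurrences of word w with context c (invented)
    count_w = 1_000      # total count of w (invented)
    count_c = 2_000      # total count of c (invented)
    total = 1_000_000    # total number of co-occurrence tokens (invented)

    p_wc, p_w, p_c = count_wc / total, count_w / total, count_c / total
    pmi = math.log(p_wc / (p_w * p_c))  # pointwise mutual information
    lmi = count_wc * pmi                # weighting by the count damps unreliable rare pairs
    print(round(pmi, 3), round(lmi, 3))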
  
Q5.
Have you ever wished to see beautiful "Mermaids"?
Have you ever seen "Unicorns" in real life?
Assume that there are no photos of them on the Internet.

Can you think of a computational way to show what they look like?
  
  
