
Institute of Formal and Applied Linguistics Wiki



courses:rg:2014:mdsm (last modified 2014/11/29 13:52 by popel)
You should focus on the first paper (you can skip section 2.3): [[http://www.aclweb.org/anthology/W11-2503.pdf|Distributional semantics from text and images]].
The second paper, [[http://www.aclweb.org/anthology/P12-1015.pdf|Distributional Semantics in Technicolor]], an extension of the first one, is optional reading.
  
  
Q1.
Recall the paper about word representations presented by Tam on November 10.
Read http://www.quora.com/Whats-the-difference-between-distributed-and-distributional-semantic-representations
  
(M_{w,d} is a matrix with w rows and d columns.)
What do w, d, and k mean?
What are the values of w, d, and k used in the experiments in this paper?
  
Q2.
a) Compute the similarity between the two words "Moon" and "Mars" from the co-occurrence matrix below.
Use the raw counts (no Local Mutual Information, no normalization) and cosine similarity.
  
         | planet | night | full | shadow | shine
    Moon |   34   |   27  |  19  |        |   20
    Sun  |   32   |   23  |  10  |   47   |   15
    Dog  |        |   19  |   2  |   11   |    1
    Mars |   44   |   23  |  17  |        |    9
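As a sanity check for a), cosine similarity over the raw count vectors takes only a few lines. This is a sketch, not part of the papers; the shadow cells for Moon and Mars are blank in this copy of the table, so they are treated as 0 here (an assumption).

```python
import math

# Raw co-occurrence counts from the table above, in the column order
# planet, night, full, shadow, shine. The blank shadow cells are
# treated as 0 (assumption: those counts are missing in this copy).
moon = [34, 27, 19, 0, 20]
mars = [44, 23, 17, 0, 9]

def cosine(u, v):
    """Cosine similarity: dot(u, v) / (|u| * |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

print(cosine(moon, mars))  # ~0.957: the raw-count vectors are very close
```

Since both missing cells are in the same (shadow) column, treating them as 0 gives the same result as dropping that dimension entirely.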
b) How do they deal with the high dimensionality of the vectors in these papers?
Can you suggest some (other) techniques for preprocessing high-dimensional vectors?
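One standard technique for b), not necessarily the one used in the papers, is truncated SVD: keep only the top-k singular components of the word-by-context count matrix. A minimal numpy sketch (the matrix values and the choice k=2 are made up for illustration):

```python
import numpy as np

# Toy word-by-context count matrix (made-up numbers, 4 words x 5 contexts).
M = np.array([
    [34, 27, 19,  0, 20],
    [32, 23, 10, 47, 15],
    [ 0, 19,  2, 11,  1],
    [44, 23, 17,  0,  9],
], dtype=float)

k = 2  # target dimensionality (arbitrary choice for this sketch)

# Truncated SVD: M ~ U_k * diag(S_k) * Vt_k.
# The rows of U_k * diag(S_k) are the k-dimensional word vectors.
U, S, Vt = np.linalg.svd(M, full_matrices=False)
reduced = U[:, :k] * S[:k]

print(reduced.shape)  # (4, 2): each word is now a 2-dimensional vector
```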
Q3.
a) What are Bag of Words (BOW) and Bag of Visual Words (BOVW)?
b) How do they apply BOVW to compute the representation of a word (concept) from a large set of images?
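The BOVW step in b) can be sketched as: quantize each local image descriptor against a fixed "visual vocabulary" of centroids and count how often each visual word fires. Everything below (vocabulary size, descriptor dimensionality, the random data) is made up for illustration; in practice the vocabulary comes from k-means over descriptors such as SIFT.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up visual vocabulary: 5 centroids in an 8-dim descriptor space.
vocabulary = rng.normal(size=(5, 8))

# Made-up local descriptors extracted from one image.
descriptors = rng.normal(size=(30, 8))

# Assign each descriptor to its nearest visual word (Euclidean distance).
dists = np.linalg.norm(descriptors[:, None, :] - vocabulary[None, :, :], axis=2)
words = dists.argmin(axis=1)

# The image's BOVW representation is the histogram of visual-word counts.
bovw = np.bincount(words, minlength=len(vocabulary))
print(bovw)  # 5 counts, one per visual word, summing to 30 descriptors
```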
Q4 (bonus).
When they construct text-based vectors of words with the DM model,
they mention the Local Mutual Information score (section 3.2; also section 2.1 in the 2nd paper).
What is that score? Why did they use it?
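For reference, Local Mutual Information is the raw co-occurrence count multiplied by the log pointwise mutual information, LMI(w, c) = count(w, c) * log(P(w, c) / (P(w) * P(c))); the count factor keeps rare, unreliable pairs from dominating the way plain PMI does. A toy sketch (the counts and the marginal estimates are made up for illustration):

```python
import math

# Made-up co-occurrence counts for (word, context) pairs.
counts = {("moon", "planet"): 34, ("moon", "night"): 27,
          ("sun",  "planet"): 32, ("sun",  "shine"): 15}

# Marginals estimated from this tiny table (an assumption of the sketch).
N = sum(counts.values())
word_totals, ctx_totals = {}, {}
for (w, c), n in counts.items():
    word_totals[w] = word_totals.get(w, 0) + n
    ctx_totals[c] = ctx_totals.get(c, 0) + n

def lmi(w, c):
    """Local Mutual Information: count * log(P(w,c) / (P(w) * P(c)))."""
    p_wc = counts[(w, c)] / N
    p_w = word_totals[w] / N
    p_c = ctx_totals[c] / N
    return counts[(w, c)] * math.log(p_wc / (p_w * p_c))

print(lmi("moon", "night"))  # positive: "night" is characteristic of "moon"
```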
Q5 (bonus).
Have you ever wished to see beautiful "Mermaids"?
Have you ever seen "Unicorns" in real life?
Assume that there are no photos of them on the Internet.
Can you think of a computational way to show what they look like?
  
  
