You should focus on the first paper (you can skip section 2.3): [[http://
The second paper [[http://
Q1.
Recall the paper about word representations.
Read http://

(M_{w,d} is a matrix with w rows and d columns.)
What do w, d and k mean?
What are the values of w, d and k?
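The formula this question refers to is cut off in the page source; assuming it is a low-rank factorization such as truncated SVD (an assumption, not quoted from the linked page), it would look like

M_{w,d} ≈ U_{w,k} · S_{k,k} · (V_{d,k})^T

where w is the number of words (rows), d the number of original context dimensions (columns), and k the much smaller number of latent dimensions kept after truncation.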
Q2.
a) Compute the similarity between two words "…" and "…".
Use these raw counts (no Local Mutual Information, …):
|      | planet | night | full | shadow | shine |
| …    | …      | …     | …    | …      | …     |
| Mars | …      | …     | …    | …      | …     |
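A minimal sketch (not from the papers) of how such a similarity could be computed from raw co-occurrence counts with the cosine measure; the counts below are made-up placeholders, since the actual table values are not shown above:

<code python>
import numpy as np

# Hypothetical raw co-occurrence counts over the five contexts
# (planet, night, full, shadow, shine); replace with the real table rows.
word_a = np.array([22.0, 40.0, 35.0, 10.0, 29.0])
word_b = np.array([30.0, 15.0,  2.0, 12.0, 44.0])

# Cosine similarity = dot product divided by the product of the vector norms.
cosine = word_a @ word_b / (np.linalg.norm(word_a) * np.linalg.norm(word_b))
print(f"cosine similarity = {cosine:.3f}")
</code>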
b) How do they deal with the high dimensionality of the vectors in those papers?
Can you suggest some (other) techniques?
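One common family of techniques (a sketch of a generic option, not necessarily what the papers do) is to project the high-dimensional count vectors onto a small number of latent dimensions, e.g. with truncated SVD:

<code python>
import numpy as np

# Hypothetical word-by-context count matrix: 500 words x 2000 contexts.
rng = np.random.default_rng(0)
M = rng.poisson(0.1, size=(500, 2000)).astype(float)

# Keep only the k strongest singular components; each word is then
# described by k numbers instead of 2000.
k = 100
U, S, Vt = np.linalg.svd(M, full_matrices=False)
reduced = U[:, :k] * S[:k]   # shape (500, k)
print(reduced.shape)
</code>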
Q3.
a) What are Bag of Words (BOW) and Bag of Visual Words (BOVW)? Are they synonyms?
b) How do they apply BOVW to compute the representation of a word (concept) from a large set of images?
(Note: they used somewhat different visual features in the two papers.)
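For orientation, a rough sketch of the generic bag-of-visual-words pipeline (placeholder descriptors and parameters; the papers use their own local features and vocabulary sizes): cluster local descriptors into a visual vocabulary, then describe each image, and by pooling, each concept, as a histogram over those visual words.

<code python>
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Placeholder for local descriptors (e.g. 128-dimensional SIFT-like
# vectors) extracted from many training images.
all_descriptors = rng.normal(size=(5000, 128))

# 1) Visual vocabulary: each k-means cluster centre is one "visual word".
vocab_size = 50
kmeans = KMeans(n_clusters=vocab_size, n_init=10, random_state=0).fit(all_descriptors)

def bovw_histogram(descriptors):
    """Represent one image as a histogram of visual-word occurrences."""
    words = kmeans.predict(descriptors)
    return np.bincount(words, minlength=vocab_size)

# 2) A concept (word) is represented by pooling the histograms of all
#    images associated with it -- here simply by summing them.
concept_images = [rng.normal(size=(200, 128)) for _ in range(3)]
concept_vector = sum(bovw_histogram(img) for img in concept_images)
print(concept_vector.shape)   # (vocab_size,)
</code>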
Q4 (bonus).
When they construct text-based vectors of words from the DM model, they mention the Local Mutual Information (LMI) score.
So what is that score? Why did they use it?
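For reference, Local Mutual Information is usually defined (this is the standard textbook form, not quoted from the paper) as the raw co-occurrence count weighted by its pointwise mutual information:

LMI(w, c) = count(w, c) · log( P(w, c) / (P(w) · P(c)) )

i.e. observed frequency times PMI, so frequent and strongly associated pairs score high while rare accidental co-occurrences are damped.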
Q5 (bonus).
Have you ever wished to see beautiful "…"?
Have you ever seen "…"?
"…"
Think about a computational way to show what they look like.
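One possible computational route (a sketch built on assumptions, not the papers' actual procedure): if every image has a visual vector and the target word has a vector in the same space, show the images whose vectors are most cosine-similar to the word's vector.

<code python>
import numpy as np

def nearest_images(word_vec, image_vecs, image_ids, top_n=5):
    """Ids of the images whose vectors are most cosine-similar to word_vec."""
    word_vec = word_vec / np.linalg.norm(word_vec)
    image_vecs = image_vecs / np.linalg.norm(image_vecs, axis=1, keepdims=True)
    sims = image_vecs @ word_vec
    best = np.argsort(-sims)[:top_n]
    return [(image_ids[i], float(sims[i])) for i in best]

# Placeholder data: 1000 images with 50-dimensional visual vectors and a
# made-up vector for the word we would like to "see".
rng = np.random.default_rng(0)
image_vecs = rng.random((1000, 50))
image_ids = [f"img_{i:04d}.jpg" for i in range(1000)]
word_vec = rng.random(50)

print(nearest_images(word_vec, image_vecs, image_ids))
</code>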