<?xml version="1.0" encoding="UTF-8"?>
<!-- generator="FeedCreator 1.8" -->
<?xml-stylesheet href="https://wiki.ufal.ms.mff.cuni.cz/lib/exe/css.php?s=feed" type="text/css"?>
<rdf:RDF
    xmlns="http://purl.org/rss/1.0/"
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
    xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
    xmlns:dc="http://purl.org/dc/elements/1.1/">
    <channel rdf:about="https://wiki.ufal.ms.mff.cuni.cz/feed.php">
        <title>ufal wiki courses:rg:2014</title>
        <description></description>
        <link>https://wiki.ufal.ms.mff.cuni.cz/</link>
        <image rdf:resource="https://wiki.ufal.ms.mff.cuni.cz/lib/tpl/ufal/images/favicon.ico" />
        <dc:date>2026-04-24T22:35:06+00:00</dc:date>
        <items>
            <rdf:Seq>
                <rdf:li rdf:resource="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:bleu?rev=1415965762&amp;do=diff"/>
                <rdf:li rdf:resource="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:crf?rev=1414502532&amp;do=diff"/>
                <rdf:li rdf:resource="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:entity?rev=1417206684&amp;do=diff"/>
                <rdf:li rdf:resource="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:kernels?rev=1419932181&amp;do=diff"/>
                <rdf:li rdf:resource="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:mdsm?rev=1417265570&amp;do=diff"/>
                <rdf:li rdf:resource="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:memm?rev=1413813879&amp;do=diff"/>
                <rdf:li rdf:resource="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:perceptron?rev=1413239320&amp;do=diff"/>
                <rdf:li rdf:resource="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:start?rev=1418031881&amp;do=diff"/>
                <rdf:li rdf:resource="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:wr?rev=1415108508&amp;do=diff"/>
            </rdf:Seq>
        </items>
    </channel>
    <image rdf:about="https://wiki.ufal.ms.mff.cuni.cz/lib/tpl/ufal/images/favicon.ico">
        <title>ufal wiki</title>
        <link>https://wiki.ufal.ms.mff.cuni.cz/</link>
        <url>https://wiki.ufal.ms.mff.cuni.cz/lib/tpl/ufal/images/favicon.ico</url>
    </image>
    <item rdf:about="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:bleu?rev=1415965762&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-11-14T12:49:22+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>courses:rg:2014:bleu</title>
        <link>https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:bleu?rev=1415965762&amp;do=diff</link>
        <description>You can skip sections 4 and 5 in the paper.

1) Section 2.1.1 defines p_n as a fraction whose denominator is “the number of candidate n-grams in the test corpus”.
Compute this denominator for p_3 and a test corpus with three sentences with lengths</description>
    </item>
    <item rdf:about="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:crf?rev=1414502532&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-10-28T14:22:12+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>courses:rg:2014:crf</title>
        <link>https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:crf?rev=1414502532&amp;do=diff</link>
        <description>Conditional Random Fields - Questions

1. Definition of CRF in Section 3 contains a formula with a shortcut notation: &lt;latex&gt;P(Y_v | X, Y_w, w \neq v) = P(Y_v | X, Y_w, w \sim v)&lt;/latex&gt;.

a) Try to rewrite this general formula using clearer notation (or explain it in your own words).
b) Rewrite the formula for the chain-structured case of CRF.</description>
    </item>
    <item rdf:about="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:entity?rev=1417206684&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-11-28T21:31:24+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>courses:rg:2014:entity</title>
        <link>https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:entity?rev=1417206684&amp;do=diff</link>
        <description>Paper

Marco Pennacchiotti, Patrick Pantel: Entity Extraction via Ensemble Semantics, ACL 2009.

Questions

	*  What are “seed instances” good for?
	*  What is the difference between B4 and ES-all (apart from different results)?
	*  See formula 1 (the average precision AP(L)). Describe it in words and write a formula for P(i) (corr(i) should be pretty clear from the text).</description>
    </item>
    <item rdf:about="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:kernels?rev=1419932181&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-12-30T10:36:21+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>courses:rg:2014:kernels</title>
        <link>https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:kernels?rev=1419932181&amp;do=diff</link>
        <description>Mark Johnson: A brief introduction to kernel classifiers

&lt;http://cs.brown.edu/courses/cs195-5/fall2009/docs/lecture_10-27.pdf&gt;

Question:

Consider two vectors
a=(a1,a2,a3)
b=(b1,b2,b3)
and a function K: (R^3,R^3) =&gt; R,
K(a,b) = (a*b)^2 = (a1*b1 + a2*b2 + a3*b3)^2
Can you define a function h: R^3 =&gt; R^m
such that h(a) * h(b) = K(a,b)?</description>
    </item>
    <item rdf:about="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:mdsm?rev=1417265570&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-11-29T13:52:50+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>courses:rg:2014:mdsm</title>
        <link>https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:mdsm?rev=1417265570&amp;do=diff</link>
        <description>You should focus on the first paper (you can skip section 2.3): Distributional semantics from text and images.
The second paper, Distributional Semantics in Technicolor, an extension of the first one, is optional reading.

Q1. 
Recall the paper about word representations presented by Tam on November 10.
Read &lt;http://www.quora.com/Whats-the-difference-between-distributed-and-distributional-semantic-representations&gt;

(M_{w,d} is a matrix with w rows and d columns).
What do w, d, and k mean?
What are …</description>
    </item>
    <item rdf:about="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:memm?rev=1413813879&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-10-20T16:04:39+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>courses:rg:2014:memm</title>
        <link>https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:memm?rev=1413813879&amp;do=diff</link>
        <description>Maximum Entropy Markov Models - Questions

1. Explain (roughly) how the new formula for α_{t+1}(s) is derived (i.e. formula 1 in the paper).

2. Section 2.1 states “we will split P(s|s',o) into |S| separately trained transition functions”. What are the advantages and disadvantages of this approach?</description>
    </item>
    <item rdf:about="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:perceptron?rev=1413239320&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-10-14T00:28:40+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>courses:rg:2014:perceptron</title>
        <link>https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:perceptron?rev=1413239320&amp;do=diff</link>
        <description>Paper

Michael Collins: Discriminative Training Methods for Hidden Markov Models: Theory and Experiments with Perceptron Algorithms, EMNLP 2002. 

Questions

1. Suppose you have a tagset consisting of two tags, N (noun) and X (not noun), and a training sentence:
  Luke/N I/X am/X your/X father/N
During training, the following best tag sequence is found for this sentence:</description>
    </item>
    <item rdf:about="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:start?rev=1418031881&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-12-08T10:44:41+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>courses:rg:2014:start</title>
        <link>https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:start?rev=1418031881&amp;do=diff</link>
        <description>Semantic Parsing Freebase: Towards Open-domain Semantic Parsing
Questions:

1. What does CCG stand for?

2. (Optional task) Using the following CCG grammar rules and lexicons, try to parse and represent the following sentence:

Sentence for parsing: “Pennsylvania neighbors New York.”</description>
    </item>
    <item rdf:about="https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:wr?rev=1415108508&amp;do=diff">
        <dc:format>text/html</dc:format>
        <dc:date>2014-11-04T14:41:48+00:00</dc:date>
        <dc:creator>Anonymous (anonymous@undisclosed.example.com)</dc:creator>
        <title>courses:rg:2014:wr</title>
        <link>https://wiki.ufal.ms.mff.cuni.cz/courses:rg:2014:wr?rev=1415108508&amp;do=diff</link>
        <description>Word Representations - Questions

1. Of the three main types of word representations described in the paper, to which types do the following two samples belong:
a) 
 dog -0.087099201783 -0.136966257697 0.106813367913 [47 more numbers]
 cat -0.103287428163 -0.0066971301398 -0.0346911076188 [47 more numbers]</description>
    </item>
</rdf:RDF>
