
Institute of Formal and Applied Linguistics Wiki



courses:rg:2013:convolution-kernels [2013/03/12 11:27] (current)
    * **Generative models** model the joint distribution <latex>P(x,y)</latex>.
      * They are able to "generate" fake inputs, but this ability is not used very often.
      * Examples: Naive Bayes, Mixtures of Gaussians, HMM, Bayesian Networks, Markov Random Fields
    * **Discriminative models** do everything in one step -- they learn the posterior <latex>P(y|x)</latex> as a function of some features of <latex> x</latex>.
      * They are simpler and can use many more features, but are sensitive to missing inputs.
      * Examples: SVM, Logistic Regression, Neural networks, k-NN, Conditional Random Fields
  - Each CFG rule generates just one level of the derivation tree. Therefore, using "standard" nonterminals, it is not possible to generate, e.g., this sentence:
    * ''(S (NP (PRP He)) (VP (VBD saw) (NP (PRP himself))))''
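
The "generate fake inputs" point about generative models above can be illustrated with a minimal Naive Bayes sketch: after estimating <latex>P(y)</latex> and <latex>P(x|y)</latex> by counting, the same count tables support both classification (via Bayes' rule) and sampling of fake inputs. The toy dataset and function names below are made up for illustration, not taken from the session.

```python
import random
from collections import Counter, defaultdict

# Hypothetical toy sentiment data: bag-of-words features + label.
data = [
    (["good", "great"], "pos"),
    (["good", "fine"], "pos"),
    (["bad", "awful"], "neg"),
    (["bad", "poor"], "neg"),
]

label_counts = Counter(y for _, y in data)   # counts behind P(y)
word_counts = defaultdict(Counter)           # counts behind P(x|y)
for words, y in data:
    word_counts[y].update(words)
vocab = {w for words, _ in data for w in words}

def p_word_given_label(w, y):
    # Add-one smoothed estimate of P(w|y).
    return (word_counts[y][w] + 1) / (sum(word_counts[y].values()) + len(vocab))

def classify(words):
    # Bayes' rule without the constant denominator: argmax_y P(y) * prod_i P(x_i|y)
    def score(y):
        s = label_counts[y] / len(data)
        for w in words:
            s *= p_word_given_label(w, y)
        return s
    return max(label_counts, key=score)

def generate(y, n=2):
    # The "generative" part: sample a fake input from the class-conditional P(x|y).
    words = sorted(vocab)
    weights = [p_word_given_label(w, y) for w in words]
    return random.choices(words, weights=weights, k=n)

print(classify(["good", "fine"]))   # -> pos
print(generate("neg", n=3))         # e.g. a few words sampled from P(x|neg)
```

A discriminative model such as logistic regression would instead fit <latex>P(y|x)</latex> directly; it has no class-conditional input distribution, so the ''generate'' step has no counterpart.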
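
The one-level claim can be checked mechanically: reading the bracketed tree above and emitting one rule per node shows that both NP subtrees yield the identical rule NP -> PRP, so a plain CFG has no way to make the choice of ''himself'' depend on the subject ''He''. A minimal pure-Python sketch (the parser and function names are illustrative):

```python
import re

def parse(s):
    # Minimal reader for bracketed trees: nodes are (label, children),
    # leaves are plain word strings.
    tokens = re.findall(r"\(|\)|[^\s()]+", s)
    def read(pos):
        label = tokens[pos + 1]       # tokens[pos] is "("
        pos += 2
        children = []
        while tokens[pos] != ")":
            if tokens[pos] == "(":
                child, pos = read(pos)
            else:
                child, pos = tokens[pos], pos + 1
            children.append(child)
        return (label, children), pos + 1
    tree, _ = read(0)
    return tree

def rules(tree):
    # Each tree node contributes exactly one CFG rule: label -> child labels.
    label, children = tree
    rhs = tuple(c[0] if isinstance(c, tuple) else c for c in children)
    out = [(label, rhs)]
    for c in children:
        if isinstance(c, tuple):
            out.extend(rules(c))
    return out

tree = parse("(S (NP (PRP He)) (VP (VBD saw) (NP (PRP himself))))")
for lhs, rhs in rules(tree):
    print(lhs, "->", " ".join(rhs))
# Both NP nodes print the same rule "NP -> PRP": a rule sees only one
# level of the tree, so it cannot enforce the He/himself dependency.
```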

[ Back to the navigation ] [ Back to the content ]