
Institute of Formal and Applied Linguistics Wiki



Differences

This shows you the differences between two versions of the page.


Previous revision: courses:rg:maxent-lm [2011/04/13 14:05] popel (typo)
Current revision: courses:rg:maxent-lm [2013/04/27 22:52] popel (typo)
Line 10:
  * Let X be a discrete event space and let <latex>P, Q : X \rightarrow [0,1]</latex> be two probability distributions. Then (a short numerical sketch of these definitions follows the list)
    * Entropy <latex>H(P) = - \sum_{x \in X} P(x) \cdot \log_2 P(x)</latex>
-    * Perplexity <latex>PPL(P) = 2^H(P)</latex>
+    * Perplexity <latex>PPL(P) = 2^{H(P)}</latex>
    * Cross-entropy <latex>H(P, Q) = - \sum_{x \in X} P(x) \cdot \log_2 Q(x)</latex>
    * Kullback-Leibler divergence <latex>D_{KL}(P || Q) = \sum_{x \in X} P(x) \cdot \log_2 \frac{P(x)}{Q(x)}</latex>
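A minimal Python sketch of the four quantities defined above; the distributions P and Q and their values are made-up illustrative numbers, not part of the original page.

<code python>
import math

def entropy(p):
    # H(P) = -sum_x P(x) * log2 P(x); zero-probability events contribute 0
    return -sum(px * math.log2(px) for px in p.values() if px > 0)

def perplexity(p):
    # PPL(P) = 2^{H(P)}
    return 2 ** entropy(p)

def cross_entropy(p, q):
    # H(P, Q) = -sum_x P(x) * log2 Q(x)
    return -sum(px * math.log2(q[x]) for x, px in p.items() if px > 0)

def kl_divergence(p, q):
    # D_KL(P || Q) = sum_x P(x) * log2(P(x)/Q(x)) = H(P, Q) - H(P)
    return cross_entropy(p, q) - entropy(p)

# Made-up example distributions over a three-event space
P = {'a': 0.5, 'b': 0.25, 'c': 0.25}
Q = {'a': 0.25, 'b': 0.5, 'c': 0.25}

print(entropy(P))           # 1.5
print(perplexity(P))        # 2.828...
print(cross_entropy(P, Q))  # 1.75
print(kl_divergence(P, Q))  # 0.25
</code>

Note the relation used in the last function: <latex>D_{KL}(P || Q) = H(P, Q) - H(P) \geq 0</latex>, with equality exactly when P = Q.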
