
Institute of Formal and Applied Linguistics Wiki



This shows you the differences between two versions of the page.

courses:rg [2014/11/10 16:44]
courses:rg [2022/02/08 00:40] (current)
rosa +phd rg
 ~~NOTOC~~
-===== Reading Group ===== 
-The official name of this course is [[https://is.cuni.cz/studium/predmety/index.php?do=predmet&kod=NPFL095|NPFL095]] **Modern Methods in Computational Linguistics**. It is a continuation of the informal Reading Group (RG) meetings. Requirements for getting credits:
-  * presenting one paper, 
-    * Select a term (write your name to the schedule below) before October 13. 
-    * If no paper is assigned to the term, send [[mailto:popel@ufal.mff.cuni.cz|me]] 2--3 papers you would like to present (with pdf links and your preferences) before October 20. Ideally, form a group of 2--4 students presenting papers on a common topic (progressing from basic to more advanced papers).
-    * Prepare your presentation and 3--5 quiz questions. At least 3 of the questions should ask for a specific answer, e.g. "write an equation for...", "given training set X=([dog,N],[cat,Y]), what is the number..." (Not "what do you think about..."). The first question should be quite easy to answer for those who have read the whole paper. The last question may be a tricky one. Send me the questions two weeks before your presentation. We may discuss the paper and refine the questions. 
-    * One week before the presentation, write the questions to a dedicated wiki page here. Send a reminder (questions and a link to the pdf of the paper) to rg@ufal.mff.cuni.cz by Monday 15:45. 
-  * active participation in the discussions, which is conditioned by reading the papers in advance and attending the meetings,
-  * sending your answers to me and the presenter by Saturday 23:59 (so the presenter can go through all answers before the presentation and focus more on problematic parts).
-  * In case of more than three missed meetings or deadlines, additional work (e.g. reports or answers to tricky questions) will be required.
-All questions, reports and presented papers must be in English. The presentations are in English by default, but if all present people agree, they may be in Czech.
-
-^ Contact      | popel@ufal.mff.cuni.cz |
-^ Mailing list | rg@ufal.mff.cuni.cz    |
-^ Meetings     | Mondays 16:00, room S1 |
-^ Past meetings| [[courses:rg:past|courses:rg:past]] |
-^ Inspiration  | [[courses:rg:wishlist|courses:rg:wishlist]] |
-^ Other reading groups | [[https://github.com/ufal/rg/wiki|Machine Learning RG]] |
+===== Reading Group for Master students =====
+The official name of this course is [[https://is.cuni.cz/studium/predmety/index.php?do=predmet&kod=NPFL095|NPFL095]] **Modern Methods in Computational Linguistics**. It is a continuation of the informal Reading Group (RG) meetings.
+
+Since 2016, the wiki has moved to https://github.com/ufal/NPFL095/wiki and the mailing list to [[https://groups.google.com/forum/#!forum/npfl095|npfl095@googlegroups.com]].
+See also [[courses:rg:past|an overview of past meetings]], [[courses:rg:wishlist|an outdated wishlist]] and [[https://github.com/ufal/rg/wiki|Machine Learning RG (active in 2014)]].
+===== Reading Group for PhD students =====
+
+See the [[https://ufal.mff.cuni.cz/courses/rg/|website of the PhD reading group]] (related also to a previous reading group called Deep Learning Seminar, originally led by Milan Straka).
-=== Autumn&Winter 2014/2015 === 
-^ date   | **speaker**  | **paper** | 
-^ Oct  6 |              | startup meeting | 
-^ Oct 13 | Jindřich Libovický | Peter F. Brown et al.: [[http://www.aclweb.org/anthology/J92-4003|Class-Based n-gram Models of Natural Language]], Computational Linguistics, 1992. See also [[http://statmt.blogspot.cz/2014/07/understanding-mkcls.html|notes about the mkcls implementation]] |
-^ Oct 20 | Tomáš Kraut | Michael Collins: [[http://ucrel.lancs.ac.uk/acl/W/W02/W02-1001.pdf|Discriminative Training Methods for Hidden Markov Models: Theory and Experiments with Perceptron Algorithms]], EMNLP 2002. [[courses:rg:2014:perceptron|Questions]]| 
-^ Oct 27 | Roman Sudarikov | Andrew McCallum, Dayne Freitag, Fernando Pereira: [[http://www.ai.mit.edu/courses/6.891-nlp/READINGS/maxent.pdf|Maximum Entropy Markov Models for Information Extraction and Segmentation]], ICML 2000. [[http://courses.ischool.berkeley.edu/i290-dm/s11/SECURE/gidofalvi.pdf|Slides]], [[courses:rg:2014:memm|Questions]] |
-^ Nov 3 | Dušan Variš | John Lafferty, Andrew McCallum, Fernando Pereira: [[http://www.cs.utah.edu/~piyush/teaching/crf.pdf|Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data]], 2001. [[courses:rg:2014:crf|Questions]] | 
-^ Nov 10 | Duc Tam Hoang | Joseph Turian, Lev Ratinov, Yoshua Bengio: [[http://anthology.aclweb.org//P/P10/P10-1040.pdf|Word representations: A simple and general method for semi-supervised learning]], ACL 2010. [[courses:rg:2014:wr|Questions]]| 
-^ <del>Nov 17</del> | --- | no RG (Struggle for Freedom and Democracy Day) | 
-^ Nov 24 | Vendula Michlíková | Kishore Papineni, Salim Roukos, Todd Ward, and Wei-Jing Zhu: [[http://aclweb.org/anthology-new/P/P02/P02-1040.pdf|BLEU: a Method for Automatic Evaluation of Machine Translation]], ACL 2002 | 
-^ Dec 1  | Richard Ejem | Marco Pennacchiotti, Patrick Pantel: [[http://www.aclweb.org/anthology/D09-1025|Entity Extraction via Ensemble Semantics]], EMNLP 2009. |
-^ Dec 8  | Nguyen Tien Dat | Elia Bruni and Marco Baroni: [[http://www.aclweb.org/anthology/W11-2503|Distributional semantics from text and images]], GEMS Workshop at EMNLP 2011; see also: Distributional Semantics in Technicolor, ACL 2012 |
-^ Dec 15 | Ahmad Aghaebrahimian | Yoav Goldberg, Michael Elhadad: [[http://aclweb.org/anthology/P/P08/P08-2060.pdf|splitSVM: Fast, Space Efficient, non-Heuristic, Polynomial Kernel Computation for NLP Applications]], ACL 2008 |
-^ Jan 5  | Michal Auersperger | last RG | 