====== Natural Logic for Textual Inference ======
**Bill MacCartney, Christopher D. Manning (2007)**

===== Introduction =====
  
This paper deals with “**natural logic**”: logical inference that operates directly over natural language. Approaches to natural language inference are usually either robust but shallow or deep but brittle. The system proposed in this paper aims to sit between these extremes and to avoid, for instance, the errors introduced when translating natural language into first-order logic.

One key concept in the theory of natural logic is “monotonicity”: instead of using quantifiers, the concepts or constraints expressed in a sentence are expanded or contracted. This way, linguistic expressions can be represented as //upward-monotone//, //downward-monotone//, or //non-monotone// semantic functions.
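
To make the monotonicity idea concrete, here is a minimal Python sketch. It is our own illustration, not part of the paper's system; the UP/DOWN markers and the tiny hypernym list are assumed toy data. It shows why a hypernym substitution is safe in an upward-monotone context, while a hyponym substitution is safe in a downward-monotone one.

<code python>
# Toy illustration (not NatLog itself) of how monotonicity licenses
# lexical substitutions. The hypernym facts below are assumptions.

HYPERNYMS = {("dog", "animal"), ("sparrow", "bird")}   # (specific, more general)

def substitution_entails(old, new, monotonicity):
    """Is the sentence still entailed after replacing `old` with `new`
    in a context with the given monotonicity?"""
    if monotonicity == "UP":     # e.g. "Some dogs bark" |= "Some animals bark"
        return (old, new) in HYPERNYMS
    if monotonicity == "DOWN":   # e.g. "No animals bark" |= "No dogs bark"
        return (new, old) in HYPERNYMS
    return False                 # non-monotone context: nothing follows

print(substitution_entails("dog", "animal", "UP"))    # True
print(substitution_entails("animal", "dog", "DOWN"))  # True
print(substitution_entails("dog", "animal", "DOWN"))  # False
</code>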
  
The developed system is called **NatLog** and its architecture has three main stages: linguistic preprocessing (parsing the input sentences and marking monotonicity), alignment (linking the premise to the hypothesis through a sequence of atomic edits), and entailment classification (predicting an entailment relation for each edit based solely on lexical features, independent of context).
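
The three stages can be pictured as a toy pipeline. Everything below (the function names, the one-entry lexicon, the word-by-word alignment) is a simplifying assumption of ours; the actual NatLog system uses a full parser, monotonicity-marking rules, and a trained classifier for the last stage.

<code python>
# A skeleton of the three NatLog stages described above; all names and the
# toy lexicon are our own placeholders, not the paper's actual components.

LEXICON = {("dogs", "animals")}      # toy "is more specific than" facts

def preprocess(sentence):
    """Stage 1: parse the input and mark monotonicity on each token
    (here trivially: everything is treated as upward-monotone)."""
    return [(tok, "UP") for tok in sentence.lower().split()]

def align(premise, hypothesis):
    """Stage 2: link premise to hypothesis by a sequence of atomic edits
    (here only word-for-word substitutions)."""
    return [("SUB", p, h) for p, h in zip(premise, hypothesis) if p[0] != h[0]]

def classify_edit(edit):
    """Stage 3: assign an entailment relation to a single edit using only
    lexical features of the two words and the marked monotonicity."""
    _, (old, mono), (new, _) = edit
    if mono == "UP" and (old, new) in LEXICON:
        return "forward"             # the edited sentence is entailed
    return "unknown"

def entails(premise, hypothesis):
    edits = align(preprocess(premise), preprocess(hypothesis))
    return all(classify_edit(e) == "forward" for e in edits)

print(entails("Some dogs bark", "Some animals bark"))   # True
</code>

In the real system the per-edit relations are also composed across the whole edit sequence; the all(...) check above only approximates that composition.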
  
  
  * It is good that the examples in the test data allow three different answers (yes, no, unknown).
  * Disadvantage: if you combine a lot of atomic edits, the probability of getting the right answer overall is lower (see the sketch after this list).
  * We liked the evaluation presented in the paper and the interpretation of the results (which is not very common in semantics). It was also good that they digitized a textbook of formal semantics to build the FraCaS test suite.
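
The compounding effect behind the disadvantage above can be shown with a back-of-the-envelope calculation; the per-edit accuracy used here is an assumed number, not a figure from the paper.

<code python>
# If each atomic edit is classified correctly with probability p, a chain of
# n edits is classified correctly throughout only with probability p**n.
p = 0.9                          # assumed per-edit accuracy (illustrative only)
for n in (1, 3, 5, 10):
    print(n, round(p ** n, 3))   # 1 0.9, 3 0.729, 5 0.59, 10 0.349
</code>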
  
  
  
Written by Ximena Gutiérrez.
