Learning Semantics Workshop at NIPS 2011

Invited Talk: Learning Dependency-Based Compositional Semantics
Speaker: Percy Liang

Bio: Percy Liang received his Ph.D. from Berkeley and is a post-doc at Google and an Assistant Professor at Stanford. He works on methods that infer representations of meaning from sentences given limited supervision, on approximate inference algorithms, and on methods that learn from partial labels and share statistical strength across multiple related learning problems.

Abstract: The semantics of natural language has a highly structured logical aspect. For example, the meaning of the question "What is the third tallest mountain in a state not bordering California?" involves superlatives, quantification, and negation. In this talk, we develop a new representation of semantics, Dependency-Based Compositional Semantics (DCS), which can represent these complex phenomena in natural language. At the same time, we show that the DCS structure can be treated as a latent variable and learned automatically from question/answer pairs. This allows us to build a compositional question-answering system that achieves state-of-the-art accuracy despite using less supervision than previous methods. I will conclude the talk with extensions that handle contextual effects in language.
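The latent-variable idea in the abstract can be illustrated with a toy sketch: supervision is only question/answer pairs, candidate logical forms are latent, and weights are pushed toward candidates whose execution against a database yields the annotated answer. This is a minimal perceptron-style illustration under invented assumptions (the database, the two hand-written candidate forms, and the cue-word features are all hypothetical), not the system or the DCS representation from the talk.

```python
from collections import defaultdict

# Toy database mapping mountains to (height_ft, state); all entries are
# invented for illustration, not taken from the talk.
DB = {"whitney": (14505, "CA"), "elbert": (14440, "CO"), "rainier": (14411, "WA")}

# Two hand-written candidate "logical forms". In the real system these
# would be DCS trees produced by a parser; here they are just executors.
CANDIDATES = {
    "tallest": lambda db: max(db, key=lambda m: db[m][0]),
    "tallest_not_CA": lambda db: max(
        (m for m in db if db[m][1] != "CA"), key=lambda m: db[m][0]
    ),
}

def features(cues, form):
    # Conjoin question cue words with the candidate's identity -- a crude
    # stand-in for real parse features.
    return {(c, form): 1.0 for c in cues}

def score(w, feats):
    return sum(w[f] * v for f, v in feats.items())

def predict(w, cues):
    # Execute the highest-scoring candidate form against the database.
    best = max(CANDIDATES, key=lambda n: score(w, features(cues, n)))
    return CANDIDATES[best](DB)

def train(examples, epochs=5):
    """Perceptron-style learning with the logical form latent: move
    weights toward a candidate whose denotation matches the answer and
    away from the current best-scoring candidate when it is wrong."""
    w = defaultdict(float)
    for _ in range(epochs):
        for cues, answer in examples:
            guess = max(CANDIDATES, key=lambda n: score(w, features(cues, n)))
            if CANDIDATES[guess](DB) == answer:
                continue
            # Pick any latent form that executes to the right answer.
            good = next(n for n in CANDIDATES if CANDIDATES[n](DB) == answer)
            for f, v in features(cues, good).items():
                w[f] += v
            for f, v in features(cues, guess).items():
                w[f] -= v
    return w

# Supervision is question/answer pairs only; no logical form is ever given.
examples = [(("tallest", "not", "california"), "elbert"),
            (("tallest",), "whitney")]
w = train(examples)
```

After training, the learner has associated the cue "not" with the negation-bearing form, so it answers both training questions correctly even though it never saw a logical form.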