Research Areas

Bayesian Models for Text
Anton Bakalov and Daniel Duckworth 

We are developing probabilistic generative models that capture fine-grained topics in text.
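As a rough illustration of what a generative topic model looks like (this is a generic LDA-style sketch, not the project's actual model), a document is produced by repeatedly drawing a topic from the document's topic distribution and then a word from that topic's word distribution:

```python
import random

def generate_document(topic_word, doc_topic, n_words):
    """Sample one document from an LDA-style generative process (illustrative).

    topic_word: one dict per topic mapping word -> probability
    doc_topic:  topic proportions for this document
    """
    words = []
    for _ in range(n_words):
        # Draw a topic for this position, then a word from that topic.
        topic = random.choices(range(len(doc_topic)), weights=doc_topic)[0]
        vocab = list(topic_word[topic])
        weights = [topic_word[topic][w] for w in vocab]
        words.append(random.choices(vocab, weights=weights)[0])
    return words
```

Inference reverses this process: given only the observed words, it recovers posterior distributions over the topics that generated them.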

Fast, Accurate Dependency Parsing
Jinho Choi, Brian Martin, and David Belanger

Dependency parsing is a lightweight method for gathering useful syntactic information about text. Transition-based approaches are extremely fast and yield competitive accuracy. We are exploring methods for further improving these algorithms.
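Transition-based parsers build a tree with a single left-to-right pass over the sentence, which is why they are so fast. A minimal arc-standard sketch (the action names and representation are illustrative; a real parser predicts the actions with a trained classifier) applies SHIFT/LEFT/RIGHT transitions to a stack and buffer:

```python
def parse(words, actions):
    """Apply arc-standard transitions; returns head index per token (-1 = root).

    actions: sequence of 'SHIFT', 'LEFT', or 'RIGHT'.
    """
    stack, buffer = [], list(range(len(words)))
    heads = [-1] * len(words)
    for action in actions:
        if action == "SHIFT":
            stack.append(buffer.pop(0))
        elif action == "LEFT":   # second-from-top becomes a dependent of the top
            dep = stack.pop(-2)
            heads[dep] = stack[-1]
        elif action == "RIGHT":  # top becomes a dependent of second-from-top
            dep = stack.pop()
            heads[dep] = stack[-1]
    return heads
```

For "He runs fast", the action sequence SHIFT, SHIFT, LEFT, SHIFT, RIGHT attaches both "He" and "fast" to "runs", in time linear in sentence length.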

Fast MAP Inference Using LP Relaxations
Alexandre Passos and David Belanger

By formulating MAP inference as a linear program and using techniques for high-level reasoning about the LP, such as cutting planes and column generation, we are able to provide efficient inference algorithms with optimality guarantees. We exploit the connections between LPs, message passing, and shortest-path problems.
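One concrete instance of the LP/shortest-path connection: for chain-structured models the MAP LP relaxation is tight, and solving it amounts to finding the highest-scoring path through a trellis. The dynamic program below is a standard Viterbi-style sketch of that special case, not the project's general algorithm:

```python
def chain_map(unary, pairwise):
    """MAP inference on a chain MRF as a highest-scoring path problem.

    unary:    unary[t][b] = score of label b at position t
    pairwise: pairwise[a][b] = score of transitioning from label a to b
    """
    n, k = len(unary), len(unary[0])
    score = [unary[0][:]]          # best score of any path ending in each label
    back = []                      # backpointers for path recovery
    for t in range(1, n):
        score.append([0.0] * k)
        back.append([0] * k)
        for b in range(k):
            cands = [score[t - 1][a] + pairwise[a][b] for a in range(k)]
            a_best = max(range(k), key=lambda a: cands[a])
            score[t][b] = cands[a_best] + unary[t][b]
            back[t - 1][b] = a_best
    # Backtrack from the best final label.
    labels = [max(range(k), key=lambda b: score[-1][b])]
    for t in range(n - 2, -1, -1):
        labels.append(back[t][labels[-1]])
    return list(reversed(labels))
```

For general (loopy) models the LP is no longer tight, which is where cutting planes and column generation come in: they tighten or compactly represent the relaxation while preserving optimality certificates.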

Large-Scale Entity Resolution
Karl Schultz and Harshal Pandya

Entity resolution is the task of clustering mentions into partitions representing their underlying real-world entities. We are exploring efficient methods for entity resolution at the scale of several million mentions.
We are also devising methods for automatic knowledge base construction.
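To make the clustering task concrete, here is a deliberately naive greedy baseline (illustrative only; the threshold, similarity measure, and single-pass strategy are assumptions, and methods that scale to millions of mentions require far more sophisticated inference):

```python
from difflib import SequenceMatcher

def resolve(mentions, threshold=0.8):
    """Greedy entity resolution sketch: assign each mention to the first
    cluster whose representative is similar enough, else start a new cluster."""
    clusters = []  # each cluster is a list of coreferent mentions
    for m in mentions:
        for cluster in clusters:
            sim = SequenceMatcher(None, m.lower(), cluster[0].lower()).ratio()
            if sim >= threshold:
                cluster.append(m)
                break
        else:
            clusters.append([m])
    return clusters
```

This single pass is O(mentions x clusters); at the scale of millions of mentions, even that is too slow without blocking or sampling-based approaches, which motivates the efficiency focus above.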