The main goal of my research is to dramatically increase our ability to mine actionable knowledge from unstructured text. I am especially interested in information extraction from the Web, understanding the connections between people and between organizations, expert finding, social network analysis, and mining the scientific literature & community. Toward this end my group develops and employs various methods in statistical machine learning, natural language processing, information retrieval and data mining---tending toward probabilistic approaches and graphical models.
My current research focuses on box embeddings and structured prediction. Broadly speaking, I am interested in information extraction, program synthesis, deep learning and optimization.
My main interest is in semantic representations (of events, frames, semantic roles, etc.). I'm interested in their annotation and design, in how we can learn some of these representations from text, and in how to use them in other tasks.
My research focuses on injecting human knowledge and constraints into neural models, primarily for natural language processing tasks. I am broadly interested in structured prediction, multi-task learning, logical reasoning, and better representation learning for the aforementioned topics.
I am broadly interested in machine learning, natural language processing and knowledge base construction. Areas of interest include common sense knowledge, representation learning, structured prediction, coreference, and optimization.
The focus of my research is on scalable methods for entity resolution, the task of determining the underlying entities referred to by a set of ambiguous mentions. I am interested in two settings of this problem, with and without an existing knowledge base.
Rajarshi's research interests lie in open-domain question answering, reasoning over large knowledge bases, and anything at the intersection of the two.
My current research is on knowledge representation and reasoning. Broadly, I am interested in deep learning and reinforcement learning.
My current research is on probabilistic programming languages, in particular the relationship between programming languages and gradient-based inference.
My current interests include knowledge representation, text entailment, relation extraction, graph embedding, and active learning.
My research focuses on structured prediction involving deep learning and probabilistic graphical models. I am also interested in generative models and in tasks concerning both language and vision.
My research interests involve Natural Language Processing, especially commonsense knowledge reasoning.
My main research interests lie in information extraction, knowledge representation and reasoning, question answering and machine learning.
I am broadly interested in machine learning and natural language processing. A bit more specifically, I am interested in methods for learning representations of entities, structured prediction models, and clustering.
I work on safe and interpretable machine learning in healthcare. I'm also interested in general natural language processing applications and speech/conversational AI.
Current interests include neural network models that learn representations of commonsense knowledge, defined as event frames and event chains.
My current research interests are in automatically generating knowledge bases from raw text and vice versa.
I am interested in time-aware representation learning for both text and KBs. Currently, I am working with box embeddings to model relational data.
I am interested in machine learning, natural language processing, learning with limited data, and information extraction.
I am broadly interested in machine learning for NLP, with a special focus on interpretable and instructable methods that can learn from raw data while being guided by structured external knowledge and constraints.