
MARRS: Multimodal Reference Resolution System


*= All authors listed contributed equally to this work

Successfully handling context is essential for any dialog understanding task. This context may be conversational (relying on previous user queries or system responses), visual (relying on what the user sees, for example, on their screen), or background (based on signals such as a ringing alarm or playing music). In this work, we present an overview of MARRS, or Multimodal Reference Resolution System, an on-device framework within a Natural Language Understanding system, responsible for handling conversational, visual, and background context. In particular, we present different machine learning models to enable handling contextual queries; specifically, one to enable reference resolution, and one to handle context via query rewriting. We also describe how these models complement each other to form a unified, coherent, lightweight system that can understand context while preserving user privacy.
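To make the two-model split concrete, below is a minimal, hypothetical Python sketch of how a reference-resolution component and a query-rewriting component could be combined behind a single entry point. All names here (Context, ReferenceResolver, QueryRewriter, resolve_context) and the toy heuristics are illustrative assumptions, not the actual MARRS models or API.

```python
# Hypothetical sketch of a context-handling front end in the spirit of MARRS:
# one component resolves references against visual/background entities, another
# rewrites the query using conversational history. Names and heuristics are
# illustrative only; the real system uses on-device ML models.

from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Context:
    """Signals available on-device when a query arrives."""
    conversation: list[str] = field(default_factory=list)      # prior turns
    screen_entities: list[str] = field(default_factory=list)   # visual context
    background_events: list[str] = field(default_factory=list) # e.g. "alarm_ringing"


class ReferenceResolver:
    """Links referring expressions ("that one", "it") to concrete entities."""

    def resolve(self, query: str, ctx: Context) -> Optional[str]:
        if "that" in query.lower() or " it " in f" {query.lower()} ":
            # Naive stand-in for a learned reference-resolution model.
            candidates = ctx.screen_entities + ctx.background_events
            return candidates[0] if candidates else None
        return None


class QueryRewriter:
    """Rewrites an under-specified query into a self-contained one."""

    def rewrite(self, query: str, ctx: Context) -> str:
        if ctx.conversation and query.lower().startswith(("what about", "and ")):
            # Borrow the topic of the previous turn; a real model would be learned.
            return f"{ctx.conversation[-1]} | follow-up: {query}"
        return query


def resolve_context(query: str, ctx: Context) -> str:
    """Run both components and combine their outputs into one resolved query."""
    entity = ReferenceResolver().resolve(query, ctx)
    rewritten = QueryRewriter().rewrite(query, ctx)
    return f"{rewritten} [entity={entity}]" if entity else rewritten


if __name__ == "__main__":
    ctx = Context(
        conversation=["How tall is the Eiffel Tower?"],
        screen_entities=["photo_of_eiffel_tower"],
    )
    print(resolve_context("What about that one at night?", ctx))
```

The point of the sketch is only the division of labor described in the abstract: reference resolution grounds deictic mentions in visual or background signals, query rewriting folds conversational history into the query itself, and a thin layer combines the two so downstream understanding sees a single, context-complete request.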


