Tuning LLMs with Contrastive Alignment Instructions for Machine Translation in Unseen, Low-resource Languages

Multilingual language models (LMs), such as mBERT, XLM-R, mT5, and mBART, have been remarkably successful in enabling natural language tasks in low-resource languages through cross-lingual transfer from high-resource ones. In this work, we try to better understand how such models, specifically mT5, transfer linguistic and semantic knowledge across languages, even though no explicit cross-lingual signals are provided during pre-training. Rather…
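As a rough illustration of the zero-shot cross-lingual transfer setup the abstract refers to (not code from the paper), the sketch below loads an mT5 checkpoint with the Hugging Face transformers library and runs a translation-style prompt. The checkpoint name, prompt, and decoding settings are illustrative assumptions; a raw mT5 checkpoint is pre-trained with span corruption only, so it would need task fine-tuning (typically on a high-resource language pair) before being evaluated on an unseen, low-resource one.

```python
# Minimal sketch, assuming the Hugging Face transformers library and the
# public "google/mt5-small" checkpoint; values here are illustrative only.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "google/mt5-small"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# In a cross-lingual transfer study, the model would first be fine-tuned on a
# high-resource translation task, then evaluated on a low-resource language
# without further training. The raw checkpoint below is not yet fine-tuned,
# so this prompt will not produce a useful translation as-is.
inputs = tokenizer("translate English to German: The house is small.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```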
