
Time Sensitive Knowledge Editing through Efficient Finetuning

Large Language Models (LLMs) have demonstrated impressive capabilities across a wide range of tasks and are bringing transformative changes to many domains. However, keeping the knowledge in LLMs up-to-date once pretraining is complete remains a challenge, so effective methods are needed both to update obsolete knowledge and to inject new knowledge into LLMs. Existing locate-and-edit knowledge editing (KE) methods suffer from two limitations. First, LLMs edited by such methods generally struggle to answer complex queries that require multi-hop reasoning. Second, the long run-time of locate-and-edit methods makes large-scale KE infeasible in practice. In this paper, we explore Parameter-Efficient Fine-Tuning (PEFT) techniques as an alternative for KE. We curate a more comprehensive temporal KE dataset, containing both knowledge-update and knowledge-injection examples, for benchmarking KE performance. We further probe the effect of fine-tuning different ranges of layers in an LLM on the multi-hop QA task. We find that PEFT performs better than locate-and-edit techniques for time-sensitive knowledge edits.
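
To make the PEFT-for-KE idea concrete, the sketch below shows one way LoRA-style adapters could be attached to a subset of transformer layers and fine-tuned on edit examples using the Hugging Face peft library. The base model, target modules, layer range, hyperparameters, and the sample edit are illustrative assumptions, not the configuration or data reported in the paper.

```python
# Minimal sketch: LoRA-style PEFT applied to a causal LM for knowledge editing.
# All specific choices below (model, modules, layers, ranks) are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "gpt2"  # placeholder base model; the paper may use a different LLM
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Restrict LoRA adapters to a contiguous range of transformer layers,
# mirroring the idea of probing which layer range matters for multi-hop QA.
lora_config = LoraConfig(
    task_type="CAUSAL_LM",
    r=8,                                     # low-rank dimension (assumed)
    lora_alpha=16,                           # scaling factor (assumed)
    lora_dropout=0.05,
    target_modules=["c_attn"],               # GPT-2's fused attention projection
    layers_to_transform=list(range(6, 12)),  # hypothetical upper-layer range
)

peft_model = get_peft_model(model, lora_config)
peft_model.print_trainable_parameters()  # only a small fraction is trainable

# Each knowledge edit is a (prompt, updated answer) pair; fine-tuning on such
# pairs updates or injects a fact while leaving the base weights frozen.
edit = "Q: Who is the CEO of Twitter? A: Linda Yaccarino"  # example edit (assumed)
inputs = tokenizer(edit, return_tensors="pt")
outputs = peft_model(**inputs, labels=inputs["input_ids"])
outputs.loss.backward()  # an optimizer step over many such edits would follow
```

In such a setup, large-scale editing amounts to ordinary mini-batch fine-tuning of the small adapter weights, which is what makes it cheaper per edit than locate-and-edit methods that compute per-fact weight updates.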

