Approximate Nearest Neighbour Phrase Mining for Contextual Speech Recognition

This paper presents an extension for training end-to-end Context-Aware Transformer Transducer (CATT) models using a simple yet efficient method of mining hard negative phrases from the latent space of the context encoder. During training, given a reference query, we mine a number of similar phrases using approximate nearest neighbour search. These sampled phrases are then used as negative examples in the context list alongside random and ground-truth contextual information. By including approximate nearest neighbour phrases (ANN-P) in the context list, we encourage the learned representation to disambiguate between similar, but not identical, biasing phrases. This improves biasing accuracy when there are several similar phrases in the biasing inventory. We carry out experiments in a large-scale data regime, obtaining up to 7% relative word error rate reductions on the contextual portion of test data. We also extend and evaluate the CATT approach in streaming applications.
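To make the idea concrete, here is a minimal, hypothetical sketch of how a training context list could be assembled: the ground-truth phrase, hard negatives mined by nearest-neighbour search in an embedding space, and random distractors. This is illustrative only; function and variable names are invented, the search here is exact cosine similarity over a small matrix, whereas the paper mines negatives with approximate nearest neighbour search over the context encoder's learned latent space.

```python
import numpy as np

def build_context_list(query_vec, phrase_vecs, phrases, ground_truth_idx,
                       num_ann=2, num_random=2, rng=None):
    """Assemble a biasing context list from the ground-truth phrase,
    nearest-neighbour hard negatives, and random negatives.
    Illustrative sketch, not the paper's implementation."""
    rng = rng or np.random.default_rng(0)
    # Cosine similarity between the query and every candidate phrase.
    q = query_vec / np.linalg.norm(query_vec)
    p = phrase_vecs / np.linalg.norm(phrase_vecs, axis=1, keepdims=True)
    sims = p @ q
    sims[ground_truth_idx] = -np.inf           # never mine the ground truth
    ann_idx = np.argsort(-sims)[:num_ann]      # most similar = hardest negatives
    # Random negatives drawn from the remaining inventory.
    excluded = set(ann_idx) | {ground_truth_idx}
    pool = [i for i in range(len(phrases)) if i not in excluded]
    rand_idx = rng.choice(pool, size=min(num_random, len(pool)), replace=False)
    return ([phrases[ground_truth_idx]]
            + [phrases[i] for i in ann_idx]
            + [phrases[i] for i in rand_idx])
```

With phrase embeddings that place "call tom" and "call don" near the query "call mom", the mined negatives are exactly those confusable phrases, which is what forces the representation to separate similar biasing entries.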
