
AnglE-optimized Text Embeddings

About

High-quality text embedding is pivotal in improving semantic textual similarity (STS) tasks, which are crucial components in Large Language Model (LLM) applications. However, a common challenge existing text embedding models face is the problem of vanishing gradients, primarily due to their reliance on the cosine function in the optimization objective, which has saturation zones. To address this issue, this paper proposes a novel angle-optimized text embedding model called AnglE. The core idea of AnglE is to introduce angle optimization in a complex space. This novel approach effectively mitigates the adverse effects of the saturation zones of the cosine function, which can impede the gradient and hinder the optimization process. To set up a comprehensive STS evaluation, we experimented on existing short-text STS datasets and a newly collected long-text STS dataset from GitHub Issues. Furthermore, we examine domain-specific STS scenarios with limited labeled data and explore how AnglE works with LLM-annotated data. Extensive experiments were conducted on various tasks, including short-text STS, long-text STS, and domain-specific STS tasks. The results show that AnglE outperforms state-of-the-art (SOTA) STS models that ignore the cosine saturation zone. These findings demonstrate the ability of AnglE to generate high-quality text embeddings and the usefulness of angle optimization in STS.
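The saturation problem the abstract describes can be seen directly from the gradient of cosine similarity: for unit vectors separated by angle θ, the gradient norm with respect to either vector is |sin θ|, which vanishes as the vectors become nearly parallel or antiparallel. The sketch below (not the paper's code; `cosine_grad_norm` is a hypothetical helper for illustration) computes this analytically:

```python
import numpy as np

def cosine_grad_norm(theta: float) -> float:
    """Norm of the gradient of cos(a, b) w.r.t. a, for unit vectors
    a and b separated by angle `theta` (in radians)."""
    a = np.array([1.0, 0.0])
    b = np.array([np.cos(theta), np.sin(theta)])
    cos_sim = a @ b
    # Gradient of (a . b) / (|a||b|) w.r.t. a, evaluated at |a| = |b| = 1:
    grad = b - cos_sim * a
    return float(np.linalg.norm(grad))  # equals |sin(theta)|

# Near-parallel vectors sit in the saturation zone: the gradient
# nearly vanishes, so a cosine-based objective barely updates them.
print(round(cosine_grad_norm(0.01), 4))      # 0.01 -> near-zero gradient
print(round(cosine_grad_norm(np.pi / 2), 4))  # 1.0  -> healthy gradient
```

This is the failure mode AnglE's angle optimization in complex space is designed to avoid: the angle itself, unlike its cosine, does not saturate at the extremes.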

Xianming Li, Jing Li • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Sentence Classification Transfer Tasks | SentEval transfer tasks | Average Accuracy | 0.9138 | 99 |
| Semantic Textual Similarity | STS (Semantic Textual Similarity) 2012-2016 (test) | STS-12 Score | 79 | 57 |
| End-to-end Question Answering | Bird | Accuracy | 17.5 | 6 |
| End-to-end Question Answering | OTT-QA | Accuracy | 0.297 | 6 |
| Transfer Learning Evaluation | STS Transfer Robustness (test val) | MRPC | 62.2 | 4 |

Other info

Code
