
RETVec: Resilient and Efficient Text Vectorizer

About

This paper describes RETVec, an efficient, resilient, and multilingual text vectorizer designed for neural-based text processing. RETVec combines a novel character encoding with an optional small embedding model to embed words into a 256-dimensional vector space. The RETVec embedding model is pre-trained using pair-wise metric learning to be robust against typos and character-level adversarial attacks. In this paper, we evaluate and compare RETVec to state-of-the-art vectorizers and word embeddings on popular model architectures and datasets. These comparisons demonstrate that RETVec leads to competitive, multilingual models that are significantly more resilient to typos and adversarial text attacks. RETVec is available under the Apache 2 license at https://github.com/google-research/retvec.
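To make the idea of a vocabulary-free character encoding concrete, here is a minimal, illustrative sketch in plain Python. It assumes the general approach the paper describes (each character's Unicode code point encoded as a fixed-width binary vector, words padded to a fixed length); the function names, the 24-bit width, and the 16-character word length are arbitrary choices for this example, not RETVec's actual implementation.

```python
# Illustrative sketch (not the official RETVec code): map each character to a
# compact binary representation of its Unicode code point, so the vectorizer
# needs no vocabulary or lookup table. Bit width and word length are
# arbitrary example values.

def encode_char(ch: str, bits: int = 24) -> list[int]:
    """Encode one character's Unicode code point as a fixed-width bit vector (LSB first)."""
    cp = ord(ch)
    return [(cp >> i) & 1 for i in range(bits)]

def encode_word(word: str, max_chars: int = 16, bits: int = 24) -> list[list[int]]:
    """Encode a word as a (max_chars x bits) binary matrix, zero-padded."""
    rows = [encode_char(ch, bits) for ch in word[:max_chars]]
    rows += [[0] * bits] * (max_chars - len(rows))
    return rows

matrix = encode_word("café")
print(len(matrix), len(matrix[0]))  # 16 24
```

Because any Unicode code point maps to bits the same way, the encoding is inherently multilingual, and a single-character typo perturbs only one row of the matrix; in RETVec, an optional small embedding model pre-trained with pair-wise metric learning then maps this representation into the 256-dimensional word space.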

Elie Bursztein, Marina Zhang, Owen Vallis, Xinyu Jia, Alexey Kurakin • 2023

Related benchmarks

Task                            | Dataset                                    | Metric              | Result | Rank
Natural Language Understanding  | GLUE (dev)                                 | SST-2 (Acc)         | 91.5   | 504
Natural Language Understanding  | GLUE                                       | SST-2               | 91.5   | 452
Text Classification             | AG News (test)                             | Accuracy            | 93.5   | 210
Text Classification             | Yelp P. (test)                             | Accuracy            | 94.7   | 34
Multiclass Text Classification  | Multilingual Amazon Reviews Corpus (test)  | Accuracy (Avg)      | 93.5   | 24
Text Classification             | MASSIVE (test)                             | Accuracy            | 78.6   | 18
Text Classification             | Average All Datasets                       | Accuracy            | 89.4   | 18
Text Classification             | AG News 1000 examples (test)               | Accuracy (Original) | 93.7   | 6

Code

https://github.com/google-research/retvec