One-shot and few-shot learning of word embeddings

About

Standard deep learning systems require thousands or millions of examples to learn a concept, and cannot integrate new concepts easily. By contrast, humans have an incredible ability to do one-shot or few-shot learning. For instance, from just hearing a word used in a sentence, humans can infer a great deal about it, by leveraging what the syntax and semantics of the surrounding words tell us. Here, we draw inspiration from this to highlight a simple technique by which deep recurrent networks can similarly exploit their prior knowledge to learn a useful representation for a new word from little data. This could make natural language processing systems much more flexible, by allowing them to learn continually from the new words they encounter.
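The core idea can be sketched in code: keep all pretrained parameters frozen and optimize only the new word's embedding from the few contexts in which it appears. The toy vocabulary, matrices, and `learn_new_word` helper below are illustrative assumptions (a minimal skip-gram-style stand-in, not the authors' exact recurrent-network setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pretrained model: input embeddings E and softmax weights W
# are frozen; only the new word's vector is trained.
vocab = ["the", "cat", "sat", "mat", "dog"]
dim = 8
E = rng.normal(scale=0.5, size=(len(vocab), dim))  # frozen input embeddings
W = rng.normal(scale=0.5, size=(len(vocab), dim))  # frozen output weights

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def learn_new_word(context_ids, steps=200, lr=0.5):
    """Fit an embedding for an unseen word from a few observed context
    words, keeping every pretrained parameter fixed (a sketch of the
    few-shot idea, not the paper's exact procedure)."""
    v = np.zeros(dim)  # the new word's embedding: the only trainable part
    for _ in range(steps):
        grad = np.zeros(dim)
        for c in context_ids:
            p = softmax(W @ v)  # predict a context word from v
            p[c] -= 1.0         # gradient of cross-entropy w.r.t. logits
            grad += W.T @ p
        v -= lr * grad / len(context_ids)
    return v

# One observed sentence containing the new word, with contexts "sat", "mat"
contexts = [vocab.index("sat"), vocab.index("mat")]
v_new = learn_new_word(contexts)
probs = softmax(W @ v_new)
# After fitting, the model concentrates probability on the observed contexts.
```

Because the rest of the network never changes, this kind of update cannot catastrophically interfere with previously learned words, which is what makes it attractive for continual learning.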

Andrew K. Lampinen, James L. McClelland • 2017

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Biomedical domain adaptation | Open Medical-LLM leaderboard | Macro Average: 68.9 | 84 |
| Biomedical domain adaptation | Open Medical-LLM leaderboard macro-average | Macro Average Score: 68.9 | 75 |
| Definition Generation | Biomedical domain tokens | Similarity Score: 60.2 | 75 |
| Definition Generation | Multi-word tokens (famous people, places, entities, sayings and concepts) | Correctness: 71.6 | 66 |
| Multiple-choice tasks | FrenchBench | Accuracy: 81.2 | 61 |
