StarSpace: Embed All The Things!
About
We present StarSpace, a general-purpose neural embedding model that can solve a wide variety of problems: labeling tasks such as text classification, ranking tasks such as information retrieval/web search, collaborative filtering-based or content-based recommendation, embedding of multi-relational graphs, and learning word, sentence, or document level embeddings. In each case the model works by embedding entities composed of discrete features and comparing them against each other, learning similarities that depend on the task. Empirical results on a number of tasks show that StarSpace is highly competitive with existing methods, while also being generally applicable to new cases where those methods are not.
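The core idea in the abstract can be sketched in a few lines: every discrete feature gets an embedding vector, an entity (document, label, user, etc.) is embedded as the sum of its feature embeddings, and entities are compared with a similarity function such as the inner product. The vocabulary, dimension, and feature names below are hypothetical toy choices, not part of the StarSpace release; this is a minimal illustration of the comparison step, not the full training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabulary of discrete features (e.g. words, labels, item ids).
vocab = {"good": 0, "movie": 1, "bad": 2, "film": 3,
         "__label__pos": 4, "__label__neg": 5}
dim = 8

# One learned vector per discrete feature (randomly initialized here;
# in StarSpace these are trained with a ranking loss over positive and
# negative entity pairs).
emb = rng.normal(scale=0.1, size=(len(vocab), dim))

def embed(features):
    """Embed an entity as the sum of its feature embeddings."""
    return emb[[vocab[f] for f in features]].sum(axis=0)

def similarity(a, b):
    """Compare two entities in the shared embedding space (inner product)."""
    return float(np.dot(embed(a), embed(b)))

# A document entity scored against two candidate label entities:
doc = ["good", "movie"]
s_pos = similarity(doc, ["__label__pos"])
s_neg = similarity(doc, ["__label__neg"])
```

Because every entity type lives in the same space, the same `similarity` call covers classification (document vs. label), retrieval (query vs. document), and recommendation (user vs. item); only the construction of positive and negative pairs changes per task.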
Ledell Wu, Adam Fisch, Sumit Chopra, Keith Adams, Antoine Bordes, Jason Weston • 2017
Related benchmarks
| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Link Prediction | FB15K (test) | Hits@10 | 0.838 | 164 |
| Dialog utterance prediction | PERSONA-CHAT No Persona v1 | Hits@1 | 0.318 | 6 |
| Dialog utterance prediction | PERSONA-CHAT Original v1 | Hits@1 | 49.1 | 6 |
| Dialog utterance prediction | PERSONA-CHAT Revised v1 | Hits@1 | 0.322 | 6 |
| Information Retrieval | Wikipedia IR (test) | Recall@1 (Top 10001) | 56.8 | 5 |