
CONE: Embeddings for Complex Numerical Data Preserving Unit and Variable Semantics

About

Large pre-trained language models (LMs) and Large Language Models (LLMs) are typically effective at capturing language semantics and contextual relationships. However, these models struggle to maintain strong performance on tasks involving numbers. Blindly treating numerical or structured data as ordinary text terms is inadequate; their semantics must be explicitly understood and encoded by the models. In this paper, we propose CONE, a hybrid transformer-encoder pre-trained model that encodes numbers, ranges, and Gaussians into a distance-preserving embedding vector space. We introduce a novel composite embedding construction algorithm that integrates numerical values, ranges, or Gaussians together with their associated units and attribute names to precisely capture their intricate semantics. We conduct an extensive experimental evaluation on large-scale datasets across diverse domains (web, medical, finance, and government) that demonstrates CONE's strong numerical reasoning capabilities: it achieves an F1 score of 87.28% on DROP, an improvement of up to 9.37% in F1 over state-of-the-art (SOTA) baselines, and outperforms major SOTA models with a Recall@10 gain of up to 25%.
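The abstract does not spell out the composite embedding construction, so the following is only a minimal, illustrative sketch of how a value, range, or Gaussian might be combined with its unit and attribute name into a single embedding. The module names (NumericEncoder, CompositeEmbedder), the log-scaled numeric features, and the toy hashing text encoder are assumptions made for demonstration, not the paper's actual architecture or training procedure.

```python
# Illustrative sketch only: not the CONE implementation. All component names and
# design choices below are assumptions for demonstration purposes.
import torch
import torch.nn as nn


class NumericEncoder(nn.Module):
    """Maps a scalar, a range, or a Gaussian to a fixed-size feature vector.
    Here a point value v is treated as (mu=v, sigma=0), a range [a, b] as its
    midpoint and half-width, and a Gaussian directly as (mu, sigma)."""

    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(4, dim)

    def forward(self, mu: torch.Tensor, sigma: torch.Tensor) -> torch.Tensor:
        # Signed log scaling keeps values of very different magnitudes comparable.
        feats = torch.stack([
            torch.sign(mu) * torch.log1p(mu.abs()),
            torch.log1p(sigma),
            mu,
            sigma,
        ], dim=-1)
        return self.proj(feats)


class CompositeEmbedder(nn.Module):
    """Combines the numeric embedding with embeddings of the unit and the
    attribute (column) name, so that e.g. '5 mg dosage' and '5 km distance'
    land in different regions of the embedding space."""

    def __init__(self, vocab_size: int = 10_000, dim: int = 128):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, dim)  # stand-in for a text encoder
        self.num_enc = NumericEncoder(dim)
        self.mix = nn.Linear(3 * dim, dim)

    def _text_vec(self, text: str) -> torch.Tensor:
        # Toy hashing tokenizer; a real system would use a pre-trained LM encoder.
        ids = torch.tensor([hash(tok) % self.token_emb.num_embeddings
                            for tok in text.lower().split()])
        return self.token_emb(ids).mean(dim=0)

    def forward(self, mu: float, sigma: float, unit: str, attribute: str) -> torch.Tensor:
        num = self.num_enc(torch.tensor(mu), torch.tensor(sigma))
        composite = torch.cat([num, self._text_vec(unit), self._text_vec(attribute)])
        return self.mix(composite)


if __name__ == "__main__":
    embedder = CompositeEmbedder()
    v1 = embedder(5.0, 0.0, "mg", "dosage")                       # point value
    v2 = embedder(150.0, 25.0, "mg", "dosage")                    # range [125, 175] mg
    v3 = embedder(98.6, 0.7, "fahrenheit", "body temperature")    # Gaussian
    print(v1.shape, v2.shape, v3.shape)
```

One convenient property of a sketch like this is that a point value becomes a zero-variance Gaussian and a range is reduced to a midpoint and half-width, so a single numeric encoder handles all three input types; the actual CONE model may make different choices.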

Gyanendra Shrestha, Anna Pyayt, Michael Gubanov • 2026

Related benchmarks

Task              Dataset    Result            Rank
Column matching   CancerKG   Recall@10: 83.3   10
Column matching   CovidKG    Recall@10: 80     10
Column matching   Webtable   Recall@10: 95     10
Column matching   CIUS       Recall@10: 90     10
Column matching   SAUS       Recall@10: 90     10
Tuple matching    CancerKG   Recall@10: 86.7   10
Tuple matching    CovidKG    Recall@10: 80     10
Tuple matching    Webtable   Recall@10: 90     10
Tuple matching    CIUS       Recall@10: 90     10
Tuple matching    SAUS       Recall@10: 85     10

(Showing 10 of 21 rows.)
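The rows above report Recall@10 for column and tuple matching. For readers unfamiliar with the metric, below is a minimal sketch of how Recall@K is commonly computed for matching tasks, assuming one true match per query and a ranked candidate list returned by the system; the function and variable names are illustrative, not taken from the paper.

```python
# Illustrative sketch of Recall@K for matching tasks; assumes one relevant item per query.
from typing import Hashable, Sequence


def recall_at_k(ranked: Sequence[Sequence[Hashable]],
                relevant: Sequence[Hashable],
                k: int = 10) -> float:
    """Percentage of queries whose true match appears in the top-k candidates."""
    hits = sum(1 for cands, gold in zip(ranked, relevant) if gold in cands[:k])
    return 100.0 * hits / len(relevant)


# Toy usage: 3 queries; the true match is retrieved in the top 10 for 2 of them.
ranked_lists = [["c3", "c7", "c1"], ["c9", "c2"], ["c5"]]
gold_matches = ["c1", "c4", "c5"]
print(recall_at_k(ranked_lists, gold_matches, k=10))  # ~66.7
```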
