
TabTransformer: Tabular Data Modeling Using Contextual Embeddings

About

We propose TabTransformer, a novel deep tabular data modeling architecture for supervised and semi-supervised learning. The TabTransformer is built upon self-attention based Transformers. The Transformer layers transform the embeddings of categorical features into robust contextual embeddings to achieve higher prediction accuracy. Through extensive experiments on fifteen publicly available datasets, we show that the TabTransformer outperforms the state-of-the-art deep learning methods for tabular data by at least 1.0% on mean AUC, and matches the performance of tree-based ensemble models. Furthermore, we demonstrate that the contextual embeddings learned from TabTransformer are highly robust against both missing and noisy data features, and provide better interpretability. Lastly, for the semi-supervised setting we develop an unsupervised pre-training procedure to learn data-driven contextual embeddings, resulting in an average 2.1% AUC lift over the state-of-the-art methods.
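The architecture described above can be sketched compactly: each categorical column gets its own embedding table, the resulting sequence of column embeddings is passed through Transformer encoder layers to produce contextual embeddings, and these are flattened and concatenated with the (layer-normalized) continuous features before a final MLP head. The sketch below assumes PyTorch; all dimensions, cardinalities, and layer sizes are illustrative, not the paper's exact hyperparameters.

```python
import torch
import torch.nn as nn

class TabTransformerSketch(nn.Module):
    """Minimal sketch of the TabTransformer forward pass (illustrative sizes)."""

    def __init__(self, cardinalities, n_cont, d_model=32, n_heads=4, n_layers=2):
        super().__init__()
        # One embedding table per categorical column.
        self.embeds = nn.ModuleList(nn.Embedding(c, d_model) for c in cardinalities)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=64, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.norm = nn.LayerNorm(n_cont)
        self.mlp = nn.Sequential(
            nn.Linear(len(cardinalities) * d_model + n_cont, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, x_cat, x_cont):
        # Embed each categorical column and stack into a "sequence" of tokens.
        tokens = torch.stack(
            [emb(x_cat[:, i]) for i, emb in enumerate(self.embeds)], dim=1
        )
        ctx = self.encoder(tokens)          # contextual embeddings per column
        flat = ctx.flatten(1)               # (batch, n_cat * d_model)
        x = torch.cat([flat, self.norm(x_cont)], dim=1)
        return self.mlp(x).squeeze(-1)      # one logit per row (binary task)

# Toy usage: 3 categorical columns, 4 continuous features, batch of 8.
model = TabTransformerSketch(cardinalities=[5, 7, 3], n_cont=4)
logits = model(torch.randint(0, 3, (8, 3)), torch.randn(8, 4))
print(logits.shape)  # torch.Size([8])
```

In the semi-supervised setting, the same Transformer body would first be pre-trained without labels (the paper uses an unsupervised procedure over the categorical embeddings) and then fine-tuned with the MLP head on the labeled subset.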

Xin Huang, Ashish Khetan, Milan Cvitkovic, Zohar Karnin • 2020

Related benchmarks

| Task | Dataset | Result (AUROC) | Rank |
|---|---|---|---|
| Binary Classification | cylinder-bands (CB) (test) | 0.855 | 40 |
| Binary Classification | dresses-sales (DS) (test) | 64.8 | 40 |
| Binary Classification | income IC 1995 (test) | 0.882 | 39 |
| Credit approval prediction | Credit Approval dataset (test) | 0.86 | 37 |
| Binary Classification | adult (AD) (test) | 0.914 | 32 |
| Tabular Classification | Adult (test) | 91.4 | 28 |
| Binary Classification | insurance-co IO (test) | 0.794 | 27 |
| Binary Classification | credit-g (CG) (test) | 71.8 | 27 |
| Binary Classification | blastchar (BL) (test) | 0.82 | 27 |
| Mortality Prediction | N00079274 (test) | 0.7178 | 24 |
Showing 10 of 39 rows
