
TabMT: Generating tabular data with masked transformers

About

Autoregressive and Masked Transformers are incredibly effective as generative models and classifiers. While these models are most prevalent in NLP, they also exhibit strong performance in other domains, such as vision. This work contributes to the exploration of transformer-based models in synthetic data generation for diverse application domains. In this paper, we present TabMT, a novel Masked Transformer design for generating synthetic tabular data. TabMT effectively addresses the unique challenges posed by heterogeneous data fields and is natively able to handle missing data. Our design leverages improved masking techniques to allow for generation and demonstrates state-of-the-art performance from extremely small to extremely large tabular datasets. We evaluate TabMT for privacy-focused applications and find that it is able to generate high quality data with superior privacy tradeoffs.
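To make the masked-generation idea concrete, here is a minimal sketch of the sampling loop such a model implies: start from a fully masked row, then unmask the fields one at a time in a random order, filling each from the model's prediction. The `fit_marginals` stub below is a hypothetical stand-in for the transformer (it samples per-column empirical values rather than conditioning on the unmasked fields); names and structure are illustrative assumptions, not TabMT's actual implementation.

```python
import random

MASK = object()  # sentinel marking a still-masked field

def fit_marginals(rows):
    """Stub 'model': per-column empirical value pools.
    A real masked transformer would instead predict each masked
    field conditioned on the currently unmasked ones."""
    return [list(col) for col in zip(*rows)]

def generate_row(marginals, rng=random):
    """Masked-generation loop: begin fully masked, then unmask
    fields one at a time in a random order, sampling each value
    from the (stub) model."""
    row = [MASK] * len(marginals)
    order = list(range(len(marginals)))
    rng.shuffle(order)
    for j in order:
        # In TabMT this draw would condition on row's unmasked fields.
        row[j] = rng.choice(marginals[j])
    return row

# Toy heterogeneous table: numeric, categorical, and binary fields.
train = [(25, "blue", 1), (40, "red", 0), (33, "blue", 1)]
sample = generate_row(fit_marginals(train))
```

Because every field passes through the same mask-then-predict mechanism, missing values in real data can simply be left masked, which is one way to read the paper's claim of native missing-data handling.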

Manbir S. Gulati, Paul F. Roysdon • 2023

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
|---|---|---|---|---|
| Tabular Classification | diabetes 37 (test) | Test Error | 76.9 | 15 |
| Tabular Data Utility | California (test) | AUC | 0.988 | 14 |
| Tabular Data Utility | Default (test) | AUC | 0.714 | 14 |
| Tabular Data Utility | Adult (test) | AUC | 0.873 | 14 |
| Tabular Data Utility | Magic (test) | AUC | 0.822 | 14 |
| Tabular Data Synthesis | Aggregate of five tabular datasets (full train vs original train) | Marginal Error | 4.46 | 13 |
| Tabular Data Utility | Shoppers (test) | AUC | 0.912 | 13 |
| Tabular Data Generation | BU | MLE | 90.8 | 6 |
| Tabular Classification | BU (test) | MLE Score | 0.908 | 6 |
| Tabular Data Generation | CH | MLE | 0.741 | 6 |

Showing 10 of 42 rows.
