
BERT4ETH: A Pre-trained Transformer for Ethereum Fraud Detection

About

As various forms of fraud proliferate on Ethereum, it is imperative to safeguard against these malicious activities to protect susceptible users from being victimized. While current studies solely rely on graph-based fraud detection approaches, it is argued that they may not be well-suited for dealing with highly repetitive, skew-distributed and heterogeneous Ethereum transactions. To address these challenges, we propose BERT4ETH, a universal pre-trained Transformer encoder that serves as an account representation extractor for detecting various fraud behaviors on Ethereum. BERT4ETH features the superior modeling capability of Transformer to capture the dynamic sequential patterns inherent in Ethereum transactions, and addresses the challenges of pre-training a BERT model for Ethereum with three practical and effective strategies, namely repetitiveness reduction, skew alleviation and heterogeneity modeling. Our empirical evaluation demonstrates that BERT4ETH outperforms state-of-the-art methods with significant enhancements in terms of the phishing account detection and de-anonymization tasks. The code for BERT4ETH is available at: https://github.com/git-disl/BERT4ETH.
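To make the repetitiveness-reduction strategy concrete, here is a minimal, illustrative sketch: collapsing repeated counterparty addresses in an account's transaction sequence before feeding it to a Transformer encoder. The function name and the simple first-occurrence deduplication rule are assumptions for illustration; they are not the authors' exact algorithm (see the linked repository for the real implementation).

```python
def reduce_repetition(tx_sequence):
    """Collapse repeated counterparty addresses in a transaction
    sequence, keeping only the first occurrence in order.

    Illustrative sketch of the 'repetitiveness reduction' idea
    described in the abstract; the paper's actual preprocessing
    may differ.
    """
    seen = set()
    reduced = []
    for addr in tx_sequence:
        if addr not in seen:
            seen.add(addr)
            reduced.append(addr)
    return reduced

# A highly repetitive (hypothetical) transaction history:
txs = ["0xA", "0xB", "0xA", "0xA", "0xC", "0xB"]
print(reduce_repetition(txs))  # -> ['0xA', '0xB', '0xC']
```

Deduplicating this way shortens the input sequence and keeps the vocabulary of addresses the encoder must attend over small, which is the motivation the abstract gives for handling highly repetitive Ethereum transactions.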

Sihao Hu, Zhen Zhang, Bingqiao Luo, Shengliang Lu, Bingsheng He, Ling Liu • 2023

Related benchmarks

| Task | Dataset | Result | Rank |
| --- | --- | --- | --- |
| Fraud Detection | Ethereum | Precision: 81.65 | 21 |
| Account Detection | Bitcoin-L (test) | Precision: 84.68 | 17 |
| Account Detection | Bitcoin-M (test) | Precision: 80.03 | 17 |
| Phishing Node Detection | Phishing Node Detection Dataset | Precision: 0.81 | 6 |
