Converting Transformers into DGNNs Form

About

Recent advances in deep learning have established Transformer architectures as the predominant modeling paradigm. Central to the success of Transformers is the self-attention mechanism, which scores the similarity between query and key matrices to modulate a value matrix. This operation bears striking similarities to digraph convolution, prompting an investigation into whether digraph convolution could serve as an alternative to self-attention. In this study, we formalize this concept by introducing a synthetic unitary digraph convolution based on the digraph Fourier transform. The resulting model, which we term Converter, effectively converts a Transformer into a Directed Graph Neural Network (DGNN) form. We have tested Converter on the Long-Range Arena benchmark, long document classification, and DNA sequence-based taxonomy classification. Our experimental results demonstrate that Converter achieves superior performance while maintaining computational efficiency and architectural simplicity, which establishes it as a lightweight yet powerful Transformer variant.
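The analogy between self-attention and digraph convolution can be made concrete: the softmax-normalized attention matrix is a row-stochastic, generally asymmetric matrix, i.e. a weighted directed-graph propagation operator applied to the value matrix. The sketch below illustrates this structural correspondence only; it is a minimal NumPy illustration, not the paper's Converter model, and the function names and shapes are assumptions for exposition.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Standard scaled dot-product self-attention on a sequence X (n x d).

    The query-key similarity matrix A modulates the value matrix,
    exactly as described in the abstract.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Row-wise softmax: A is row-stochastic and generally asymmetric,
    # i.e. the adjacency matrix of a weighted digraph over tokens.
    A = np.exp(scores - scores.max(axis=-1, keepdims=True))
    A /= A.sum(axis=-1, keepdims=True)
    return A @ V

def digraph_convolution(X, A, W):
    """One digraph convolution step: an asymmetric propagation matrix A
    aggregates token (node) features, followed by a linear map W.

    Structurally this matches A @ V above, which motivates replacing
    learned attention with a fixed or synthesized digraph operator.
    """
    return A @ X @ W
```

Viewed this way, self-attention is a digraph convolution whose adjacency matrix is recomputed from the input at every layer; Converter instead builds the propagation operator from a digraph Fourier transform.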

Jie Zhang, Mao-Hsuan Mao, Bo-Wei Chiu, Min-Te Sun • 2025

Related benchmarks

| Task | Dataset | Metric | Result | Rank |
| --- | --- | --- | --- | --- |
| Long-sequence modeling | Long Range Arena (LRA) v1 (test) | ListOps | 60.38 | 66 |
| DNA Sequence-based Taxonomy Classification | Ensembl (B/S) 3 (test) | Accuracy | 84.59 | 9 |
| DNA Sequence-based Taxonomy Classification | Ensembl (M/R) 3 (test) | Accuracy | 59.49 | 9 |
| Long Document Classification | LongDoc 16K (test) | Accuracy | 81.77 | 9 |
| Long Document Classification | LongDoc 32K (test) | Accuracy | 82.34 | 9 |