
NRGPT: An Energy-based Alternative for GPT

About

Generative Pre-trained Transformer (GPT) architectures are the most popular design for language modeling. Energy-based modeling (EBM) is a different paradigm that views inference as a dynamical process operating on an energy landscape. We propose a minimal modification of the GPT setting that unifies it with the EBM framework. The inference step of our model, which we call eNeRgy-GPT (NRGPT), is conceptualized as an exploration of tokens on the energy landscape. We prove, and verify empirically, that under certain conditions this exploration becomes gradient descent, although such dynamics do not necessarily yield the best-performing models. We demonstrate that our model performs well on simple language (the Shakespeare dataset), algebraic ListOps tasks, and richer settings such as OpenWebText language modeling. We also observe that our models may be more resistant to overfitting, overfitting only after very long training.
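
The paper's actual energy function and update rule are not reproduced on this page. As a rough illustration of the core idea, that next-token inference can be run as gradient descent on an energy landscape rather than as a single forward pass, consider the toy PyTorch sketch below. The energy function, step count, learning rate, and snap-to-vocabulary step are all illustrative assumptions, not NRGPT's actual procedure.

    # Hypothetical sketch of energy-based next-token inference: a candidate
    # token embedding x is relaxed by gradient descent on an energy E(context, x),
    # then snapped to the nearest vocabulary token. All names and choices here
    # are assumptions for illustration, not taken from the NRGPT paper.
    import torch

    vocab_size, d_model = 100, 32
    token_emb = torch.nn.Embedding(vocab_size, d_model)

    def energy(context: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        """Toy energy: negative similarity between a context summary and x."""
        return -(context.mean(dim=0) * x).sum()

    def infer_next_token(context_ids: torch.Tensor,
                         steps: int = 50, lr: float = 0.1) -> int:
        context = token_emb(context_ids).detach()     # (T, d_model)
        x = torch.zeros(d_model, requires_grad=True)  # candidate embedding
        opt = torch.optim.SGD([x], lr=lr)
        for _ in range(steps):                        # descend the energy landscape
            opt.zero_grad()
            energy(context, x).backward()
            opt.step()
        # Snap the relaxed embedding to the nearest vocabulary token.
        dists = torch.cdist(x.detach().unsqueeze(0), token_emb.weight.detach())
        return int(dists.argmin())

    print(infer_next_token(torch.tensor([1, 2, 3])))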

Nima Dehmamy, Benjamin Hoover, Bishwajit Saha, Leo Kozachkov, Jean-Jacques Slotine, Dmitry Krotov • 2025

Related benchmarks

Task                     Dataset                     Result                   Rank
Language Modeling        OpenWebText (val)           --                       70
Language Understanding   MMLU                        Medicine Accuracy 30.5   17
Language Modeling        OpenWebText (OWT) (val)     Perplexity 104           12
Language Modeling        OpenWebText (train)         Train Loss 3.391         11
Language Modeling        OpenWebText (OWT) (test)    --                       7
Language Modeling        Shakespeare                 Perplexity 283           5
