| Task Name | Model / Dataset | Metric | SOTA Result | Trend |
|---|---|---|---|---|
| Training Throughput | GPT2-1.5B | Throughput | 4.1 | 25 |
| Training Data Attribution | GPT2-small | LDS Score | 0.3936 | 10 |
| Output-based feature description faithfulness | GPT2 MLP SAE | Faithfulness Score | 40.9 | 8 |
| Input-based feature description faithfulness | GPT2 MLP SAE | Faithfulness Score | 51.2 | 8 |
| Output-based feature description faithfulness | GPT2 Res. SAE | Faithfulness Score | 47.2 | 8 |
| Input-based feature description faithfulness | GPT2 Res. SAE | Faithfulness Score | 60.4 | 8 |
| Private text generation | GPT2-base (124M) | Usage Fraction | 100 | 7 |
| Private Inference | GPT2-base (124M) | Embed Inference Time (s) | 5.17 | 7 |
| Adversarial Attack | GPT2 (fine-tuned) | ASR (%) | 74.25 | 6 |