
Attention Operator Throughput on Llama2 7B (32 query heads, 32 KV heads, head dimension 128)

Top result: 207.3 Attention TFLOPS (flash-attn v2)

[Chart omitted: Attention TFLOPS over time, latest data point Jun 14, 2025.]
Updated 4d ago
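A minimal sketch of how an "Attention TFLOPS" figure like the ones below is typically derived: count the floating-point operations of the two attention matmuls (QK^T and P·V) for the benchmarked shape, then divide by measured kernel time. The head count (32) and head dimension (128) come from the leaderboard title; the batch size, sequence length, causal flag, and runtime below are illustrative assumptions, not values from this page.

```python
def attention_flops(batch, heads, seq_len, head_dim, causal=False):
    """FLOPs for one forward attention pass (QK^T plus P*V).

    Each (s, d) x (d, s) matmul costs 2 * s * s * d FLOPs
    (multiply + add), so the two matmuls together cost
    4 * s^2 * d per head; a causal mask halves the useful work.
    """
    flops = 4 * batch * heads * seq_len * seq_len * head_dim
    return flops // 2 if causal else flops


# Llama2 7B attention shape from the title: 32 query heads, head dim 128.
# Batch size and sequence length are assumptions for illustration.
flops = attention_flops(batch=1, heads=32, seq_len=4096, head_dim=128)

# Throughput = FLOPs / runtime; the 1.3 ms kernel time is hypothetical.
runtime_s = 1.3e-3
tflops = flops / runtime_s / 1e12
print(f"{tflops:.1f} TFLOPS")
```

Note that frameworks differ on whether the causal-mask halving and softmax FLOPs are counted, so the same kernel can be quoted at noticeably different TFLOPS depending on the convention.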

Evaluation Results

Date       Attention TFLOPS
2025.06    207.3
2025.06    202.7
2025.06    201.5
2025.06    201.2
2025.06    198.3
2025.06    197.2
2025.06    188.3
2025.06    186.7
2025.06    180.3
2025.06    176.8
2025.06    173.4
2025.06    167.1
2025.06    164.1
2025.06    160.6
2025.06    158.4
2025.06    152.5
2025.06    150.2
2025.06    142.6
2025.06    137.1
2025.06    128.6
2025.06    122.5
2025.06    112.4
2025.06    108.6
2025.06    82.8
2025.06    15.1
2025.06    15.1
2025.06    14.6
2025.06    14.6
2025.06    13.3
2025.06    10.9