Fine-tuning Resource Usage on LLaMA-2-7B
[Chart: per-method memory usage breakdown — weights, gradients, weight copies, momentum state, variance state, activations, total, and peak memory. Snapshot dated Oct 11, 2023.]
Evaluation Results
All memory figures in GB.

| Method | Date | Weights | Gradients | Weight Copies | Momentum State | Variance State | Activations | Total | Peak |
|---|---|---|---|---|---|---|---|---|---|
| QFT | 2023.10 | 7.42 | 7.06 | - | 7.06 | - | 3.75 | 25.3 | 28.9 |
| Adam (FP16 mixed precision) | 2023.10 | 12.6 | 12.6 | 25.1 | 25.1 | 25.1 | 3.75 | 104 | 123 |
| Adam (FP32) | 2023.10 | 25.1 | 25.1 | - | 25.1 | 25.1 | 3.75 | 104 | 129 |
| Bitsandbytes | 2023.10 | 25.1 | 25.1 | - | 6.31 | 6.31 | 3.75 | 66.6 | 86.6 |
| Lion (FP32) | 2023.10 | 25.1 | 25.1 | - | 25.1 | - | 3.75 | 79.1 | 101 |
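The per-component figures above follow directly from bytes-per-parameter arithmetic. A minimal sketch of that calculation, assuming LLaMA-2-7B has roughly 6.74 billion parameters and the table reports GiB-scale values (both are assumptions, not stated on this page):

```python
GIB = 1024 ** 3
N_PARAMS = 6.74e9  # approximate LLaMA-2-7B parameter count (assumption)

def copy_size_gib(n_params: float, bytes_per_param: int) -> float:
    """Memory for one full copy of the parameters at a given precision."""
    return n_params * bytes_per_param / GIB

# Plain FP32 Adam keeps four FP32 copies of the parameter count:
# weights, gradients, momentum state, and variance state (4 bytes each).
weights   = copy_size_gib(N_PARAMS, 4)  # roughly 25.1, matching the table
gradients = copy_size_gib(N_PARAMS, 4)
momentum  = copy_size_gib(N_PARAMS, 4)
variance  = copy_size_gib(N_PARAMS, 4)

activations = 3.75  # taken from the table; depends on batch/sequence length
total = weights + gradients + momentum + variance + activations
```

Running this gives a total close to the 104 reported for Adam-FP32, which is why halving precision (FP16 mixed) or quantizing optimizer state (Bitsandbytes) shrinks exactly the columns where those copies live.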