EXL3 quantization of reka-flash-3, 3 bits per weight.
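To try the quant you first need the weight files locally. Below is a minimal download sketch using the `huggingface_hub` Python client; the repo id `isogen/reka-flash-3-exl3-3bpw` is the one this card describes, and running inference afterwards is assumed to go through an EXL3-capable backend such as exllamav3 (not shown here, since its API is outside this card).

```python
# Minimal sketch: fetch the EXL3 weights from the Hub with huggingface_hub.
# Only the download step is shown; inference requires an EXL3-capable backend
# (e.g. exllamav3), whose loading API is not covered here.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="isogen/reka-flash-3-exl3-3bpw",  # this quantized repo
)
print(f"EXL3 model files downloaded to: {local_dir}")
```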
HumanEval (argmax)
| Model | Q4 | Q8 | FP16 |
|---|---|---|---|
| reka-flash-3-exl3-3bpw | 87.8 | 90.2 | 90.9 |
| reka-flash-3-exl3-4bpw | 89.0 | 88.4 | 87.2 |
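The scores above come from greedy (argmax) decoding. As a rough illustration of how such a number can be reproduced, the sketch below uses OpenAI's `human-eval` package to collect one greedy completion per task and write them out for scoring; `generate_completion` is a hypothetical placeholder for whatever EXL3 inference backend you use, not part of this repo.

```python
# Hedged sketch of an argmax HumanEval run using OpenAI's human-eval package.
# `generate_completion` is a hypothetical stand-in for greedy decoding with an
# EXL3 backend; scoring happens afterwards with the package's
# `evaluate_functional_correctness` command-line tool.
from human_eval.data import read_problems, write_jsonl


def generate_completion(prompt: str) -> str:
    # Placeholder: return a greedy (temperature 0 / argmax) completion for
    # the given HumanEval prompt from your inference backend.
    raise NotImplementedError


problems = read_problems()
samples = [
    {"task_id": task_id, "completion": generate_completion(problem["prompt"])}
    for task_id, problem in problems.items()
]
write_jsonl("samples.jsonl", samples)
# Then score with: evaluate_functional_correctness samples.jsonl
```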
Base model: RekaAI/reka-flash-3