Use the latest version of the transformers library with this model.
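
A minimal loading sketch, assuming the checkpoint loads through the standard `AutoModelForCausalLM` API in a recent transformers release (with the `hqq` package installed for the HQQ-quantized weights); the prompt and generation settings are only illustrative:

```python
# Sketch: load the HQQ-quantized DeepSeek-Math checkpoint with transformers.
# Assumes a recent transformers version and `pip install hqq`; adjust if the
# checkpoint requires a dedicated HQQ loading path instead.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "brunopio/deepseek-math-7b-instruct-nbits4-GS64-Axis1-HQQ-T"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # matches the FP16 tensor type listed below
    device_map="auto",
)

# Illustrative prompt for a math-instruct model.
prompt = "What is the derivative of x^2?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```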

Format: Safetensors
Model size: 4.06B params
Tensor types: FP16 · I64 · U8
