Use the latest version of transformers with this model.
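
The snippet below is a minimal loading sketch, assuming the checkpoint can be loaded through the standard AutoModelForCausalLM / AutoTokenizer API with a recent transformers release (and the hqq package installed, since this is an HQQ-quantized checkpoint). The prompt and generation settings are illustrative only, not part of the original card.

```python
# Minimal usage sketch for this HQQ-quantized checkpoint.
# Assumes: recent `transformers` and `hqq` are installed (pip install -U transformers hqq)
# and that the checkpoint loads via the standard Auto* API; adjust if your setup differs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "brunopio/DeepSeek-R1-Distill-Qwen-1.5B-nbits4-GS64-Axis1-HQQ-T"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # FP16 tensors per the model card
    device_map="auto",
)

# Example prompt; replace with your own input.
prompt = "Explain what 4-bit quantization changes about a language model."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```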

Format: Safetensors
Model size: 1.16B params
Tensor types: I64 · FP16 · U8

Model tree for brunopio/DeepSeek-R1-Distill-Qwen-1.5B-nbits4-GS64-Axis1-HQQ-T: quantized from DeepSeek-R1-Distill-Qwen-1.5B (one of 184 quantized variants of the base model).