Use the latest version of the `transformers` library with this model.

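A minimal loading sketch under stated assumptions: the exact loading path for this HQQ-quantized checkpoint is not documented here, so the snippet assumes it can be loaded directly through `AutoModelForCausalLM` with `trust_remote_code=True` (the CodeGeeX4 base model ships custom modeling code). Install or upgrade `transformers` first (`pip install -U transformers`).

```python
# Hedged sketch: assumes this HQQ checkpoint loads via the standard
# AutoModelForCausalLM path; adjust if the repo documents another loader.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "brunopio/codegeex4-all-9b-nbits4-GS64-Axis1-HQQ-T"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # spread layers across available GPU(s)/CPU
    trust_remote_code=True,  # CodeGeeX4 uses custom modeling code
)

# Simple generation example (prompt is illustrative only)
prompt = "# write a quicksort function in python\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```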
Model size: 5.28B params (Safetensors)
Tensor types: I64, FP16, U8
