
Model tree for VoidStare/L3.3-Mokume-Gane-R1-70b-v1.1-EXL2-6.5bpw-h8