IQ2_S quant built from an importance matrix (imatrix) that was itself generated from a Q4_K quant, because I can't run anything higher on my potato PC. Use at your own risk.
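For reference, a quant like this can be produced with llama.cpp's `llama-imatrix` and `llama-quantize` tools. A minimal sketch of the two-step process described above; the file names and calibration corpus are placeholders, not the exact ones used here:

```shell
# 1. Generate an importance matrix by running an existing quant (here the
#    Q4_K, since the full-precision model doesn't fit) over a calibration text.
./llama-imatrix -m GLM-4-32B-0414-Q4_K_M.gguf -f calibration.txt -o imatrix.dat

# 2. Quantize down to IQ2_S, using the imatrix to protect the weights
#    that matter most at 2-bit precision.
./llama-quantize --imatrix imatrix.dat GLM-4-32B-0414-F16.gguf \
    GLM-4-32B-0414-IQ2_S.gguf IQ2_S
```

Deriving the imatrix from a Q4_K quant instead of the full-precision model is the compromise mentioned above: it trades some imatrix accuracy for being able to run the calibration pass at all on limited hardware.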
Model tree for ilintar/THUDM-GLM-4-32B-0414-IQ2_S.GGUF
- Base model: THUDM/GLM-4-32B-0414