Q4_K_M (4-bit) GGUF quantization of the Mistral-Small-24B-Instruct-2501 model
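Q4_K_M is one of llama.cpp's k-quant formats; it mixes 4-bit and 6-bit blocks plus per-block scales, averaging roughly 4.85 bits per weight. A quick back-of-envelope sketch of the resulting weight-storage footprint for a 24B-parameter model (the bits-per-weight figure is an approximation, and exact file sizes vary):

```python
# Rough weight-storage estimate for a 24B-parameter model at different
# precisions. The ~4.85 bits/weight value for Q4_K_M is approximate
# (llama.cpp mixes 4- and 6-bit blocks plus scales); real GGUF files differ.
PARAMS = 24e9  # parameter count of Mistral-Small-24B


def size_gb(bits_per_weight: float) -> float:
    """Approximate weight storage in gigabytes (1 GB = 1e9 bytes)."""
    return PARAMS * bits_per_weight / 8 / 1e9


fp16 = size_gb(16.0)   # half-precision baseline: ~48 GB
q4km = size_gb(4.85)   # approximate Q4_K_M average: ~14.5 GB

print(f"FP16:   ~{fp16:.1f} GB")
print(f"Q4_K_M: ~{q4km:.1f} GB ({fp16 / q4km:.1f}x smaller)")
```

This is why the Q4_K_M variant fits on a single 24 GB consumer GPU (with room for context), while the FP16 weights alone do not.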
Inference Providers
This model isn't deployed by any Inference Provider.
Model tree for dataplayer12/Mistral-Small-24B-Instruct-Q4_K_M-GGUF
Base model: mistralai/Mistral-Small-24B-Base-2501