Mungert/Mistral-Small-3.1-24B-Instruct-2503-GGUF
Likes: 9
Tags: Image-Text-to-Text · GGUF · 24 languages · vllm · imatrix · conversational
License: apache-2.0
Files and versions (commit 401ab0a)
1 contributor · History: 12 commits
Latest commit by Mungert: "Upload Mistral-Small-3.1-24B-Instruct-2503-q2_k_l.gguf with huggingface_hub" (401ab0a, verified, 4 months ago)
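The table below reproduces the Hub's file browser view. The same listing and commit history can also be retrieved programmatically with the huggingface_hub client (the library named in the upload commits). A minimal sketch, assuming huggingface_hub is installed, the repository id is unchanged, and list_repo_commits is available in your installed release:

```python
from huggingface_hub import HfApi

api = HfApi()
repo_id = "Mungert/Mistral-Small-3.1-24B-Instruct-2503-GGUF"

# Enumerate every file tracked in the repository
# (the .gguf quantizations plus .gitattributes).
for path in api.list_repo_files(repo_id):
    print(path)

# Walk the commit history; this snapshot shows 12 commits at revision 401ab0a.
for commit in api.list_repo_commits(repo_id):
    print(commit.commit_id[:7], commit.title)
```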
All .gguf files are marked Safe and stored via Xet. Each file's last commit is an upload of the form "Upload <filename> with huggingface_hub" made 4 months ago (.gitattributes was last touched by the q2_k_l.gguf upload commit).

| File | Size | Updated |
|---|---|---|
| .gitattributes | 2.46 kB | 4 months ago |
| Mistral-Small-3.1-24B-Instruct-2503-bf16-q4_k.gguf | 16.1 GB | 4 months ago |
| Mistral-Small-3.1-24B-Instruct-2503-bf16-q6_k.gguf | 20.9 GB | 4 months ago |
| Mistral-Small-3.1-24B-Instruct-2503-bf16-q8_0.gguf | 26.3 GB | 4 months ago |
| Mistral-Small-3.1-24B-Instruct-2503-f16-q4_k.gguf | 16.1 GB | 4 months ago |
| Mistral-Small-3.1-24B-Instruct-2503-f16-q6_k.gguf | 20.9 GB | 4 months ago |
| Mistral-Small-3.1-24B-Instruct-2503-f16-q8_0.gguf | 26.3 GB | 4 months ago |
| Mistral-Small-3.1-24B-Instruct-2503-q2_k_l.gguf | 9.55 GB | 4 months ago |
| Mistral-Small-3.1-24B-Instruct-2503-q4_k_l.gguf | 14.8 GB | 4 months ago |
| Mistral-Small-3.1-24B-Instruct-2503-q4_k_m.gguf | 14.3 GB | 4 months ago |
| Mistral-Small-3.1-24B-Instruct-2503-q4_k_s.gguf | 13.5 GB | 4 months ago |
| Mistral-Small-3.1-24B-Instruct-2503-q5_k_m.gguf | 16.8 GB | 4 months ago |
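To run one of these quantizations locally, a single file can be fetched with hf_hub_download from the same huggingface_hub library. A minimal sketch, using the q4_k_m filename from the listing above (swap in any other variant as needed):

```python
from huggingface_hub import hf_hub_download

# Download one quantization variant into the local Hugging Face cache
# and return its filesystem path.
model_path = hf_hub_download(
    repo_id="Mungert/Mistral-Small-3.1-24B-Instruct-2503-GGUF",
    filename="Mistral-Small-3.1-24B-Instruct-2503-q4_k_m.gguf",
)
print(model_path)
```

The resulting .gguf path can then be passed to any GGUF-aware runtime such as llama.cpp or llama-cpp-python. As a rough guide from the sizes above, q2_k_l (9.55 GB) targets the tightest memory budgets, the q4_k variants (13.5 to 14.8 GB) are the usual middle ground, and q8_0 (26.3 GB) stays closest to the unquantized weights.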