dahara1/gemma-3-12b-it-qat-japanese-imatrix
Likes: 6
Tags: GGUF · Japanese · imatrix · conversational
Files and versions (branch: main)
1 contributor · History: 17 commits
Latest commit: 87d294a (verified), "Update README.md" by dahara1, 3 months ago
File | Size | Last commit
.gitattributes | 3.42 kB | Upload gemma-3-12b-it-qat-q4_0-japanese-imatrix-IQ2_M.gguf
README.md | 1.28 kB | Update README.md
gemma-3-12B-it-qat-unquantized-BF16.gguf | 23.5 GB (LFS) | Upload gemma-3-12B-it-qat-unquantized-BF16.gguf
gemma-3-12b-it-qat-q4_0-japanese-imatrix-IQ2_M.gguf | 4.31 GB (LFS) | Upload gemma-3-12b-it-qat-q4_0-japanese-imatrix-IQ2_M.gguf
gemma-3-12b-it-qat-q4_0-japanese-imatrix-IQ2_XS.gguf | 3.84 GB (LFS) | Upload 3 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-IQ2_XXS.gguf | 3.53 GB (LFS) | Upload 3 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-IQ3_M.gguf | 5.66 GB (LFS) | Upload 3 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-IQ3_XS.gguf | 5.21 GB (LFS) | Upload 5 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-IQ3_XXS.gguf | 4.78 GB (LFS) | Upload 5 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-IQ4_XS.gguf | 6.55 GB (LFS) | Upload 5 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q4_0.gguf | 6.89 GB (LFS) | Upload gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q4_0.gguf
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q4_K-f16.gguf | 8.49 GB (LFS) | Upload 5 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q4_K_L.gguf | 7.54 GB (LFS) | Upload 4 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q4_K_M.gguf | 7.3 GB (LFS) | Upload 4 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q4_K_S.gguf | 6.94 GB (LFS) | Upload 4 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q5_K-f16.gguf | 9.63 GB (LFS) | Upload 4 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q5_K_L.gguf | 8.69 GB (LFS) | Upload 4 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q5_K_M.gguf | 8.45 GB (LFS) | Upload 4 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q5_K_S.gguf | 8.23 GB (LFS) | Upload 4 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q6_K-f16.gguf | 10.8 GB (LFS) | Upload 3 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q6_K.gguf | 9.66 GB (LFS) | Upload 4 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q6_K_L.gguf | 9.9 GB (LFS) | Upload 3 files
gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q8_0.gguf | 13.5 GB (LFS) | Upload gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q8_0.gguf
mmproj.gguf | 854 MB (LFS) | Upload 3 files

All files were last updated 3 months ago.
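Any file in this repository can be fetched over HTTPS via Hugging Face's standard `resolve` URL scheme. As a minimal sketch (the `gguf_url` helper name is my own; for production use, `huggingface_hub.hf_hub_download` handles caching and retries for you), the direct-download URL for one of the quantized files above can be built like this:

```python
# Build a direct-download URL for a file in this repository using the
# standard Hugging Face "resolve" URL pattern:
#   https://huggingface.co/<repo_id>/resolve/<revision>/<filename>
REPO_ID = "dahara1/gemma-3-12b-it-qat-japanese-imatrix"

def gguf_url(filename: str, revision: str = "main") -> str:
    """Return the direct-download URL for a file in this repository."""
    return f"https://huggingface.co/{REPO_ID}/resolve/{revision}/{filename}"

# Example: the Q4_K_M quantization listed above.
print(gguf_url("gemma-3-12b-it-qat-q4_0-japanese-imatrix-Q4_K_M.gguf"))
```

Once downloaded, a GGUF file of this kind is typically loaded with a llama.cpp-based runtime (e.g. `llama-cli -m <file.gguf>`); the vision projector `mmproj.gguf` is only needed for multimodal use.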