dahara1/gemma-3-1b-it-qat-japanese-imatrix
Tags: GGUF · Japanese · imatrix · conversational
Files and versions (branch: main)
1 contributor · History: 4 commits · latest commit: "Update README.md" (3eff3ad, verified) by dahara1, about 1 month ago
| File | Size | LFS | Last commit | Updated |
|------|------|-----|-------------|---------|
| .gitattributes | 3.28 kB | | Upload 20 files | about 2 months ago |
| README.md | 1.14 kB | | Update README.md | about 1 month ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-IQ3_M.gguf | 697 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-IQ3_XS.gguf | 690 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-IQ3_XXS.gguf | 680 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-IQ4_XS.gguf | 714 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q3_K-f16.gguf | 1.01 GB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q3_K_L.gguf | 722 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q3_K_M.gguf | 722 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q3_K_S.gguf | 689 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q4_K-f16.gguf | 1.09 GB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q4_K_L.gguf | 806 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q4_K_M.gguf | 806 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q4_K_S.gguf | 781 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q5_K-f16.gguf | 1.13 GB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q5_K_L.gguf | 851 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q5_K_M.gguf | 851 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q5_K_S.gguf | 836 MB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q6_K-f16.gguf | 1.29 GB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q6_K.gguf | 1.01 GB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q6_K_L.gguf | 1.01 GB | LFS | Upload 20 files | about 2 months ago |
| gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q8_0.gguf | 1.35 GB | LFS | Upload 20 files | about 2 months ago |
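Since every model file above is a GGUF quantization, a typical way to use the repository is to fetch a single quant and load it with a GGUF-capable runtime such as llama.cpp. Below is a minimal sketch, assuming `huggingface_hub` is installed and a local llama.cpp build is available; the filename is simply one of the quants listed in the table (Q4_K_M), not a recommendation from the repository itself.

```python
# Minimal sketch: download one GGUF quantization from this repo
# with huggingface_hub, then run it locally with llama.cpp.
# Assumes: pip install huggingface_hub
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="dahara1/gemma-3-1b-it-qat-japanese-imatrix",
    filename="gemma-3-1b-it-qat-q4_0-japanese-imatrix-Q4_K_M.gguf",
)
print(model_path)  # local cache path of the downloaded GGUF file

# Example (shell), using a local llama.cpp build:
#   ./llama-cli -m <model_path> -p "こんにちは"
```

Smaller quants (IQ3_XXS through Q3_K_S, roughly 680-722 MB) trade some quality for memory, while the f16-output variants and Q8_0 (up to 1.35 GB) keep more precision; which one to pick depends on the target device's RAM.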