tatsuyaaaaaaa/gemma-3-270m-it-gguf
Likes: 0
Tags: GGUF · TFMC/imatrix-dataset-for-japanese-llm · English · Japanese · conversational
License: gemma
Files and versions (branch: main)
1 contributor · History: 17 commits
Latest commit: Create README.md by tatsuyaaaaaaa (95687d6, verified) · about 16 hours ago
| File | Size | LFS | Last commit message | Last updated |
| --- | --- | --- | --- | --- |
| .gitattributes | 2.47 kB | | Upload gemma-3-270m-it_IQ3_M.gguf with huggingface_hub | about 17 hours ago |
| README.md | 420 Bytes | | Create README.md | about 16 hours ago |
| gemma-3-270m-it_IQ3_M.gguf | 239 MB | LFS | Upload gemma-3-270m-it_IQ3_M.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_IQ3_S.gguf | 237 MB | LFS | Upload gemma-3-270m-it_IQ3_S.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_IQ3_XS.gguf | 237 MB | LFS | Upload gemma-3-270m-it_IQ3_XS.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_IQ4_NL.gguf | 242 MB | LFS | Upload gemma-3-270m-it_IQ4_NL.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_IQ4_XS.gguf | 241 MB | LFS | Upload gemma-3-270m-it_IQ4_XS.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_Q4_0.gguf | 241 MB | LFS | Upload gemma-3-270m-it_Q4_0.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_Q4_K_M.gguf | 253 MB | LFS | Upload gemma-3-270m-it_Q4_K_M.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_Q4_K_S.gguf | 250 MB | LFS | Upload gemma-3-270m-it_Q4_K_S.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_Q5_0.gguf | 254 MB | LFS | Upload gemma-3-270m-it_Q5_0.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_Q5_1.gguf | 260 MB | LFS | Upload gemma-3-270m-it_Q5_1.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_Q5_K_M.gguf | 260 MB | LFS | Upload gemma-3-270m-it_Q5_K_M.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_Q5_K_S.gguf | 258 MB | LFS | Upload gemma-3-270m-it_Q5_K_S.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_Q6_K.gguf | 283 MB | LFS | Upload gemma-3-270m-it_Q6_K.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_Q8_0.gguf | 292 MB | LFS | Upload gemma-3-270m-it_Q8_0.gguf with huggingface_hub | about 17 hours ago |
| gemma-3-270m-it_bf16.gguf | 543 MB | LFS | Upload gemma-3-270m-it_bf16.gguf with huggingface_hub | about 17 hours ago |
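The upload commits above were made with huggingface_hub, and the same library can fetch any single quantization for local use. Below is a minimal sketch that downloads one file (Q4_K_M, chosen arbitrarily from the table) and runs a chat turn through llama-cpp-python; the llama-cpp-python dependency, context size, and example prompt are assumptions, not something stated in this repository.

```python
# Minimal sketch: fetch one GGUF quantization from this repo and run it locally.
# Only the repo id and filename come from the listing above; llama-cpp-python
# is an assumed (not repo-documented) way to run GGUF files.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download a single GGUF file (cached under the local Hugging Face cache).
model_path = hf_hub_download(
    repo_id="tatsuyaaaaaaa/gemma-3-270m-it-gguf",
    filename="gemma-3-270m-it_Q4_K_M.gguf",
)

# Load the quantized model with the llama.cpp bindings (n_ctx is an assumed value).
llm = Llama(model_path=model_path, n_ctx=4096)

# The model is tagged as conversational, English and Japanese.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Introduce yourself in one sentence."}]
)
print(out["choices"][0]["message"]["content"])
```

The larger quantizations (Q6_K, Q8_0, bf16) trade more disk and memory for quality; swapping the `filename` argument to any other entry in the table is the only change needed.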