chatpig/gemma-3-4b-it-gguf
Pipeline tag: Image-Text-to-Text
Tags: GGUF, gguf-connector, conversational
License: gemma
Branch: main · 1 contributor · History: 23 commits
Latest commit by chatpig: "Upload gemma-3-4b-it-q8_0.gguf with huggingface_hub" (029077b, verified, 16 days ago)
Repository and model files (GGUF quantizations of gemma-3-4b-it):

| File | Size | LFS | Last commit | Updated |
|---|---|---|---|---|
| .gitattributes | 2.71 kB | | Upload gemma-3-4b-it-q8_0.gguf with huggingface_hub | 16 days ago |
| README.md | 281 Bytes | | Update README.md | 16 days ago |
| gemma-3-4b-it-bf16.gguf | 7.77 GB | LFS | Upload gemma-3-4b-it-bf16.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-f16.gguf | 7.77 GB | LFS | Upload gemma-3-4b-it-f16.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-f32.gguf | 15.5 GB | LFS | Upload gemma-3-4b-it-f32.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q2_k.gguf | 1.73 GB | LFS | Upload gemma-3-4b-it-q2_k.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q3_k_l.gguf | 2.24 GB | LFS | Upload gemma-3-4b-it-q3_k_l.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q3_k_m.gguf | 2.1 GB | LFS | Upload gemma-3-4b-it-q3_k_m.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q3_k_s.gguf | 1.94 GB | LFS | Upload gemma-3-4b-it-q3_k_s.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q4_0.gguf | 2.36 GB | LFS | Upload gemma-3-4b-it-q4_0.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q4_1.gguf | 2.56 GB | LFS | Upload gemma-3-4b-it-q4_1.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q4_k_m.gguf | 2.49 GB | LFS | Upload gemma-3-4b-it-q4_k_m.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q4_k_s.gguf | 2.38 GB | LFS | Upload gemma-3-4b-it-q4_k_s.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q5_0.gguf | 2.76 GB | LFS | Upload gemma-3-4b-it-q5_0.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q5_1.gguf | 2.96 GB | LFS | Upload gemma-3-4b-it-q5_1.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q5_k_m.gguf | 2.83 GB | LFS | Upload gemma-3-4b-it-q5_k_m.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q5_k_s.gguf | 2.76 GB | LFS | Upload gemma-3-4b-it-q5_k_s.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q6_k.gguf | 3.19 GB | LFS | Upload gemma-3-4b-it-q6_k.gguf with huggingface_hub | 16 days ago |
| gemma-3-4b-it-q8_0.gguf | 4.13 GB | LFS | Upload gemma-3-4b-it-q8_0.gguf with huggingface_hub | 16 days ago |
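
To pull one of the quantized weights and chat with it locally, a minimal sketch is shown below. It uses huggingface_hub for the download and llama-cpp-python as the runtime; neither the quant choice (q4_k_m) nor the runtime is prescribed by this repo (its tags also point to gguf-connector), they are just common defaults for GGUF files.

```python
# Minimal sketch: download one quant from this repo and run a local chat turn.
# Assumes: pip install huggingface_hub llama-cpp-python
# The q4_k_m quant and the llama-cpp-python runtime are illustrative choices,
# not something this repository mandates.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="chatpig/gemma-3-4b-it-gguf",
    filename="gemma-3-4b-it-q4_k_m.gguf",  # 2.49 GB, see the table above
)

llm = Llama(model_path=model_path, n_ctx=4096)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Give me a one-line summary of GGUF."}]
)
print(out["choices"][0]["message"]["content"])
```

As a rough guide, the lower-bit quants (q2_k through the q4 variants) trade accuracy for memory, while q8_0 (4.13 GB) stays closest to the bf16/f16 reference weights at roughly half their size.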
Multimodal projector (mmproj) files:

| File | Size | LFS | Last commit | Updated |
|---|---|---|---|---|
| mmproj-bf16.gguf | 851 MB | LFS | Upload mmproj-bf16.gguf with huggingface_hub | 16 days ago |
| mmproj-f16.gguf | 851 MB | LFS | Upload mmproj-f16.gguf with huggingface_hub | 16 days ago |
| mmproj-f32.gguf | 1.68 GB | LFS | Upload mmproj-f32.gguf with huggingface_hub | 16 days ago |
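
The mmproj-* files carry the vision projector that backs the Image-Text-to-Text pipeline tag; the main GGUF files hold the language model only. Below is a hedged sketch of fetching a matching pair; the pairing shown is an assumption for illustration, and the actual multimodal invocation depends on your runtime (recent llama.cpp builds, for example, accept the projector through an --mmproj option on their multimodal tools).

```python
# Sketch: fetch a quantized language model plus a vision projector.
# The pairing (q4_k_m + mmproj-f16) is an illustrative assumption; any quant
# from the table above can be combined with any mmproj precision.
from huggingface_hub import hf_hub_download

repo = "chatpig/gemma-3-4b-it-gguf"
model_path = hf_hub_download(repo_id=repo, filename="gemma-3-4b-it-q4_k_m.gguf")
mmproj_path = hf_hub_download(repo_id=repo, filename="mmproj-f16.gguf")

# Hand both paths to a GGUF runtime with multimodal support, e.g. a recent
# llama.cpp build (model via -m, projector via --mmproj); exact flags and
# binary names vary by version, so check your runtime's documentation.
print(model_path, mmproj_path)
```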