PrunaAI / AMindToThink-gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3-GGUF-smashed
Tags: GGUF · pruna-ai · Inference Endpoints
Files and versions
Branch: main · Folder: AMindToThink · 1 contributor · History: 15 commits
Latest commit: sharpenb — Upload AMindToThink/gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q8_0.gguf with huggingface_hub · 9048bc8 (verified) · 28 days ago
All files below are tracked with Git LFS; each was uploaded with huggingface_hub 28 days ago.

gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q2_K.gguf      1.23 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q3_K_L.gguf    1.55 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q3_K_M.gguf    1.46 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q3_K_S.gguf    1.36 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q4_0.gguf      1.63 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q4_1.gguf      1.76 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q4_K_M.gguf    1.71 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q4_K_S.gguf    1.64 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q5_0.gguf      1.88 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q5_1.gguf      2.01 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q5_K_M.gguf    1.92 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q5_K_S.gguf    1.88 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q6_K.gguf      2.15 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q8_0.gguf      2.78 GB
gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.fp16.bin       5.24 GB
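
Since every file was uploaded with huggingface_hub, the same library can be used to fetch a single quantized variant instead of cloning the whole repo. A minimal sketch, assuming the files sit under the AMindToThink/ folder of this repo as the upload commit messages suggest (the chosen quantization level, Q4_K_M here, is just an example):

```python
from huggingface_hub import hf_hub_download

# Repo id as shown in the page header.
repo_id = "PrunaAI/AMindToThink-gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3-GGUF-smashed"

# Path within the repo; the "AMindToThink/" prefix is an assumption taken
# from the upload commit messages.
filename = "AMindToThink/gemma-2-2b_RMU_cyber-forget-corpus_s200_a100_layer3.Q4_K_M.gguf"

# Downloads the file into the local Hugging Face cache and returns its local path.
local_path = hf_hub_download(repo_id=repo_id, filename=filename)
print(local_path)
```

The returned path can then be handed to any GGUF-compatible runtime for inference.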