ZeroXClem/Mistral-2.5-Prima-Hercules-Fusion-7B-Q4_K_M-GGUF
Tags: Transformers · GGUF · English · Merge · mergekit · lazymergekit · hydra-project/ChatHercules-2.5-Mistral-7B · Nitral-Archive/Prima-Pastacles-7b · llama-cpp · gguf-my-repo
License: apache-2.0
Branch: main · 1 contributor · History: 3 commits
Latest commit: ZeroXClem, "Upload README.md with huggingface_hub" (e96a64e, verified, 8 months ago)
Files:
.gitattributes · 1.6 kB · "Upload mistral-2.5-prima-hercules-fusion-7b-q4_k_m.gguf with huggingface_hub" · 8 months ago
README.md · 2.17 kB · "Upload README.md with huggingface_hub" · 8 months ago
mistral-2.5-prima-hercules-fusion-7b-q4_k_m.gguf · 4.37 GB · LFS · "Upload mistral-2.5-prima-hercules-fusion-7b-q4_k_m.gguf with huggingface_hub" · 8 months ago
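
Since the repo ships a single Q4_K_M GGUF file and carries the llama-cpp tag, here is a minimal sketch (not taken from the model card) of how the file could be downloaded and run locally with llama-cpp-python. The choice of llama-cpp-python, the context size, and the prompt are assumptions, not something documented in this repository.

```python
# Sketch only: assumes `huggingface_hub` and `llama-cpp-python` are installed.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download the quantized GGUF file from this repo into the local HF cache.
model_path = hf_hub_download(
    repo_id="ZeroXClem/Mistral-2.5-Prima-Hercules-Fusion-7B-Q4_K_M-GGUF",
    filename="mistral-2.5-prima-hercules-fusion-7b-q4_k_m.gguf",
)

# Load the model; n_ctx=2048 is an arbitrary choice, not a repo recommendation.
llm = Llama(model_path=model_path, n_ctx=2048)

# Run a simple completion; the prompt is illustrative.
output = llm("Explain model merging in one sentence.", max_tokens=128)
print(output["choices"][0]["text"])
```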