Project Briar Rabbit: Hasenpfeffer 4B Q8, from Hoptimizer's fine work
This model is not censored and is ready for deployment, but it has not yet been tested on a closed system, so please use and transfer it with caution. It is a Qwen3 mix intended for many applications on systems with 8 GB of RAM or more; no GPU is needed.
IntelligentEstate/Hasenpfeffer-4B-Q8_0-GGUF
This model was converted to GGUF format from bunnycore/Qwen3-4B-Mixture.
Initial testing is being done in various applications, and many show promise. Eat your fill. A quick CPU-only usage sketch follows below.
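As a rough starting point, here is a minimal sketch of running this Q8_0 GGUF on CPU only with the llama-cpp-python bindings. The GGUF filename pattern, context size, and prompt are assumptions for illustration, not part of this release; verify the actual filename against the repository's file listing.

```python
# Minimal CPU-only sketch using the llama-cpp-python bindings
# (pip install llama-cpp-python huggingface-hub).
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="IntelligentEstate/Hasenpfeffer-4B-Q8_0-GGUF",
    filename="*q8_0.gguf",  # assumed glob for the Q8_0 file; check the repo's files
    n_ctx=4096,             # assumed context window; adjust for your RAM budget
    n_gpu_layers=0,         # keep everything on the CPU, per the card's no-GPU target
)

result = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize what a Q8_0 GGUF quant is."}],
    max_tokens=128,
)
print(result["choices"][0]["message"]["content"])
```

With roughly 8 GB of system RAM an 8-bit 4B-parameter quant should fit alongside a modest context window, which is the scenario this card describes.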