Quantized version of: huihui-ai/Huihui-gpt-oss-20b-BF16-abliterated
'Make knowledge free for everyone'
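Since this repository publishes GGUF quantizations of the abliterated model, a minimal usage sketch with llama-cpp-python might look like the following. The quant filename pattern (`*Q4_K_M.gguf`) is an assumption; replace it with a file that actually exists in this repo.

```python
# Minimal sketch: load a GGUF quant from this repo with llama-cpp-python.
# Assumes `pip install llama-cpp-python huggingface_hub` and that a Q4_K_M
# quant file exists in the repository (filename pattern is an assumption).
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="DevQuasar/huihui-ai.Huihui-gpt-oss-20b-BF16-abliterated-GGUF",
    filename="*Q4_K_M.gguf",  # assumed quant; pick one listed in the repo's Files tab
    n_ctx=4096,
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Explain GGUF quantization in one sentence."}]
)
print(out["choices"][0]["message"]["content"])
```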
Model tree for DevQuasar/huihui-ai.Huihui-gpt-oss-20b-BF16-abliterated-GGUF
- Base model: openai/gpt-oss-20b
- Finetuned: unsloth/gpt-oss-20b-BF16