roleplaiapp/DS-R1-Distill-Q2.5-14B-Harmony_V0.1-Q3_K_M-GGUF
Repo: roleplaiapp/DS-R1-Distill-Q2.5-14B-Harmony_V0.1-Q3_K_M-GGUF
Original Model: DS-R1-Distill-Q2.5-14B-Harmony_V0.1
Quantized File: DS-R1-Distill-Q2.5-14B-Harmony_V0.1.Q3_K_M.gguf
Quantization: GGUF
Quantization Method: Q3_K_M
Overview
This is a GGUF Q3_K_M quantized version of DS-R1-Distill-Q2.5-14B-Harmony_V0.1.
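A minimal usage sketch, assuming huggingface_hub and llama-cpp-python are installed: the repo and file names come from this card, while the context size, GPU offload, prompt, and token limit are illustrative placeholders rather than recommended settings.

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download the Q3_K_M GGUF file from this repo (names taken from the card).
model_path = hf_hub_download(
    repo_id="roleplaiapp/DS-R1-Distill-Q2.5-14B-Harmony_V0.1-Q3_K_M-GGUF",
    filename="DS-R1-Distill-Q2.5-14B-Harmony_V0.1.Q3_K_M.gguf",
)

llm = Llama(
    model_path=model_path,
    n_ctx=4096,       # placeholder context window, adjust to your use case
    n_gpu_layers=-1,  # offload all layers if a GPU is available
)

# Chat-style call; llama-cpp-python applies the chat template embedded in the GGUF.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Briefly introduce yourself."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```

Any other GGUF-compatible runtime (for example the llama.cpp CLI) can load the same file.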
Quantization By
I often have idle GPUs while building/testing for the RP app, so I put them to use quantizing models. I hope the community finds these quantizations useful.
Andrew Webby @ RolePlai.