---
base_model:
- deepseek-ai/DeepSeek-V3-Base
pipeline_tag: text-generation
---
Quantized with Llama.cpp, based on this Llama.cpp MR. Big thanks to fairydreaming!

The quantization was performed on my BF16 version, DevQuasar/deepseek-ai.DeepSeek-V3-Base-bf16.
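As a minimal sketch, the resulting GGUF can be run with a llama.cpp build that includes the linked MR. The model filename below is a placeholder, not the exact file name in this repo:

```shell
# Run a short completion with llama.cpp's CLI (llama-cli).
# "DeepSeek-V3-Base-Q4_K_M.gguf" is a placeholder: substitute the
# actual GGUF quant file you downloaded from this repo.
./llama-cli -m DeepSeek-V3-Base-Q4_K_M.gguf -p "The capital of France is" -n 32
```

Note that a model of this size may need to be split across multiple GGUF files; llama.cpp loads the remaining shards automatically when pointed at the first one.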
Inference proof:
I'm doing this to 'Make knowledge free for everyone', using my personal time and resources.
If you want to support my efforts, please visit my Ko-fi page: https://ko-fi.com/devquasar

Also feel free to visit my website: https://devquasar.com/