The deepseek-ai/DeepSeek-R1-Distill-Llama-70B model quantized to FP8 (dynamic).
Model: jsbaicenter/r1-1776-distill-llama-70b-FP8-Dynamic
Base model: deepseek-ai/DeepSeek-R1-Distill-Llama-70B
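
Below is a minimal loading sketch, assuming vLLM is installed (it can serve FP8-quantized checkpoints) and that two GPUs with enough combined memory for a 70B model are available; the `tensor_parallel_size` value and sampling settings are illustrative assumptions, not recommendations from the model authors.

```python
from vllm import LLM, SamplingParams

# Load the FP8 checkpoint. tensor_parallel_size=2 is an assumption:
# set it to however many GPUs you are sharding the 70B model across.
llm = LLM(
    model="jsbaicenter/r1-1776-distill-llama-70b-FP8-Dynamic",
    tensor_parallel_size=2,
)

# Placeholder sampling settings for a quick smoke test.
sampling = SamplingParams(temperature=0.6, top_p=0.95, max_tokens=512)

prompt = "Explain the difference between FP8 and INT8 quantization in two sentences."
outputs = llm.generate([prompt], sampling)
print(outputs[0].outputs[0].text)
```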