nllb-200-distilled-600M-int8
This model is an int8-quantized version of facebook/nllb-200-distilled-600M.
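Int8 quantization stores weights as 8-bit integers plus a floating-point scale, cutting memory roughly 4x versus float32. The model card does not specify the exact quantization scheme used, so the sketch below illustrates the general idea with simple symmetric per-tensor quantization; the function names and the use of NumPy are assumptions for illustration only.

```python
import numpy as np

def quantize_int8(w: np.ndarray):
    """Symmetric per-tensor int8 quantization (illustrative sketch)."""
    # One scale for the whole tensor, chosen so the largest magnitude maps to 127.
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float32 tensor from int8 values and the scale."""
    return q.astype(np.float32) * scale

# Quantize a small random weight matrix and measure the round-trip error.
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)
q, s = quantize_int8(w)
w_hat = dequantize_int8(q, s)
# Rounding error is bounded by half a quantization step (scale / 2).
max_err = np.abs(w - w_hat).max()
```

Real int8 model quantization (e.g. per-channel scales, activation calibration) is more involved, but this captures the core quantize/dequantize round trip.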
HF Inference deployability: the HF Inference API does not support translation models from the fairseq library, so this model cannot be served there.
Model tree for Adeptschneider/nllb-200-distilled-600M-int8
- Base model: facebook/nllb-200-distilled-600M