Quantized int8 version of nllb-200-distilled-1.3B, converted for use with CTranslate2.
Source: https://forum.opennmt.net/t/nllb-200-with-ctranslate2/5090/1
Used in my translation project: https://github.com/BLCK-B/Moerkepub
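A model like this is typically produced with CTranslate2's Transformers converter. A sketch of the conversion, assuming the output directory name (the exact command used for this repository is not stated in the card):

```shell
# Install CTranslate2 and the Transformers converter dependencies.
pip install ctranslate2 transformers sentencepiece

# Convert the base model to CTranslate2 format with int8 weight quantization.
# The output directory name is an assumption for illustration.
ct2-transformers-converter \
    --model facebook/nllb-200-distilled-1.3B \
    --quantization int8 \
    --output_dir nllb-ctranslate-int8
```

int8 quantization roughly quarters the on-disk and in-memory size of the weights compared to float32, at a small cost in translation quality.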
Model tree for BLCK-B/nllb-ctranslate-int8
- Base model: facebook/nllb-200-distilled-1.3B