This repository contains only a CTranslate2 (ct2) float16 conversion of the original model, produced with the command below:

ct2-transformers-converter --model facebook/nllb-200-distilled-1.3B --quantization float16 --trust_remote_code --output_dir nllb-200-distilled-1.3B-ct2-float16
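The converted model can be loaded with the `ctranslate2` Python package. A minimal usage sketch follows, assuming the output directory produced by the command above, a CUDA GPU for float16 inference, and the standard CTranslate2 + `transformers` workflow for NLLB models; the language codes and example sentence are illustrative.

```python
# Minimal sketch: load the converted model with CTranslate2 and translate one sentence.
# Assumes a CUDA GPU (for float16) and that the output directory from the command above exists.
import ctranslate2
import transformers

model_dir = "nllb-200-distilled-1.3B-ct2-float16"  # output_dir from the conversion command
src_lang = "eng_Latn"   # illustrative source language code
tgt_lang = "fra_Latn"   # illustrative target language code

translator = ctranslate2.Translator(model_dir, device="cuda", compute_type="float16")
tokenizer = transformers.AutoTokenizer.from_pretrained(
    "facebook/nllb-200-distilled-1.3B", src_lang=src_lang
)

# NLLB takes the source as a list of subword tokens and the target language code as a prefix.
source = tokenizer.convert_ids_to_tokens(tokenizer.encode("Hello world!"))
results = translator.translate_batch([source], target_prefix=[[tgt_lang]])

# Drop the leading target-language token before decoding.
target_tokens = results[0].hypotheses[0][1:]
print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target_tokens)))
```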

Original author: Facebook

Original model: facebook/nllb-200-distilled-1.3B
