IndicF5-TTS TensorRT-LLM Fast Inference πŸš€

Fast inference version of IndicF5 TTS using TensorRT-LLM. Supports 11 Indian languages.


Accelerated inference for IndicF5 TTS – made for those of you who asked for speed! We heard you loud and clear. Sharing with ❀️ from Saryps Labs.

This build is based on the amazing work at wgs/F5-TTS-Faster, which converts F5-TTS checkpoints to run with TensorRT-LLM.

πŸ”§ How it works (a quick overview)

Here’s the basic workflow used for acceleration (as shared in wgs/F5-TTS-Faster):

  • First, export F5-TTS to ONNX in three parts.
  • Then, use TensorRT-LLM to rewrite the relevant Transformer parts of the network for acceleration.
  • The front end and decoder still run via ONNX Runtime inference.
  • Depending on your setup, you can also use CUDAExecutionProvider, OpenVINOExecutionProvider, etc.
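Since the front end and decoder run through ONNX Runtime, you choose which execution provider they use. Here is a minimal sketch of picking the best available provider; the provider names are real ONN X Runtime identifiers, but the `pick_provider` helper and the preference order are our own illustration, not code from the repo:

```python
# Hypothetical helper: choose the first ONNX Runtime execution provider
# (in our assumed preference order) that is actually available on this machine.
PREFERRED = [
    "TensorrtExecutionProvider",
    "CUDAExecutionProvider",
    "OpenVINOExecutionProvider",
    "CPUExecutionProvider",
]

def pick_provider(available):
    """Return the first preferred provider present in `available`."""
    for name in PREFERRED:
        if name in available:
            return name
    raise RuntimeError("no supported execution provider found")

# In practice you would pass the result to onnxruntime, e.g.:
#   import onnxruntime as ort
#   sess = ort.InferenceSession(
#       "decoder.onnx",  # placeholder filename
#       providers=[pick_provider(ort.get_available_providers())],
#   )
```

On a CUDA box this falls through to CUDAExecutionProvider; on CPU-only machines it degrades gracefully to the default CPU provider.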

If you're curious about the details, dive into the original repo or ping me β€” happy to expand on anything.


πŸš€ Quickstart

To try it out yourself:

  1. Clone the F5-TTS-Faster repo.
  2. Set up the environment for TensorRT-LLM models (follow the repo instructions – it’s a little convoluted, but manageable).
  3. Place the ckpts folder at the root of the project.
  4. Run the sample script inside export_trtllm/ to begin inference.
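The steps above can be sketched as shell commands. Note the repo owner in the clone URL, the checkpoint layout, and the script name are assumptions; follow the repo's own README for the exact names:

```shell
# 1. Clone the F5-TTS-Faster repo (replace <owner> with the actual GitHub owner).
git clone https://github.com/<owner>/F5-TTS-Faster.git
cd F5-TTS-Faster

# 2. Set up the TensorRT-LLM environment per the repo's instructions.

# 3. Place the checkpoints folder at the project root:
#      F5-TTS-Faster/ckpts/

# 4. Run the sample script inside export_trtllm/ to begin inference
#    (script name below is a placeholder; use the repo's actual entry point).
cd export_trtllm
python sample.py
```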

Benchmarks coming soon! If you try it out and get some numbers, it would be awesome if you could share them back 🫑


πŸ—£οΈ Supported Languages

This model supports high-quality TTS synthesis in the following Indian languages:

  • Hindi (hi)
  • Telugu (te)
  • Assamese (as)
  • Bengali (bn)
  • Gujarati (gu)
  • Marathi (mr)
  • Kannada (kn)
  • Malayalam (ml)
  • Odia (or)
  • Punjabi (pa)
  • Tamil (ta)

πŸ™ Credits

Massive thanks to the original authors and repositories that made this possible:


πŸ“œ Terms of Use

By using this model, you agree to only clone voices for which you have explicit permission.
Unauthorized voice cloning is strictly prohibited. Any misuse of this model is the sole responsibility of the user.

Please don’t misuse this. Let’s build cool stuff, not cause trouble.


Model tree for saryps-labs/indicF5-rt

Base model: ai4bharat/IndicF5 (this model is a quantized derivative).