rut5-base-detox-v2

The model was fine-tuned from sberbank-ai/ruT5-base on a parallel detoxification corpus.

  • Task: text2text generation
  • Type: encoder-decoder
  • Tokenizer: BPE
  • Dict size: 32 101
  • Num Parameters: 222 M
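
The card does not include a usage snippet, so the following is a minimal sketch using the standard transformers seq2seq API. The repository id orzhan/rut5-base-detox-v2 is taken from this page; the input placeholder and generation parameters are illustrative assumptions, not part of the original card.

```python
# Minimal sketch: load the checkpoint and rewrite one toxic sentence.
# Generation settings (beam search, max_length) are illustrative defaults.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_name = "orzhan/rut5-base-detox-v2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "..."  # a toxic Russian sentence to detoxify (placeholder)
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=128, num_beams=5)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```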
