Spanish T5 (small) trained on large_spanish_corpus.

This is a Spanish T5 (small architecture) trained from scratch with Flax on the large_spanish_corpus, also known as BETO's corpus.

This model is part of the Flax/JAX Community Week, organised by Hugging Face, with TPU usage sponsored by Google.
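A minimal usage sketch with the 🤗 Transformers Python API. The model id comes from the card; the example sentence, the sentinel-filling task, and the generation settings are illustrative assumptions, since this is a pretrained (not finetuned) checkpoint:

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "flax-community/spanish-t5-small"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# Pretrained T5 checkpoints are trained with span corruption, so a natural
# demo is asking the model to fill in a masked span via the <extra_id_0>
# sentinel token (example sentence is illustrative).
text = "Madrid es la <extra_id_0> de España."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For downstream tasks (summarisation, translation, etc.) the checkpoint would first need finetuning, as with any pretrained T5.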

Dataset

The dataset is about 20 GB. 95% of the data was used for training and the remaining 5% for validation.
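The card does not describe how the split was made; a generic shuffled 95/5 split can be sketched as follows (the seed and helper name are illustrative):

```python
import random

def train_val_split(examples, val_fraction=0.05, seed=42):
    """Shuffle examples deterministically, then carve off a validation set."""
    examples = list(examples)
    random.Random(seed).shuffle(examples)
    n_val = int(len(examples) * val_fraction)
    return examples[n_val:], examples[:n_val]

train, val = train_val_split(range(1000))
print(len(train), len(val))  # 950 50
```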

Metrics

  • Accuracy: 0.675
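The card does not state how accuracy was computed; for T5 pretraining a common choice is token-level accuracy on the validation set, i.e. the fraction of target positions where the predicted token matches the reference. A minimal sketch under that assumption:

```python
def token_accuracy(predictions, references):
    """Fraction of positions where the predicted token equals the reference token."""
    assert len(predictions) == len(references)
    matches = sum(p == r for p, r in zip(predictions, references))
    return matches / len(references)

# Toy example: 3 of 4 predicted tokens match the reference sequence.
print(token_accuracy(["la", "casa", "roja", "es"],
                     ["la", "casa", "azul", "es"]))  # 0.75
```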

Team members

Citation

If you want to cite this model, you can use the following:

@misc{mromero2021spanish-t5-small,
  title={Spanish T5 (small) by Manuel Romero},
  author={Romero, Manuel},
  publisher={Hugging Face},
  journal={Hugging Face Hub},
  howpublished={\url{https://huggingface.co/flax-community/spanish-t5-small}},
  year={2021}
}
Model size: 60.5M parameters (tensor type F32, Safetensors)
