mt5-small-finetuned-amazon-en-es

This model is a fine-tuned version of google/mt5-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 3.0627
  • ROUGE-1: 16.0196
  • ROUGE-2: 8.1799
  • ROUGE-L: 15.706
  • ROUGE-Lsum: 15.665
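
The checkpoint can be used like any other Hugging Face summarization model. Below is a minimal usage sketch, assuming the weights are pulled from the Hub under the repository id Lrally1/mt5-small-finetuned-amazon-en-es; the input text is a placeholder, not an example from the training data.

```python
# Minimal usage sketch: load the fine-tuned checkpoint with the summarization pipeline.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="Lrally1/mt5-small-finetuned-amazon-en-es",
)

# Placeholder input: replace with an English or Spanish product review.
review = "Replace this text with the product review you want to summarize."
print(summarizer(review, max_length=30)[0]["summary_text"])
```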

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Seq2SeqTrainingArguments sketch follows the list):

  • learning_rate: 5.6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 8
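
For reference, these settings can be expressed as transformers Seq2SeqTrainingArguments roughly as sketched below; the output directory, eval_strategy, and predict_with_generate values are assumptions rather than values taken from this card.

```python
# Sketch (not the original training script) of the hyperparameters above
# expressed as Seq2SeqTrainingArguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="mt5-small-finetuned-amazon-en-es",  # assumed output directory
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    num_train_epochs=8,
    eval_strategy="epoch",        # assumption: matches the per-epoch results below
    predict_with_generate=True,   # assumption: required to compute ROUGE during eval
)
```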

Training results

| Training Loss | Epoch | Step  | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum |
|---------------|-------|-------|-----------------|---------|---------|---------|------------|
| 5.2215        | 1.0   | 3183  | 3.2285          | 15.3571 | 7.5876  | 14.8818 | 14.921     |
| 3.5658        | 2.0   | 6366  | 3.1463          | 15.6259 | 7.5423  | 15.066  | 15.0553    |
| 3.337         | 3.0   | 9549  | 3.0946          | 15.952  | 7.9278  | 15.522  | 15.5298    |
| 3.2058        | 4.0   | 12732 | 3.0638          | 14.9256 | 6.9173  | 14.6062 | 14.5762    |
| 3.1088        | 5.0   | 15915 | 3.0617          | 15.7701 | 7.7605  | 15.3497 | 15.4116    |
| 3.0411        | 6.0   | 19098 | 3.0561          | 15.5238 | 7.749   | 15.0419 | 15.1077    |
| 3.0102        | 7.0   | 22281 | 3.0544          | 15.4735 | 7.2486  | 15.0972 | 15.1334    |
| 2.9787        | 8.0   | 25464 | 3.0627          | 16.0196 | 8.1799  | 15.706  | 15.665     |
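
ROUGE scores like those above are commonly computed with the evaluate library's "rouge" metric; the card does not state which implementation was used, so the snippet below is an assumption. Scores are scaled by 100 to match the values reported in the table.

```python
# Sketch: computing ROUGE-1/2/L/Lsum with the evaluate library (assumed implementation).
import evaluate

rouge = evaluate.load("rouge")

predictions = ["generated summary of a product review"]   # placeholder model outputs
references = ["reference summary written by a reviewer"]  # placeholder gold summaries

scores = rouge.compute(predictions=predictions, references=references)
print({key: round(value * 100, 4) for key, value in scores.items()})
# Keys: rouge1, rouge2, rougeL, rougeLsum
```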

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.5.1+cu121
  • Datasets 3.1.0
  • Tokenizers 0.20.3