t5-base-finetuned-amazon-en-es

This model is a fine-tuned version of google-t5/t5-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1275
  • Rouge1: 90.2312
  • Rouge2: 83.2787
  • RougeL: 88.0196
  • RougeLsum: 87.9916
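For context, the Rouge1 score above measures unigram overlap between generated and reference summaries. The reported numbers come from the standard ROUGE scorer; the following is only an illustrative sketch of the ROUGE-1 F1 computation, not the implementation used for evaluation:

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Unigram-overlap ROUGE-1 F1 between two strings (whitespace tokens)."""
    ref = Counter(reference.lower().split())
    cand = Counter(candidate.lower().split())
    overlap = sum((ref & cand).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(round(rouge1_f1("the cat sat on the mat", "the cat is on the mat"), 4))  # → 0.8333
```

Rouge2 does the same over bigrams, while RougeL/RougeLsum score the longest common subsequence (per summary and per sentence, respectively).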

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5.6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 12
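With lr_scheduler_type: linear, the learning rate decays linearly from 5.6e-05 to 0 over the 1164 total optimization steps (12 epochs × 97 steps per epoch, per the table below). A minimal sketch of the schedule, assuming zero warmup steps (the Trainer default when none is set):

```python
def linear_lr(step: int, base_lr: float = 5.6e-5, total_steps: int = 1164) -> float:
    """Linear decay from base_lr to 0; assumes 0 warmup steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # base rate at the first step
print(linear_lr(582))   # halfway through training
print(linear_lr(1164))  # 0.0 at the final step
```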

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| 0.113         | 1.0   | 97   | 0.1067          | 90.4949 | 83.4088 | 87.98   | 87.9287   |
| 0.0856        | 2.0   | 194  | 0.1052          | 90.6604 | 83.7509 | 88.1407 | 88.0726   |
| 0.0723        | 3.0   | 291  | 0.1060          | 91.4193 | 84.9487 | 88.9628 | 88.8729   |
| 0.064         | 4.0   | 388  | 0.1119          | 89.7878 | 83.0958 | 87.321  | 87.2759   |
| 0.0556        | 5.0   | 485  | 0.1156          | 90.5422 | 83.8358 | 88.4229 | 88.3887   |
| 0.0515        | 6.0   | 582  | 0.1126          | 90.4997 | 83.4321 | 88.1359 | 88.1405   |
| 0.0456        | 7.0   | 679  | 0.1158          | 90.5983 | 83.8471 | 88.5468 | 88.4302   |
| 0.0468        | 8.0   | 776  | 0.1189          | 90.3242 | 83.5413 | 88.2592 | 88.2061   |
| 0.0416        | 9.0   | 873  | 0.1225          | 90.2886 | 83.1885 | 88.0928 | 88.0366   |
| 0.0385        | 10.0  | 970  | 0.1252          | 89.8331 | 82.8606 | 87.3511 | 87.335    |
| 0.0377        | 11.0  | 1067 | 0.1269          | 89.9057 | 83.057  | 87.6798 | 87.6802   |
| 0.0368        | 12.0  | 1164 | 0.1275          | 90.2312 | 83.2787 | 88.0196 | 87.9916   |
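Note that validation loss bottoms out at epoch 2 (0.1052) and the ROUGE scores peak at epoch 3, so the final epoch-12 checkpoint reported above is not the best by either criterion. A minimal sketch of picking the best epoch from the logged validation losses:

```python
# Validation losses per epoch, copied from the table above.
val_losses = [0.1067, 0.1052, 0.1060, 0.1119, 0.1156, 0.1126,
              0.1158, 0.1189, 0.1225, 0.1252, 0.1269, 0.1275]

# Epochs are 1-indexed; find the one with the lowest validation loss.
best_epoch = min(range(len(val_losses)), key=val_losses.__getitem__) + 1
print(best_epoch, val_losses[best_epoch - 1])  # → 2 0.1052
```

In practice, passing load_best_model_at_end=True (together with metric_for_best_model) to the Trainer's training arguments handles this selection automatically.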

Framework versions

  • Transformers 4.38.2
  • Pytorch 2.2.1+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2
