pegasus-large_1742608050.119433

This model is a fine-tuned version of google/pegasus-large on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 4.8799
  • Rouge1: 46.7629
  • Rouge2: 17.2538
  • RougeL: 27.7750
  • RougeLsum: 32.6028
  • Gen Len: 85.8611
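The ROUGE figures above are F-measures reported on a 0–100 scale, as is conventional. As a rough illustration of what Rouge1 and Rouge2 measure, here is a pure-Python sketch of n-gram overlap F1 (a simplification, not the exact `rouge_score` implementation, which also applies stemming and computes ROUGE-L via longest common subsequence):

```python
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def rouge_n(candidate: str, reference: str, n: int) -> float:
    """ROUGE-N F1 between a candidate summary and a reference (0.0-1.0)."""
    cand = ngrams(candidate.split(), n)
    ref = ngrams(reference.split(), n)
    overlap = sum((cand & ref).values())  # clipped n-gram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)
```

For example, `rouge_n("the cat sat on a mat", "the cat sat on the mat", 1)` shares 5 of 6 unigrams with the reference, giving an F1 of about 0.833 (83.3 on the scale used above).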

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
  • mixed_precision_training: Native AMP
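With a linear scheduler and no warmup setting listed, the learning rate presumably decays linearly from 2e-05 toward zero over the run (3310 optimizer steps, per the results table). A minimal sketch of that schedule, assuming zero warmup steps:

```python
def linear_lr(step: int, total_steps: int = 3310, peak_lr: float = 2e-05,
              warmup_steps: int = 0) -> float:
    """Linear warmup then linear decay to zero, as in a standard
    linear schedule; warmup_steps=0 is an assumption (not stated in the card)."""
    if step < warmup_steps:
        return peak_lr * step / max(1, warmup_steps)
    return peak_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

Halfway through training (step 1655, end of epoch 5) the learning rate would be 1e-05, reaching zero at step 3310.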

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|---------------|-------|------|-----------------|---------|---------|---------|-----------|---------|
| No log        | 1.0   | 331  | 5.1130          | 48.5686 | 16.6489 | 28.3029 | 32.3136   | 71.2562 |
| 5.3363        | 2.0   | 662  | 5.0415          | 49.2151 | 17.0482 | 28.5422 | 32.4827   | 70.0833 |
| 5.3363        | 3.0   | 993  | 4.9860          | 48.9739 | 17.1901 | 28.4601 | 32.6516   | 72.8179 |
| 5.0157        | 4.0   | 1324 | 4.9619          | 48.2866 | 17.4271 | 28.1784 | 32.8011   | 77.8704 |
| 4.9426        | 5.0   | 1655 | 4.9329          | 48.7697 | 17.7741 | 28.6454 | 33.1256   | 76.4846 |
| 4.9426        | 6.0   | 1986 | 4.9133          | 48.0678 | 17.7361 | 28.3205 | 32.9328   | 79.4753 |
| 4.8630        | 7.0   | 2317 | 4.8973          | 47.1789 | 17.4753 | 27.9637 | 32.7952   | 85.8056 |
| 4.8272        | 8.0   | 2648 | 4.8786          | 47.3498 | 17.2852 | 27.9143 | 32.7426   | 82.9198 |
| 4.8272        | 9.0   | 2979 | 4.8805          | 47.1749 | 17.4155 | 27.8944 | 32.7537   | 85.1111 |
| 4.7888        | 10.0  | 3310 | 4.8799          | 46.7629 | 17.2538 | 27.7750 | 32.6028   | 85.8611 |

Framework versions

  • Transformers 4.45.1
  • PyTorch 2.4.1+cu121
  • Datasets 3.0.1
  • Tokenizers 0.20.0
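A minimal way to try the checkpoint with the Transformers version pinned above (the input text is a placeholder; `max_length=96` is an assumption chosen from the ~86-token average generation length on the eval set):

```python
# Repo id as listed on the Hub; loading downloads the F32 weights (~2.3 GB for 571M params).
MODEL_ID = "tihonn/pegasus-large_1742608050.119433"

def summarize(text: str, max_length: int = 96) -> str:
    """Summarize `text` with the fine-tuned PEGASUS checkpoint."""
    from transformers import pipeline  # imported lazily; heavy dependency
    summarizer = pipeline("summarization", model=MODEL_ID)
    return summarizer(text, max_length=max_length, truncation=True)[0]["summary_text"]

if __name__ == "__main__":
    print(summarize("Your long input document goes here."))
```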