---
library_name: transformers
license: apache-2.0
base_model: google/flan-t5-base
tags:
- generated_from_trainer
model-index:
- name: version_1305
  results: []
---

# version_1305

This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unspecified dataset.
It achieves the following results on the evaluation set (the Bp, Counts, Totals, Precisions, Sys Len, and Ref Len fields are corpus-BLEU statistics; Score is the BLEU score and Bp the brevity penalty):
- Loss: 0.1515
- Score: 3.5793
- Bp: 0.0692
- Counts: [1132, 692, 368, 143]
- Totals: [1609, 1196, 784, 381]
- Precisions: [70.35425730267247, 57.85953177257525, 46.93877551020408, 37.53280839895013]
- Sys Len: 1609
- Ref Len: 5907

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 12
- eval_batch_size: 12
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Score | Bp | Counts | Totals | Precisions | Sys Len | Ref Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:---------------------:|:----------------------:|:------------------------------------------------------------------------------:|:-------:|:-------:|
| 0.1836 | 1.0 | 464 | 0.1625 | 3.5827 | 0.0693 | [1132, 692, 368, 143] | [1610, 1197, 785, 382] | [70.31055900621118, 57.811194653299914, 46.87898089171974, 37.43455497382199] | 1610 | 5907 |
| 0.1712 | 2.0 | 928 | 0.1545 | 3.6109 | 0.0693 | [1136, 696, 371, 145] | [1610, 1197, 785, 382] | [70.55900621118012, 58.145363408521305, 47.261146496815286, 37.95811518324607] | 1610 | 5907 |
| 0.1626 | 3.0 | 1392 | 0.1515 | 3.5793 | 0.0692 | [1132, 692, 368, 143] | [1609, 1196, 784, 381] | [70.35425730267247, 57.85953177257525, 46.93877551020408, 37.53280839895013] | 1609 | 5907 |

### Framework versions

- Transformers 4.44.1
- Pytorch 2.6.0+cu124
- Datasets 2.14.4
- Tokenizers 0.19.1
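
The evaluation fields fit together via the usual BLEU formula: each precision is `100 * counts[n] / totals[n]`, the brevity penalty is `exp(1 - ref_len / sys_len)` when the output is shorter than the reference, and Score is the brevity penalty times the geometric mean of the four precisions. A quick check against the final-epoch numbers:

```python
import math

# Final-epoch evaluation fields from the table above.
counts = [1132, 692, 368, 143]
totals = [1609, 1196, 784, 381]
sys_len, ref_len = 1609, 5907

# n-gram precisions in percent, ~[70.35, 57.86, 46.94, 37.53].
precisions = [100 * c / t for c, t in zip(counts, totals)]

# Brevity penalty: penalizes output shorter than the reference.
bp = math.exp(1 - ref_len / sys_len) if sys_len < ref_len else 1.0
print(round(bp, 4))  # 0.0692, matching the reported Bp

# BLEU = brevity penalty * geometric mean of the n-gram precisions.
score = bp * math.exp(sum(math.log(p) for p in precisions) / len(precisions))
print(round(score, 2))  # ~3.58, matching the reported Score of 3.5793
```

The very low brevity penalty (1609 system tokens against 5907 reference tokens) is what pulls the BLEU score down to ~3.6 despite reasonable n-gram precisions.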
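
For reference, the hyperparameters above correspond to a `Seq2SeqTrainingArguments` configuration along these lines. This is a minimal sketch, not the original training script: `output_dir` is a placeholder, and the tokenized train/eval datasets are not described in this card.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, Seq2SeqTrainingArguments

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

# Mirrors the hyperparameter list above. Adam with betas=(0.9, 0.999) and
# epsilon=1e-08 is the Trainer's default optimizer, so it needs no explicit flag.
args = Seq2SeqTrainingArguments(
    output_dir="version_1305",    # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=12,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    eval_strategy="epoch",        # the table shows one evaluation per epoch
    predict_with_generate=True,   # required for generation metrics such as BLEU
)

# A Seq2SeqTrainer would then be built from `args`, `model`, `tokenizer`,
# and the (unspecified) tokenized train and eval datasets.
```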
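
A minimal inference sketch, assuming the checkpoint is saved under a placeholder id (`version_1305` here, standing in for a local output directory or a Hub repo id). The card does not specify the task, so the prompt is illustrative only.

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# "version_1305" is a placeholder for wherever this checkpoint lives.
tokenizer = AutoTokenizer.from_pretrained("version_1305")
model = AutoModelForSeq2SeqLM.from_pretrained("version_1305")

# The training data is not described in this card, so the actual task is
# unknown; this just demonstrates the seq2seq generation call.
inputs = tokenizer("summarize: The quick brown fox jumps over the lazy dog.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```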