---
library_name: transformers
license: apache-2.0
base_model: distilbert/distilbert-base-uncased-finetuned-sst-2-english
tags:
  - generated_from_trainer
metrics:
  - f1
model-index:
  - name: distilbert-base-uncased-finetuned-sst-2-english_07112024T180048
    results: []
---

# distilbert-base-uncased-finetuned-sst-2-english_07112024T180048

This model is a fine-tuned version of [distilbert/distilbert-base-uncased-finetuned-sst-2-english](https://huggingface.co/distilbert/distilbert-base-uncased-finetuned-sst-2-english) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.4594
- F1: 0.8486
- Learning Rate: 0.0 (final value of the cosine schedule)
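
As a quick sanity check, the checkpoint can be loaded like any sequence-classification model on the Hub. A minimal inference sketch follows; the repo id is a placeholder (an assumption), so substitute this repository's actual path:

```python
# Minimal inference sketch. The repo id below is a placeholder (assumption);
# replace it with this repository's actual Hub path.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="<namespace>/distilbert-base-uncased-finetuned-sst-2-english_07112024T180048",
)
print(classifier("This is a test sentence."))
```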

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 600
- num_epochs: 20
- mixed_precision_training: Native AMP
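
As a rough guide, these settings map onto `transformers.TrainingArguments` as sketched below. This is not the exact training script: `output_dir` and the `fp16` flag (the usual way "Native AMP" is enabled on CUDA) are assumptions.

```python
# Sketch of the hyperparameters above expressed as TrainingArguments.
# output_dir and fp16=True are assumptions, not taken from the training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-sst-2-english_07112024T180048",
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,  # 8 * 4 = total_train_batch_size of 32
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=600,
    num_train_epochs=20,
    fp16=True,  # Native AMP mixed-precision training
)
```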

### Training results

| Training Loss | Epoch   | Step | Validation Loss | F1     | Learning Rate |
|:-------------:|:-------:|:----:|:---------------:|:------:|:-------------:|
| No log        | 0.9942  | 86   | 1.7769          | 0.0796 | 0.0000        |
| No log        | 2.0     | 173  | 1.7161          | 0.2057 | 0.0000        |
| No log        | 2.9942  | 259  | 1.6015          | 0.3386 | 0.0000        |
| No log        | 4.0     | 346  | 1.4321          | 0.4359 | 0.0000        |
| No log        | 4.9942  | 432  | 1.2548          | 0.5262 | 0.0000        |
| 1.5627        | 6.0     | 519  | 1.0659          | 0.5697 | 0.0000        |
| 1.5627        | 6.9942  | 605  | 0.9261          | 0.6140 | 1e-05         |
| 1.5627        | 8.0     | 692  | 0.8197          | 0.6649 | 0.0000        |
| 1.5627        | 8.9942  | 778  | 0.7274          | 0.7060 | 0.0000        |
| 1.5627        | 10.0    | 865  | 0.6496          | 0.7548 | 0.0000        |
| 1.5627        | 10.9942 | 951  | 0.5997          | 0.7741 | 0.0000        |
| 0.7439        | 12.0    | 1038 | 0.5462          | 0.8076 | 0.0000        |
| 0.7439        | 12.9942 | 1124 | 0.5300          | 0.8141 | 0.0000        |
| 0.7439        | 14.0    | 1211 | 0.4898          | 0.8314 | 0.0000        |
| 0.7439        | 14.9942 | 1297 | 0.4791          | 0.8382 | 0.0000        |
| 0.7439        | 16.0    | 1384 | 0.4756          | 0.8399 | 0.0000        |
| 0.7439        | 16.9942 | 1470 | 0.4668          | 0.8457 | 0.0000        |
| 0.3573        | 18.0    | 1557 | 0.4600          | 0.8489 | 5e-07         |
| 0.3573        | 18.9942 | 1643 | 0.4594          | 0.8486 | 1e-07         |
| 0.3573        | 19.8844 | 1720 | 0.4596          | 0.8483 | 0.0           |
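
The Learning Rate column traces the configured schedule: linear warmup toward 1e-05 over the first 600 steps, then cosine decay to zero by the final step (~1720). A minimal sketch of that schedule, assuming the stock `get_cosine_schedule_with_warmup` helper and a dummy optimizer purely for illustration:

```python
# Sketch of the implied LR schedule: 600 warmup steps, cosine decay over
# 1720 total steps. The dummy parameter/optimizer exist only to drive the
# scheduler; they are not part of the actual training script.
import torch
from transformers import get_cosine_schedule_with_warmup

param = torch.nn.Parameter(torch.zeros(1))
optimizer = torch.optim.AdamW([param], lr=1e-5)
scheduler = get_cosine_schedule_with_warmup(
    optimizer, num_warmup_steps=600, num_training_steps=1720
)
for step in range(1, 1721):
    optimizer.step()
    scheduler.step()
    if step in (600, 1038, 1720):
        print(f"step {step}: lr = {scheduler.get_last_lr()[0]:.2e}")
```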

### Framework versions

- Transformers 4.44.2
- PyTorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.19.1