---
license: apache-2.0
base_model: distilroberta-base
tags:
  - text-classification
  - generated_from_trainer
datasets:
  - glue
metrics:
  - accuracy
  - f1
model-index:
  - name: platzi_nlp_model_roberta_similaritytext
    results:
      - task:
          name: Text Classification
          type: text-classification
        dataset:
          name: datasetX
          type: glue
          config: mrpc
          split: validation
          args: mrpc
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.7965686274509803
          - name: F1
            type: f1
            value: 0.8482632541133455
---

# platzi_nlp_model_roberta_similaritytext

This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the datasetX dataset (GLUE, MRPC config, per the metadata above). It achieves the following results on the evaluation set:

- Loss: 0.9276
- Accuracy: 0.7966
- F1: 0.8483
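
Because the metadata describes a GLUE MRPC fine-tune, the checkpoint can presumably be used as a standard sentence-pair classifier. A minimal inference sketch follows; the repo id is an assumption inferred from the model name and is not stated in the card body.

```python
# Minimal inference sketch. The repo id below is an assumption; point it at
# wherever the checkpoint actually lives.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "jorgeduardo13/platzi_nlp_model_roberta_similaritytext"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# MRPC is a sentence-pair task: label 1 = paraphrase, label 0 = not a paraphrase.
inputs = tokenizer(
    "The company said its profits rose in the quarter.",
    "Quarterly profits increased, the company reported.",
    return_tensors="pt",
    truncation=True,
)
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(f"not_paraphrase={probs[0, 0]:.3f}  paraphrase={probs[0, 1]:.3f}")
```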

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
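
The metadata above points at GLUE's MRPC configuration, so the data can presumably be loaded as sketched below; this is an assumption, since the card body only names datasetX.

```python
# Assumed data source: GLUE MRPC, per the model-index metadata above.
from datasets import load_dataset

raw = load_dataset("glue", "mrpc")
print(raw)               # DatasetDict with train / validation / test splits
print(raw["train"][0])   # fields: sentence1, sentence2, label, idx
```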

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Trainer` sketch reproducing them follows the list):

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
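
A sketch wiring these values into the Hugging Face `Trainer`. The hyperparameter values come from the list above; the evaluation cadence is inferred from the 500-step rows in the results table below, and the Adam betas/epsilon match PyTorch defaults, so they need no explicit arguments.

```python
# Reproduction sketch under the hyperparameters listed above.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("distilroberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilroberta-base", num_labels=2)

# Assumed data source (GLUE MRPC), per the model-index metadata above.
raw = load_dataset("glue", "mrpc")
tokenized = raw.map(
    lambda batch: tokenizer(batch["sentence1"], batch["sentence2"], truncation=True),
    batched=True,
)

args = TrainingArguments(
    output_dir="platzi_nlp_model_roberta_similaritytext",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="steps",  # inferred, not stated: the table evaluates every 500 steps
    eval_steps=500,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    tokenizer=tokenizer,  # Trainer then pads batches with DataCollatorWithPadding
)
trainer.train()
```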

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.2258        | 1.09  | 500  | 0.9276          | 0.7966   | 0.8483 |
| 0.1733        | 2.18  | 1000 | 1.1506          | 0.8186   | 0.8754 |
| 0.1405        | 3.27  | 1500 | 1.2962          | 0.7990   | 0.8571 |
| 0.0545        | 4.36  | 2000 | 1.3339          | 0.8137   | 0.8685 |
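
The accuracy and F1 columns correspond to the `evaluate` library's GLUE MRPC metric. A sketch of the matching `compute_metrics` callback is below; the function signature follows the standard `Trainer` convention and is an assumption, not something stated in the card.

```python
# Accuracy/F1 computation implied by the metrics above.
import numpy as np
import evaluate

metric = evaluate.load("glue", "mrpc")

def compute_metrics(eval_preds):
    logits, labels = eval_preds
    predictions = np.argmax(logits, axis=-1)
    return metric.compute(predictions=predictions, references=labels)  # {'accuracy': ..., 'f1': ...}
```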

### Framework versions

- Transformers 4.32.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3