---
license: apache-2.0
datasets:
  - race
language:
  - en
tags:
  - text classification
  - multiple-choice
---

# Model Card for DistilRoBERTa fine-tuned on RACE

This model was fine-tuned on [RACE](https://huggingface.co/datasets/race) for multiple-choice question answering (cast as text classification). The base model was [distilroberta-base](https://huggingface.co/distilroberta-base).

The model was trained using the code from [zphang/lrqa](https://github.com/zphang/lrqa); please refer to and cite the authors of that repository.

## Model Details

- Initial model: [distilroberta-base](https://huggingface.co/distilroberta-base)
- Learning rate: 1e-5
- Epochs: 3
- Warmup ratio: 0.1 (10%)
- Batch size: 16
- Max sequence length: 512
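For reference, these hyperparameters map onto the standard Hugging Face `Trainer` API roughly as follows. This is a sketch: training actually used the lrqa scripts, whose configuration names may differ, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Hyperparameters as reported above, expressed via the Trainer API.
# The lrqa training scripts may name or apply these differently.
training_args = TrainingArguments(
    output_dir="distilroberta-race",  # hypothetical output path
    learning_rate=1e-5,
    num_train_epochs=3,
    warmup_ratio=0.1,                 # 10% of total training steps
    per_device_train_batch_size=16,
)
# The max sequence length (512) is applied at tokenization time, not here.
```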

## Model Description

- Model type: DistilRoBERTa
- Language(s) (NLP): English
- License: Apache-2.0
- Finetuned from model: [distilroberta-base](https://huggingface.co/distilroberta-base)

## Model Sources

- Repository (training code): https://github.com/zphang/lrqa

## Bias, Risks, and Limitations

[More Information Needed]

## How to Get Started with the Model

Use the code below to get started with the model.
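A minimal inference sketch, assuming the checkpoint loads with `AutoModelForMultipleChoice`; the repository ID and the passage/question/option formatting below are illustrative assumptions and may differ from the exact input scheme used by the lrqa training code.

```python
import torch
from transformers import AutoModelForMultipleChoice, AutoTokenizer

# Placeholder repository ID; substitute the actual model ID from the Hub.
model_id = "gsgoncalves/distilroberta-race"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMultipleChoice.from_pretrained(model_id)
model.eval()

article = "The quick brown fox jumps over the lazy dog."
question = "What does the fox jump over?"
options = ["a fence", "the lazy dog", "a river", "the moon"]

# RACE-style multiple choice: pair the same passage with each
# (question + candidate answer), then score all pairs jointly.
encoding = tokenizer(
    [article] * len(options),
    [f"{question} {option}" for option in options],
    truncation=True,
    max_length=512,
    padding=True,
    return_tensors="pt",
)
# The multiple-choice head expects shape (batch, num_choices, seq_len).
inputs = {k: v.unsqueeze(0) for k, v in encoding.items()}
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_choices)
print("Predicted answer:", options[logits.argmax(dim=-1).item()])
```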

## Training Details

### Training Data

The model was fine-tuned on the [RACE](https://huggingface.co/datasets/race) reading-comprehension dataset.
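RACE is available on the Hugging Face Hub; a minimal loading sketch using the `datasets` library, where the `all` configuration combines the middle- and high-school subsets:

```python
from datasets import load_dataset

# Load RACE from the Hugging Face Hub.
race = load_dataset("race", "all")
print(race)              # train / validation / test splits
print(race["train"][0])  # fields include article, question, options, answer
```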

## Model Examination

[More Information Needed]

## Environmental Impact

- Hardware type: A100 PCIe (40GB)
- Hours used: 4
- Cloud provider: Private infrastructure
- Compute region: Portugal
- Carbon emitted: 0.18 kg CO₂eq

Experiments were conducted on a private infrastructure with a carbon efficiency of 0.178 kg CO₂eq/kWh. A cumulative 4 hours of computation was performed on A100 PCIe 40/80GB hardware (TDP of 250W). Total emissions are estimated at 0.18 kg CO₂eq, of which 0% were directly offset. Estimates were produced with the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al., 2019](https://arxiv.org/abs/1910.09700).
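For reference, the reported figure follows directly from the stated power draw, runtime, and carbon efficiency:

```python
# Reproducing the estimate: energy (kWh) = power (kW) * time (h);
# emissions (kg CO2eq) = energy * grid carbon efficiency.
tdp_kw = 0.250           # A100 PCIe TDP: 250 W
hours = 4
kg_co2_per_kwh = 0.178   # carbon efficiency reported above

energy_kwh = tdp_kw * hours              # 1.0 kWh
emissions = energy_kwh * kg_co2_per_kwh  # 0.178, reported as 0.18
print(f"{emissions:.2f} kg CO2eq")
```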