---
library_name: transformers
license: apache-2.0
datasets:
- indonlp/indonlu
language:
- id
metrics:
- f1
- accuracy
- recall
base_model:
- FacebookAI/xlm-roberta-base
---
# Model Card for Indonesian Sentiment Analysis (XLM-RoBERTa)
A sentiment analysis model for the Indonesian language, fine-tuned from [xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base) on the [indonlp/indonlu](https://huggingface.co/datasets/indonlp/indonlu) dataset.
## Model Details
### Model Description
An XLM-RoBERTa-based text classification model fine-tuned to predict the sentiment of Indonesian text.
- **Developed by:** [Muhamad Rizky Yanuar](https://arcleife.github.io/portfolio/)
- **Model type:** [XLM-RoBERTa](https://huggingface.co/docs/transformers/en/model_doc/xlm-roberta)
- **Language(s) (NLP):** Indonesian
- **License:** [Apache license 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- **Finetuned from model:** [xlm-roberta-base](https://huggingface.co/FacebookAI/xlm-roberta-base)
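## How to Get Started with the Model
A minimal usage sketch with the 🤗 Transformers `pipeline` API. The repository id below is a placeholder, not the model's confirmed Hub path:
```python
from transformers import pipeline

# Placeholder repository id; replace with this model's actual Hub path.
classifier = pipeline("text-classification", model="arcleife/<model-id>")

# "The service at this restaurant is very satisfying!"
print(classifier("Pelayanan restoran ini sangat memuaskan!"))
# -> [{'label': '...', 'score': ...}]
```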
## Training Details
### Training Data
The sentiment analysis subset of the [IndoNLU](https://huggingface.co/datasets/indonlp/indonlu) dataset, created by indonlp.
### Training
The training script is available [here](https://github.com/arcleife/notebooks/blob/main/sentiment_finetuning.py).
**Training hyperparameters** (mapped to `TrainingArguments` in the sketch after this list):
- num_train_epochs = 5
- learning_rate = 5e-6
- weight_decay = 1e-1
- per_device_train_batch_size = 16
- per_device_eval_batch_size = 16
- fp16 = True
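As a rough sketch, these values map onto `transformers.TrainingArguments` as follows (model, tokenizer, and dataset wiring omitted; `output_dir` and the per-epoch evaluation setting are assumptions, not taken from the linked script):
```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlm-roberta-indonesian-sentiment",  # hypothetical path
    num_train_epochs=5,
    learning_rate=5e-6,
    weight_decay=1e-1,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    fp16=True,
    # Assumption: evaluate once per epoch, matching the Evaluation table.
    # On transformers < 4.41 this argument is named `evaluation_strategy`.
    eval_strategy="epoch",
)
```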
## Evaluation
|Epoch| Training Loss | Validation Loss | F1 | Recall | Precision |
|-----|---------------|-----------------|----------|----------|-----------|
|1 | No log | 0.283834 | 0.908730 | 0.908730 | 0.908730 |
|2 | No log | 0.248232 | 0.930952 | 0.930952 | 0.930952 |
|3 | No log | 0.282172 | 0.930952 | 0.930952 | 0.930952 |
|4 | No log | 0.257302 | 0.936508 | 0.936508 | 0.936508 |
|5 | No log | 0.271212 | 0.939683 | 0.939683 | 0.939683 |
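"No log" in the training-loss column is the `Trainer`'s placeholder when no training loss was recorded within the logging interval. F1, recall, and precision are identical at every epoch, which is consistent with micro-averaging: in multi-class classification, micro-averaged F1, recall, and precision all reduce to accuracy. A minimal `compute_metrics` sketch that would produce numbers of this form, assuming scikit-learn and the standard `Trainer` callback signature:
```python
import numpy as np
from sklearn.metrics import precision_recall_fscore_support

def compute_metrics(eval_pred):
    # `eval_pred` is a (logits, labels) pair supplied by the Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="micro", zero_division=0
    )
    return {"f1": f1, "recall": recall, "precision": precision}
```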