# aesthetic_attribute_classifier
This model is a fine-tuned version of distilbert-base-uncased on the PCCD dataset. It achieves the following results on the evaluation set:
- Loss: 0.3976
- Precision: 0.8771
- Recall: 0.8751
- F1: 0.8755
- Accuracy: 0.8751

These figures correspond to the epoch-2 checkpoint, which has the lowest validation loss in the training results below.
## Model description
More information needed
## Intended uses & limitations
More information needed
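In the absence of documented usage, here is a minimal inference sketch. The Hub repository id is hypothetical, and the label set depends on how the PCCD aesthetic attributes were encoded at fine-tuning time:

```python
from transformers import pipeline

# Hypothetical repository id -- replace with the actual model path.
classifier = pipeline(
    "text-classification",
    model="<user>/aesthetic_attribute_classifier",
)

# Classify a photo-critique comment into an aesthetic attribute.
result = classifier("The shallow depth of field isolates the subject nicely.")
print(result)  # e.g. [{'label': ..., 'score': ...}] -- labels come from the fine-tuning config
```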
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training; a `TrainingArguments` sketch follows the list:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
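
The listed hyperparameters map directly onto a `TrainingArguments` configuration. This is a sketch only: the dataset loading, `Trainer` wiring, and `output_dir` are assumptions, not taken from the card:

```python
from transformers import TrainingArguments

# Configuration sketch reproducing the reported hyperparameters.
training_args = TrainingArguments(
    output_dir="aesthetic_attribute_classifier",  # illustrative path
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 is the optimizer default.
)
```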
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.452         | 1.0   | 1528 | 0.4109          | 0.8633    | 0.8615 | 0.8619 | 0.8615   |
| 0.3099        | 2.0   | 3056 | 0.3976          | 0.8771    | 0.8751 | 0.8755 | 0.8751   |
| 0.227         | 3.0   | 4584 | 0.4320          | 0.8762    | 0.8744 | 0.8747 | 0.8744   |
| 0.1645        | 4.0   | 6112 | 0.4840          | 0.8725    | 0.8715 | 0.8715 | 0.8715   |
| 0.1141        | 5.0   | 7640 | 0.5083          | 0.8755    | 0.8748 | 0.8749 | 0.8748   |
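
A plausible `compute_metrics` implementation, sketched under two assumptions: the per-metric objects from `datasets.load_metric` were used (their `compute()` dictionaries explain the nested `{'precision': ...}` formatting in the auto-generated card), and averaging was weighted, which is consistent with recall matching accuracy exactly at every epoch:

```python
import numpy as np
from datasets import load_metric

precision_metric = load_metric("precision")
recall_metric = load_metric("recall")
f1_metric = load_metric("f1")
accuracy_metric = load_metric("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Weighted averaging is an assumption; for multiclass labels,
    # weighted recall equals accuracy, matching the table above.
    return {
        "precision": precision_metric.compute(
            predictions=preds, references=labels, average="weighted")["precision"],
        "recall": recall_metric.compute(
            predictions=preds, references=labels, average="weighted")["recall"],
        "f1": f1_metric.compute(
            predictions=preds, references=labels, average="weighted")["f1"],
        "accuracy": accuracy_metric.compute(
            predictions=preds, references=labels)["accuracy"],
    }
```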
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.2+cu113
- Datasets 1.18.3
- Tokenizers 0.11.0