---
library_name: transformers
license: mit
base_model: vinai/phobert-base
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: phobert-human-finetune-seed-6969
  results: []
---

# phobert-human-finetune-seed-6969

This model is a fine-tuned version of [vinai/phobert-base](https://huggingface.co/vinai/phobert-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7113
- Accuracy: 0.8578
- Precision: 0.6561
- Recall: 0.6715
- F1: 0.6635

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| No log        | 1.0   | 346  | 0.4059          | 0.8503   | 0.6647    | 0.5987 | 0.5996 |
| 0.4526        | 2.0   | 692  | 0.3766          | 0.8623   | 0.6736    | 0.6070 | 0.6324 |
| 0.2959        | 3.0   | 1038 | 0.4422          | 0.8705   | 0.7189    | 0.6002 | 0.6446 |
| 0.2959        | 4.0   | 1384 | 0.5210          | 0.8641   | 0.6842    | 0.6093 | 0.6351 |
| 0.1828        | 5.0   | 1730 | 0.6763          | 0.8376   | 0.6244    | 0.5879 | 0.5898 |
| 0.1244        | 6.0   | 2076 | 0.6496          | 0.8439   | 0.6431    | 0.6854 | 0.6595 |
| 0.1244        | 7.0   | 2422 | 0.6891          | 0.8582   | 0.6621    | 0.6294 | 0.6444 |
| 0.0898        | 8.0   | 2768 | 0.5961          | 0.8544   | 0.6481    | 0.6268 | 0.6368 |
| 0.0719        | 9.0   | 3114 | 0.7113          | 0.8578   | 0.6561    | 0.6715 | 0.6635 |
| 0.0719        | 10.0  | 3460 | 0.8041          | 0.8507   | 0.6407    | 0.6516 | 0.6450 |
| 0.0634        | 11.0  | 3806 | 0.8648          | 0.8638   | 0.6826    | 0.6394 | 0.6560 |
| 0.0566        | 12.0  | 4152 | 0.7883          | 0.8529   | 0.6554    | 0.6632 | 0.6530 |
| 0.0566        | 13.0  | 4498 | 0.8817          | 0.8626   | 0.6879    | 0.6220 | 0.6495 |
| 0.0455        | 14.0  | 4844 | 0.8372          | 0.8548   | 0.6582    | 0.6316 | 0.6439 |

### Framework versions

- Transformers 4.51.1
- Pytorch 2.5.1+cu124
- Datasets 3.5.0
- Tokenizers 0.21.0
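The hyperparameters and results table together pin down the learning-rate schedule: the table shows 346 optimizer steps per epoch, so the planned run is 346 × 50 = 17300 steps, with a linear warmup over the first 500 steps followed by linear decay toward zero. A small self-contained sketch of that arithmetic (training actually stopped logging after epoch 14, step 4844, and the reported evaluation numbers match the epoch-9 row, presumably the best checkpoint):

```python
# Schedule arithmetic implied by the hyperparameters and the results table.
# Steps per epoch (346) is read off the table (epoch 1.0 -> step 346).
steps_per_epoch = 346
num_epochs = 50
warmup_steps = 500
base_lr = 1e-4

total_steps = steps_per_epoch * num_epochs  # 17300 planned steps

def linear_lr(step):
    """Linear warmup to base_lr over warmup_steps, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# The last logged checkpoint (epoch 14) sits at step 14 * 346 = 4844,
# well short of the 50-epoch plan, so most of the decay was never reached.
last_logged_step = 14 * steps_per_epoch
print(total_steps, last_logged_step, linear_lr(last_logged_step))
```

Because training ended so early relative to the plan, the learning rate at the final logged step was still roughly three quarters of the peak value.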
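For readers who want to try the checkpoint, the sketch below shows one way to run inference with the `transformers` Auto classes. The repo id is a placeholder (the actual publication path is not stated in this card), and note that the PhoBERT base model expects word-segmented Vietnamese input; the `softmax` helper is a hypothetical pure-Python utility added here for illustration:

```python
def softmax(logits):
    """Turn a list of raw logits into probabilities (pure-Python helper)."""
    import math
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(texts, model_id="phobert-human-finetune-seed-6969"):
    """Run the fine-tuned classifier on a batch of (word-segmented) texts.

    model_id is a placeholder; substitute the real Hub repo or local path.
    Imports are kept local so the helper above stays dependency-free.
    """
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits
    # One probability distribution per input text.
    return [softmax(row.tolist()) for row in logits]
```

The number and meaning of the output classes are not documented in this card, so inspect `model.config.id2label` on the published checkpoint before interpreting the probabilities.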