# PhoBERT human transfer learning syllable

Part of a collection of 5 models: PhoBERT transfer-learning (TL) training for hate speech detection (HSD) with human-reference annotated data. The numbers in the model names denote different random seeds.
This model is a fine-tuned version of vinai/phobert-base on an unknown dataset. Evaluation-set results are reported per epoch in the training results table below.
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
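Once the fine-tuned checkpoint is published, it can be loaded for classification. A minimal sketch, assuming the `transformers` library is installed; the repo id below is a placeholder (it points at the base model, not this fine-tuned checkpoint) and should be replaced with the actual model id from this collection. Note that this model expects syllable-level Vietnamese input rather than word-segmented text.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder id -- substitute the actual fine-tuned model id from this collection.
MODEL_ID = "vinai/phobert-base"


def predict(texts, model_id=MODEL_ID):
    """Return predicted label ids for a batch of Vietnamese sentences."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    model.eval()
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**enc).logits
    return logits.argmax(dim=-1).tolist()
```

The label-id-to-name mapping is not given in this card; check the checkpoint's `config.json` (`id2label`) once the actual model id is known.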
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:------:|
| No log        | 1.0   | 346  | 0.5156          | 0.8237   | 0.5296    | 0.3491 | 0.3313 |
| 0.5885        | 2.0   | 692  | 0.4706          | 0.8316   | 0.6120    | 0.4056 | 0.4188 |
| 0.4669        | 3.0   | 1038 | 0.4642          | 0.8323   | 0.6244    | 0.4044 | 0.4196 |
| 0.4669        | 4.0   | 1384 | 0.4664          | 0.8365   | 0.6932    | 0.4151 | 0.4351 |
| 0.4617        | 5.0   | 1730 | 0.4633          | 0.8361   | 0.6690    | 0.4145 | 0.4366 |
| 0.4579        | 6.0   | 2076 | 0.4603          | 0.8365   | 0.6619    | 0.4273 | 0.4483 |
| 0.4579        | 7.0   | 2422 | 0.4622          | 0.8365   | 0.6839    | 0.4147 | 0.4373 |
| 0.4554        | 8.0   | 2768 | 0.4528          | 0.8394   | 0.6474    | 0.4414 | 0.4654 |
| 0.4544        | 9.0   | 3114 | 0.4597          | 0.8338   | 0.6444    | 0.4071 | 0.4271 |
| 0.4544        | 10.0  | 3460 | 0.4606          | 0.8338   | 0.6544    | 0.4356 | 0.4510 |
| 0.4587        | 11.0  | 3806 | 0.4589          | 0.8353   | 0.6548    | 0.4105 | 0.4342 |
| 0.4488        | 12.0  | 4152 | 0.4534          | 0.8394   | 0.6550    | 0.4705 | 0.4951 |
| 0.4488        | 13.0  | 4498 | 0.4554          | 0.8361   | 0.6420    | 0.4216 | 0.4483 |
| 0.4575        | 14.0  | 4844 | 0.4562          | 0.8353   | 0.6472    | 0.4113 | 0.4332 |
| 0.4491        | 15.0  | 5190 | 0.4593          | 0.8338   | 0.6609    | 0.4020 | 0.4219 |
| 0.4586        | 16.0  | 5536 | 0.4665          | 0.8327   | 0.6580    | 0.3987 | 0.4161 |
| 0.4586        | 17.0  | 5882 | 0.4514          | 0.8383   | 0.6232    | 0.4356 | 0.4603 |
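The card does not state which averaging scheme is used for precision, recall, and F1. Macro averaging is an assumption here, but it would explain why accuracy (~0.84) sits far above F1 (~0.45) on an imbalanced hate-speech label distribution: the minority class drags the macro average down. A stdlib-only sketch of how these per-epoch metrics would be computed from predictions:

```python
from collections import Counter


def macro_metrics(y_true, y_pred):
    """Accuracy plus macro-averaged precision, recall, and F1.

    Hypothetical helper for illustration; the card does not confirm
    that macro (vs. weighted) averaging was used.
    """
    labels = sorted(set(y_true) | set(y_pred))
    pairs = Counter(zip(y_true, y_pred))
    precisions, recalls, f1s = [], [], []
    for c in labels:
        tp = pairs[(c, c)]
        fp = sum(v for (t, p), v in pairs.items() if p == c and t != c)
        fn = sum(v for (t, p), v in pairs.items() if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(f1)
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    n = len(labels)
    return acc, sum(precisions) / n, sum(recalls) / n, sum(f1s) / n
```

For labels observed in `y_true`, this agrees with `sklearn.metrics.precision_recall_fscore_support(..., average="macro")`, which the training script most likely used via the `evaluate`/`sklearn` ecosystem.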