# xml-roberta-large-finetuned-sp-ner-mama-biomedical-corregido-BERT
This model is a fine-tuned version of dccuchile/bert-base-spanish-wwm-cased on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0341
- Precision: 0.9609
- Recall: 0.9721
- F1: 0.9665
- Accuracy: 0.9924
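As a quick sanity check, the reported F1 is the harmonic mean of the reported precision and recall:

```python
# Verify that the reported F1 is the harmonic mean of precision and recall.
precision = 0.9609
recall = 0.9721

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.9665, matching the reported F1
```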
## Model description
More information needed
## Intended uses & limitations
More information needed
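The card does not include a usage snippet. A minimal sketch for loading the checkpoint with the `transformers` token-classification pipeline might look like the following; the `aggregation_strategy` value is an assumption, and the BIO-merging helper is a hypothetical illustration (the checkpoint's actual label set is not documented on this card):

```python
def merge_bio(tokens, tags):
    """Merge parallel token/BIO-tag lists into (entity_text, label) spans.

    Hypothetical helper: generic "B-"/"I-" prefixes are assumed, since
    the model card does not document the checkpoint's label set.
    """
    entities, current = [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                entities.append(current)
            current = [tok, tag[2:]]
        elif tag.startswith("I-") and current and tag[2:] == current[1]:
            current[0] += " " + tok
        else:
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return [tuple(e) for e in entities]


def run_ner(text):
    """Run the fine-tuned checkpoint via the transformers pipeline.

    Requires `transformers` installed and network access to the Hub;
    the aggregation strategy is an assumption, not taken from the card.
    """
    from transformers import pipeline

    ner = pipeline(
        "token-classification",
        model="anvorja/xml-roberta-large-finetuned-sp-ner-mama-biomedical-corregido-BERT",
        aggregation_strategy="simple",
    )
    return ner(text)
```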
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 20
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 2.6667 | 1.0 | 86 | 2.3918 | 0.0 | 0.0 | 0.0 | 0.5532 |
| 1.3707 | 2.0 | 172 | 1.2955 | 0.3450 | 0.2604 | 0.2968 | 0.7181 |
| 0.8741 | 3.0 | 258 | 0.7168 | 0.5606 | 0.5266 | 0.5431 | 0.8370 |
| 0.5044 | 4.0 | 344 | 0.3668 | 0.6964 | 0.7486 | 0.7215 | 0.9188 |
| 0.3046 | 5.0 | 430 | 0.2093 | 0.7995 | 0.8387 | 0.8186 | 0.9523 |
| 0.2105 | 6.0 | 516 | 0.1476 | 0.8632 | 0.8914 | 0.8771 | 0.9654 |
| 0.1733 | 7.0 | 602 | 0.1131 | 0.8765 | 0.9125 | 0.8941 | 0.9723 |
| 0.1504 | 8.0 | 688 | 0.0910 | 0.8838 | 0.9304 | 0.9065 | 0.9775 |
| 0.1173 | 9.0 | 774 | 0.0732 | 0.9120 | 0.9446 | 0.9280 | 0.9831 |
| 0.1027 | 10.0 | 860 | 0.0640 | 0.9217 | 0.9499 | 0.9356 | 0.9854 |
| 0.0741 | 11.0 | 946 | 0.0570 | 0.9371 | 0.9584 | 0.9476 | 0.9879 |
| 0.0741 | 12.0 | 1032 | 0.0512 | 0.9366 | 0.9573 | 0.9468 | 0.9877 |
| 0.0748 | 13.0 | 1118 | 0.0448 | 0.9466 | 0.9631 | 0.9548 | 0.9898 |
| 0.0559 | 14.0 | 1204 | 0.0413 | 0.9522 | 0.9663 | 0.9592 | 0.9907 |
| 0.0611 | 15.0 | 1290 | 0.0388 | 0.9563 | 0.9694 | 0.9628 | 0.9913 |
| 0.053 | 16.0 | 1376 | 0.0361 | 0.9573 | 0.9700 | 0.9636 | 0.9917 |
| 0.0615 | 17.0 | 1462 | 0.0346 | 0.9583 | 0.9700 | 0.9641 | 0.9919 |
| 0.0452 | 18.0 | 1548 | 0.0341 | 0.9609 | 0.9721 | 0.9665 | 0.9924 |
| 0.0534 | 19.0 | 1634 | 0.0344 | 0.9604 | 0.9715 | 0.9659 | 0.9922 |
| 0.048 | 19.7719 | 1700 | 0.0344 | 0.9604 | 0.9715 | 0.9659 | 0.9922 |
### Framework versions
- Transformers 4.50.2
- Pytorch 2.5.1+cu124
- Datasets 3.5.0
- Tokenizers 0.21.0