nerui-pt-pl50-1

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0647
  • Location Precision: 0.9008
  • Location Recall: 0.9397
  • Location F1: 0.9198
  • Location Number: 116
  • Organization Precision: 0.9484
  • Organization Recall: 0.9304
  • Organization F1: 0.9393
  • Organization Number: 158
  • Person Precision: 0.9840
  • Person Recall: 0.9919
  • Person F1: 0.9880
  • Person Number: 124
  • Overall Precision: 0.9451
  • Overall Recall: 0.9523
  • Overall F1: 0.9487
  • Overall Accuracy: 0.9876
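
Since F1 is the harmonic mean of precision and recall, the reported F1 values can be sanity-checked against the reported precision and recall. A minimal check (the P/R values above are rounded to four decimals, so a small tolerance is needed):

```python
def f1(p, r):
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# name: (precision, recall, reported F1), taken from the evaluation list above
reported = {
    "Location": (0.9008, 0.9397, 0.9198),
    "Organization": (0.9484, 0.9304, 0.9393),
    "Person": (0.9840, 0.9919, 0.9880),
    "Overall": (0.9451, 0.9523, 0.9487),
}

for name, (p, r, f) in reported.items():
    # inputs are 4-decimal roundings, so allow a small tolerance
    assert abs(f1(p, r) - f) < 5e-4, name
print("all reported F1 scores are consistent with their P and R")
```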

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
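
With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate presumably decays linearly from 5e-05 to 0 over the run's 9600 optimizer steps (96 steps per epoch × 100 epochs, per the results table below). A minimal sketch of that schedule, assuming zero warmup:

```python
def linear_lr(step, base_lr=5e-5, total_steps=9600, warmup_steps=0):
    """Linear warmup to base_lr, then linear decay to 0 at total_steps.

    warmup_steps=0 is an assumption; the card does not list a warmup value.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0))     # base learning rate at step 0
print(linear_lr(4800))  # roughly halved at the midpoint
print(linear_lr(9600))  # decayed to 0 at the final step
```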

Training results

TrainLoss Epoch Step ValLoss Loc-P Loc-R Loc-F1 Loc-N Org-P Org-R Org-F1 Org-N Per-P Per-R Per-F1 Per-N All-P All-R All-F1 All-Acc
(P = precision, R = recall, N = entity count; Loc/Org/Per = Location/Organization/Person; All = overall)
0.8754 1.0 96 0.4168 0.0 0.0 0.0 116 0.2295 0.0886 0.1279 158 0.2059 0.0565 0.0886 124 0.2165 0.0528 0.0848 0.8471
0.3618 2.0 192 0.1870 0.5276 0.5776 0.5514 116 0.6203 0.6203 0.6203 158 0.7535 0.8629 0.8045 124 0.6370 0.6834 0.6594 0.9473
0.1871 3.0 288 0.0962 0.8017 0.8017 0.8017 116 0.75 0.8165 0.7818 158 0.9528 0.9758 0.9641 124 0.8265 0.8618 0.8438 0.9728
0.1327 4.0 384 0.0799 0.8015 0.9052 0.8502 116 0.7624 0.8734 0.8142 158 0.9762 0.9919 0.9840 124 0.8356 0.9196 0.8756 0.9742
0.1113 5.0 480 0.0618 0.8679 0.7931 0.8288 116 0.8090 0.9114 0.8571 158 0.9683 0.9839 0.976 124 0.8732 0.8995 0.8861 0.9778
0.0987 6.0 576 0.0525 0.8295 0.9224 0.8735 116 0.9038 0.8924 0.8981 158 0.9685 0.9919 0.9801 124 0.9005 0.9322 0.9160 0.9846
0.0886 7.0 672 0.0503 0.9217 0.9138 0.9177 116 0.8362 0.9367 0.8836 158 0.984 0.9919 0.9880 124 0.9041 0.9472 0.9252 0.9827
0.0784 8.0 768 0.0430 0.8678 0.9052 0.8861 116 0.9045 0.8987 0.9016 158 0.9762 0.9919 0.9840 124 0.9158 0.9296 0.9227 0.9844
0.0697 9.0 864 0.0408 0.8871 0.9483 0.9167 116 0.9006 0.9177 0.9091 158 0.9839 0.9839 0.9839 124 0.9218 0.9472 0.9343 0.9860
0.0622 10.0 960 0.0444 0.8790 0.9397 0.9083 116 0.9068 0.9241 0.9154 158 0.9609 0.9919 0.9762 124 0.9153 0.9497 0.9322 0.9846
0.0608 11.0 1056 0.0410 0.9244 0.9483 0.9362 116 0.9434 0.9494 0.9464 158 0.984 0.9919 0.9880 124 0.9504 0.9623 0.9563 0.9876
0.0534 12.0 1152 0.0352 0.9237 0.9397 0.9316 116 0.9481 0.9241 0.9359 158 0.984 0.9919 0.9880 124 0.9521 0.9497 0.9509 0.9893
0.0523 13.0 1248 0.0367 0.9174 0.9569 0.9367 116 0.9430 0.9430 0.9430 158 0.984 0.9919 0.9880 124 0.9480 0.9623 0.9551 0.9885
0.0454 14.0 1344 0.0386 0.9083 0.9397 0.9237 116 0.9236 0.9177 0.9206 158 0.984 0.9919 0.9880 124 0.9378 0.9472 0.9425 0.9871
0.0473 15.0 1440 0.0402 0.9091 0.9483 0.9283 116 0.9427 0.9367 0.9397 158 0.9762 0.9919 0.9840 124 0.9431 0.9573 0.9501 0.9879
0.0428 16.0 1536 0.0376 0.9024 0.9569 0.9289 116 0.9177 0.9177 0.9177 158 0.984 0.9919 0.9880 124 0.9335 0.9523 0.9428 0.9882
0.0402 17.0 1632 0.0414 0.9328 0.9569 0.9447 116 0.9074 0.9304 0.9187 158 0.9762 0.9919 0.9840 124 0.9361 0.9573 0.9466 0.9882
0.0374 18.0 1728 0.0414 0.9244 0.9483 0.9362 116 0.9423 0.9304 0.9363 158 0.9762 0.9919 0.9840 124 0.9476 0.9548 0.9512 0.9887
0.0373 19.0 1824 0.0387 0.9217 0.9138 0.9177 116 0.9245 0.9304 0.9274 158 0.984 0.9919 0.9880 124 0.9424 0.9447 0.9435 0.9876
0.0343 20.0 1920 0.0460 0.9153 0.9310 0.9231 116 0.8902 0.9241 0.9068 158 0.984 0.9919 0.9880 124 0.9263 0.9472 0.9366 0.9852
0.0355 21.0 2016 0.0434 0.8889 0.9655 0.9256 116 0.9281 0.8987 0.9132 158 0.984 0.9919 0.9880 124 0.9332 0.9472 0.9401 0.9868
0.0315 22.0 2112 0.0415 0.8689 0.9138 0.8908 116 0.9295 0.9177 0.9236 158 0.984 0.9919 0.9880 124 0.9280 0.9397 0.9338 0.9871
0.031 23.0 2208 0.0452 0.9237 0.9397 0.9316 116 0.9130 0.9304 0.9216 158 0.9609 0.9919 0.9762 124 0.9312 0.9523 0.9416 0.9865
0.0287 24.0 2304 0.0481 0.8871 0.9483 0.9167 116 0.9351 0.9114 0.9231 158 0.9685 0.9919 0.9801 124 0.9309 0.9472 0.9390 0.9860
0.0271 25.0 2400 0.0456 0.896 0.9655 0.9295 116 0.9533 0.9051 0.9286 158 0.984 0.9919 0.9880 124 0.945 0.9497 0.9474 0.9885
0.0286 26.0 2496 0.0441 0.9 0.9310 0.9153 116 0.9177 0.9177 0.9177 158 0.984 0.9919 0.9880 124 0.9330 0.9447 0.9388 0.9865
0.0289 27.0 2592 0.0387 0.9083 0.9397 0.9237 116 0.9125 0.9241 0.9182 158 0.984 0.9919 0.9880 124 0.9333 0.9497 0.9415 0.9868
0.0247 28.0 2688 0.0439 0.9160 0.9397 0.9277 116 0.9141 0.9430 0.9283 158 0.9762 0.9919 0.9840 124 0.9338 0.9573 0.9454 0.9879
0.0245 29.0 2784 0.0412 0.9083 0.9397 0.9237 116 0.9308 0.9367 0.9338 158 0.9762 0.9919 0.9840 124 0.9383 0.9548 0.9465 0.9879
0.0227 30.0 2880 0.0437 0.9231 0.9310 0.9270 116 0.9193 0.9367 0.9279 158 0.9762 0.9919 0.9840 124 0.9381 0.9523 0.9451 0.9876
0.0216 31.0 2976 0.0412 0.9160 0.9397 0.9277 116 0.9136 0.9367 0.9250 158 0.984 0.9919 0.9880 124 0.9360 0.9548 0.9453 0.9879
0.022 32.0 3072 0.0425 0.9008 0.9397 0.9198 116 0.9074 0.9304 0.9187 158 0.9762 0.9919 0.9840 124 0.9267 0.9523 0.9393 0.9868
0.0202 33.0 3168 0.0415 0.9174 0.9569 0.9367 116 0.9245 0.9304 0.9274 158 0.984 0.9919 0.9880 124 0.9407 0.9573 0.9489 0.9882
0.0217 34.0 3264 0.0376 0.9 0.9310 0.9153 116 0.925 0.9367 0.9308 158 0.9762 0.9919 0.9840 124 0.9335 0.9523 0.9428 0.9885
0.0208 35.0 3360 0.0405 0.9231 0.9310 0.9270 116 0.9193 0.9367 0.9279 158 0.9762 0.9919 0.9840 124 0.9381 0.9523 0.9451 0.9882
0.02 36.0 3456 0.0448 0.888 0.9569 0.9212 116 0.9342 0.8987 0.9161 158 0.984 0.9919 0.9880 124 0.9353 0.9447 0.94 0.9876
0.0176 37.0 3552 0.0501 0.9076 0.9310 0.9191 116 0.9068 0.9241 0.9154 158 0.984 0.9919 0.9880 124 0.9309 0.9472 0.9390 0.9865
0.0197 38.0 3648 0.0426 0.9076 0.9310 0.9191 116 0.9367 0.9367 0.9367 158 0.9762 0.9919 0.9840 124 0.9404 0.9523 0.9463 0.9887
0.0164 39.0 3744 0.0417 0.9153 0.9310 0.9231 116 0.9484 0.9304 0.9393 158 0.9683 0.9839 0.976 124 0.9449 0.9472 0.9460 0.9890
0.018 40.0 3840 0.0466 0.9008 0.9397 0.9198 116 0.9245 0.9304 0.9274 158 0.9606 0.9839 0.9721 124 0.9287 0.9497 0.9391 0.9879
0.0178 41.0 3936 0.0426 0.9231 0.9310 0.9270 116 0.8941 0.9620 0.9268 158 0.984 0.9919 0.9880 124 0.9296 0.9623 0.9457 0.9890
0.0166 42.0 4032 0.0427 0.9167 0.9483 0.9322 116 0.9427 0.9367 0.9397 158 0.984 0.9919 0.9880 124 0.9478 0.9573 0.9525 0.9890
0.0166 43.0 4128 0.0445 0.9244 0.9483 0.9362 116 0.9416 0.9177 0.9295 158 0.984 0.9919 0.9880 124 0.9497 0.9497 0.9497 0.9885
0.0162 44.0 4224 0.0490 0.9160 0.9397 0.9277 116 0.9351 0.9114 0.9231 158 0.984 0.9919 0.9880 124 0.9447 0.9447 0.9447 0.9863
0.0143 45.0 4320 0.0536 0.9008 0.9397 0.9198 116 0.9419 0.9241 0.9329 158 0.984 0.9919 0.9880 124 0.9426 0.9497 0.9462 0.9871
0.0157 46.0 4416 0.0547 0.8952 0.9569 0.925 116 0.9355 0.9177 0.9265 158 0.9762 0.9919 0.9840 124 0.9358 0.9523 0.9440 0.9863
0.0134 47.0 4512 0.0549 0.8952 0.9569 0.925 116 0.9481 0.9241 0.9359 158 0.984 0.9919 0.9880 124 0.9429 0.9548 0.9488 0.9882
0.0146 48.0 4608 0.0544 0.8943 0.9483 0.9205 116 0.9412 0.9114 0.9260 158 0.984 0.9919 0.9880 124 0.9401 0.9472 0.9437 0.9882
0.0141 49.0 4704 0.0545 0.9153 0.9310 0.9231 116 0.9484 0.9304 0.9393 158 0.984 0.9919 0.9880 124 0.9497 0.9497 0.9497 0.9885
0.0121 50.0 4800 0.0555 0.9153 0.9310 0.9231 116 0.9355 0.9177 0.9265 158 0.984 0.9919 0.9880 124 0.9447 0.9447 0.9447 0.9879
0.0137 51.0 4896 0.0568 0.8871 0.9483 0.9167 116 0.9603 0.9177 0.9385 158 0.984 0.9919 0.9880 124 0.945 0.9497 0.9474 0.9882
0.0141 52.0 4992 0.0564 0.9024 0.9569 0.9289 116 0.9539 0.9177 0.9355 158 0.984 0.9919 0.9880 124 0.9475 0.9523 0.9499 0.9879
0.0137 53.0 5088 0.0633 0.9091 0.9483 0.9283 116 0.9299 0.9241 0.9270 158 0.9762 0.9919 0.9840 124 0.9381 0.9523 0.9451 0.9865
0.0139 54.0 5184 0.0538 0.9 0.9310 0.9153 116 0.9487 0.9367 0.9427 158 0.984 0.9919 0.9880 124 0.9451 0.9523 0.9487 0.9882
0.0122 55.0 5280 0.0595 0.9 0.9310 0.9153 116 0.9359 0.9241 0.9299 158 0.984 0.9919 0.9880 124 0.9401 0.9472 0.9437 0.9871
0.0129 56.0 5376 0.0542 0.8952 0.9569 0.925 116 0.9481 0.9241 0.9359 158 0.984 0.9919 0.9880 124 0.9429 0.9548 0.9488 0.9879
0.0116 57.0 5472 0.0537 0.9244 0.9483 0.9362 116 0.9363 0.9304 0.9333 158 0.984 0.9919 0.9880 124 0.9476 0.9548 0.9512 0.9882
0.0114 58.0 5568 0.0576 0.8952 0.9569 0.925 116 0.9477 0.9177 0.9325 158 0.984 0.9919 0.9880 124 0.9428 0.9523 0.9475 0.9876
0.0108 59.0 5664 0.0537 0.9167 0.9483 0.9322 116 0.9427 0.9367 0.9397 158 0.984 0.9919 0.9880 124 0.9478 0.9573 0.9525 0.9887
0.0119 60.0 5760 0.0636 0.9091 0.9483 0.9283 116 0.9286 0.9051 0.9167 158 0.984 0.9919 0.9880 124 0.94 0.9447 0.9424 0.9876
0.0129 61.0 5856 0.0602 0.9153 0.9310 0.9231 116 0.9236 0.9177 0.9206 158 0.984 0.9919 0.9880 124 0.94 0.9447 0.9424 0.9876
0.0109 62.0 5952 0.0561 0.9160 0.9397 0.9277 116 0.9245 0.9304 0.9274 158 0.984 0.9919 0.9880 124 0.9404 0.9523 0.9463 0.9882
0.0105 63.0 6048 0.0564 0.9167 0.9483 0.9322 116 0.9299 0.9241 0.9270 158 0.984 0.9919 0.9880 124 0.9428 0.9523 0.9475 0.9885
0.0114 64.0 6144 0.0553 0.9091 0.9483 0.9283 116 0.9172 0.9114 0.9143 158 0.9762 0.9919 0.9840 124 0.9332 0.9472 0.9401 0.9871
0.0103 65.0 6240 0.0584 0.8934 0.9397 0.9160 116 0.9177 0.9177 0.9177 158 0.9762 0.9919 0.9840 124 0.9286 0.9472 0.9378 0.9865
0.0109 66.0 6336 0.0594 0.9106 0.9655 0.9372 116 0.9416 0.9177 0.9295 158 0.984 0.9919 0.9880 124 0.9453 0.9548 0.95 0.9876
0.0097 67.0 6432 0.0575 0.9024 0.9569 0.9289 116 0.9477 0.9177 0.9325 158 0.984 0.9919 0.9880 124 0.9451 0.9523 0.9487 0.9882
0.0102 68.0 6528 0.0582 0.9016 0.9483 0.9244 116 0.9290 0.9114 0.9201 158 0.984 0.9919 0.9880 124 0.9378 0.9472 0.9425 0.9871
0.0106 69.0 6624 0.0661 0.9016 0.9483 0.9244 116 0.9412 0.9114 0.9260 158 0.984 0.9919 0.9880 124 0.9425 0.9472 0.9449 0.9874
0.0092 70.0 6720 0.0558 0.8952 0.9569 0.925 116 0.96 0.9114 0.9351 158 0.984 0.9919 0.9880 124 0.9474 0.9497 0.9486 0.9887
0.0099 71.0 6816 0.0586 0.9024 0.9569 0.9289 116 0.9671 0.9304 0.9484 158 0.984 0.9919 0.9880 124 0.9525 0.9573 0.9549 0.9887
0.0081 72.0 6912 0.0618 0.9091 0.9483 0.9283 116 0.9355 0.9177 0.9265 158 0.984 0.9919 0.9880 124 0.9426 0.9497 0.9462 0.9879
0.0098 73.0 7008 0.0564 0.9016 0.9483 0.9244 116 0.9182 0.9241 0.9211 158 0.9762 0.9919 0.9840 124 0.9312 0.9523 0.9416 0.9879
0.0087 74.0 7104 0.0612 0.8943 0.9483 0.9205 116 0.9542 0.9241 0.9389 158 0.984 0.9919 0.9880 124 0.9451 0.9523 0.9487 0.9876
0.0086 75.0 7200 0.0575 0.9024 0.9569 0.9289 116 0.9608 0.9304 0.9453 158 0.984 0.9919 0.9880 124 0.9501 0.9573 0.9537 0.9882
0.0096 76.0 7296 0.0598 0.8952 0.9569 0.925 116 0.9419 0.9241 0.9329 158 0.984 0.9919 0.9880 124 0.9406 0.9548 0.9476 0.9876
0.0091 77.0 7392 0.0563 0.9091 0.9483 0.9283 116 0.9536 0.9114 0.9320 158 0.984 0.9919 0.9880 124 0.9496 0.9472 0.9484 0.9885
0.0078 78.0 7488 0.0562 0.9153 0.9310 0.9231 116 0.9363 0.9304 0.9333 158 0.984 0.9919 0.9880 124 0.945 0.9497 0.9474 0.9885
0.0074 79.0 7584 0.0587 0.9083 0.9397 0.9237 116 0.9427 0.9367 0.9397 158 0.984 0.9919 0.9880 124 0.9453 0.9548 0.95 0.9887
0.0083 80.0 7680 0.0596 0.9091 0.9483 0.9283 116 0.9363 0.9304 0.9333 158 0.984 0.9919 0.9880 124 0.9429 0.9548 0.9488 0.9887
0.0081 81.0 7776 0.0574 0.9083 0.9397 0.9237 116 0.9427 0.9367 0.9397 158 0.984 0.9919 0.9880 124 0.9453 0.9548 0.95 0.9893
0.0085 82.0 7872 0.0577 0.9024 0.9569 0.9289 116 0.9605 0.9241 0.9419 158 0.984 0.9919 0.9880 124 0.95 0.9548 0.9524 0.9879
0.0073 83.0 7968 0.0588 0.9008 0.9397 0.9198 116 0.9470 0.9051 0.9256 158 0.984 0.9919 0.9880 124 0.9446 0.9422 0.9434 0.9876
0.0083 84.0 8064 0.0650 0.9008 0.9397 0.9198 116 0.9474 0.9114 0.9290 158 0.984 0.9919 0.9880 124 0.9447 0.9447 0.9447 0.9874
0.0075 85.0 8160 0.0645 0.9153 0.9310 0.9231 116 0.9416 0.9177 0.9295 158 0.984 0.9919 0.9880 124 0.9471 0.9447 0.9459 0.9876
0.0083 86.0 8256 0.0633 0.9083 0.9397 0.9237 116 0.9367 0.9367 0.9367 158 0.984 0.9919 0.9880 124 0.9429 0.9548 0.9488 0.9874
0.0074 87.0 8352 0.0628 0.8934 0.9397 0.9160 116 0.9542 0.9241 0.9389 158 0.984 0.9919 0.9880 124 0.945 0.9497 0.9474 0.9874
0.0077 88.0 8448 0.0651 0.8943 0.9483 0.9205 116 0.9542 0.9241 0.9389 158 0.984 0.9919 0.9880 124 0.9451 0.9523 0.9487 0.9876
0.0064 89.0 8544 0.0627 0.9 0.9310 0.9153 116 0.9484 0.9304 0.9393 158 0.984 0.9919 0.9880 124 0.945 0.9497 0.9474 0.9879
0.0072 90.0 8640 0.0626 0.9083 0.9397 0.9237 116 0.9542 0.9241 0.9389 158 0.984 0.9919 0.9880 124 0.9497 0.9497 0.9497 0.9882
0.0065 91.0 8736 0.0647 0.8943 0.9483 0.9205 116 0.9477 0.9177 0.9325 158 0.984 0.9919 0.9880 124 0.9426 0.9497 0.9462 0.9871
0.0064 92.0 8832 0.0628 0.9 0.9310 0.9153 116 0.9484 0.9304 0.9393 158 0.984 0.9919 0.9880 124 0.945 0.9497 0.9474 0.9879
0.0071 93.0 8928 0.0638 0.9016 0.9483 0.9244 116 0.9545 0.9304 0.9423 158 0.984 0.9919 0.9880 124 0.9476 0.9548 0.9512 0.9879
0.0065 94.0 9024 0.0648 0.9008 0.9397 0.9198 116 0.9545 0.9304 0.9423 158 0.984 0.9919 0.9880 124 0.9475 0.9523 0.9499 0.9879
0.0078 95.0 9120 0.0646 0.9 0.9310 0.9153 116 0.9423 0.9304 0.9363 158 0.984 0.9919 0.9880 124 0.9426 0.9497 0.9462 0.9876
0.0067 96.0 9216 0.0638 0.9 0.9310 0.9153 116 0.9484 0.9304 0.9393 158 0.984 0.9919 0.9880 124 0.945 0.9497 0.9474 0.9882
0.0076 97.0 9312 0.0652 0.9008 0.9397 0.9198 116 0.9545 0.9304 0.9423 158 0.984 0.9919 0.9880 124 0.9475 0.9523 0.9499 0.9879
0.0075 98.0 9408 0.0646 0.9 0.9310 0.9153 116 0.9484 0.9304 0.9393 158 0.984 0.9919 0.9880 124 0.945 0.9497 0.9474 0.9879
0.007 99.0 9504 0.0648 0.9008 0.9397 0.9198 116 0.9484 0.9304 0.9393 158 0.984 0.9919 0.9880 124 0.9451 0.9523 0.9487 0.9879
0.0072 100.0 9600 0.0647 0.9008 0.9397 0.9198 116 0.9484 0.9304 0.9393 158 0.984 0.9919 0.9880 124 0.9451 0.9523 0.9487 0.9876
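
The per-type precision, recall, F1, and N columns above are entity-span metrics: token-level B-/I-/O predictions are first decoded into typed spans and then compared against gold spans exactly (seqeval-style), with N the gold span count per type (116/158/124 here). A minimal sketch of that decoding step, assuming the usual BIO tag names for the three entity types:

```python
def bio_to_spans(tags):
    """Decode a BIO tag sequence into (entity_type, start, end_exclusive) spans."""
    spans, start, etype = [], None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-") or (tag.startswith("I-") and etype != tag[2:]):
            # a B- tag, or an I- tag that doesn't continue the current entity,
            # closes any open span and opens a new one
            if etype is not None:
                spans.append((etype, start, i))
            start, etype = i, tag[2:]
        elif tag == "O":
            if etype is not None:
                spans.append((etype, start, i))
            start, etype = None, None
    if etype is not None:
        spans.append((etype, start, len(tags)))
    return spans

tags = ["B-PER", "I-PER", "O", "B-LOC", "O", "B-ORG", "I-ORG", "I-ORG"]
print(bio_to_spans(tags))
# [('PER', 0, 2), ('LOC', 3, 4), ('ORG', 5, 8)]
```

Entity precision is then the fraction of predicted spans that exactly match a gold span (same type and boundaries), and recall the fraction of gold spans recovered.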

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1
Model tree for apwic/nerui-pt-pl50-1

This model was fine-tuned from indolem/indobert-base-uncased.