Visualize in Weights & Biases

ModernBERT-base-2-contract-sections-classification-v4-50-512

This model is a fine-tuned version of answerdotai/ModernBERT-base on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3958
  • Accuracy Evaluate: 0.9377
  • Precision Evaluate: 0.9454
  • Recall Evaluate: 0.9361
  • F1 Evaluate: 0.9397
  • Accuracy Sklearn: 0.9377
  • Precision Sklearn: 0.9396
  • Recall Sklearn: 0.9377
  • F1 Sklearn: 0.9376
  • Acuracia Rotulo Objeto: 0.9814
  • Acuracia Rotulo Obrigacoes: 0.9630
  • Acuracia Rotulo Valor: 0.9026
  • Acuracia Rotulo Vigencia: 0.9711
  • Acuracia Rotulo Rescisao: 0.9391
  • Acuracia Rotulo Foro: 0.9962
  • Acuracia Rotulo Reajuste: 0.8932
  • Acuracia Rotulo Fiscalizacao: 0.8297
  • Acuracia Rotulo Publicacao: 0.9409
  • Acuracia Rotulo Pagamento: 0.8877
  • Acuracia Rotulo Casos Omissos: 0.9163
  • Acuracia Rotulo Sancoes: 0.9541
  • Acuracia Rotulo Dotacao Orcamentaria: 0.9945
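
For reference, here is a minimal inference sketch using the Transformers pipeline API. It assumes the checkpoint is published on the Hub under marcelovidigal/ModernBERT-base-2-contract-sections-classification-v4-50-512 and that the section labels are stored in the model config; the sample clause is illustrative only, and the 512-token truncation limit is inferred from the model name.

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint as a text-classification pipeline.
classifier = pipeline(
    "text-classification",
    model="marcelovidigal/ModernBERT-base-2-contract-sections-classification-v4-50-512",
)

# Illustrative (made-up) contract clause in Portuguese; any section text works.
clause = (
    "CLÁUSULA QUARTA - DO VALOR: o valor global do presente contrato "
    "é de R$ 100.000,00 (cem mil reais)."
)

# Truncation keeps long sections within the assumed 512-token maximum sequence length.
print(classifier(clause, truncation=True, max_length=512))
```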

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a TrainingArguments sketch follows the list:

  • learning_rate: 1e-06
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • num_epochs: 50
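
A minimal TrainingArguments sketch matching the hyperparameters above; only the listed values come from this card, while output_dir is a hypothetical placeholder and evaluation/save cadence is left at defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="modernbert-contract-sections",  # hypothetical path, not from the card
    learning_rate=1e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```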

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy Evaluate | Precision Evaluate | Recall Evaluate | F1 Evaluate | Accuracy Sklearn | Precision Sklearn | Recall Sklearn | F1 Sklearn | Acuracia Rotulo Objeto | Acuracia Rotulo Obrigacoes | Acuracia Rotulo Valor | Acuracia Rotulo Vigencia | Acuracia Rotulo Rescisao | Acuracia Rotulo Foro | Acuracia Rotulo Reajuste | Acuracia Rotulo Fiscalizacao | Acuracia Rotulo Publicacao | Acuracia Rotulo Pagamento | Acuracia Rotulo Casos Omissos | Acuracia Rotulo Sancoes | Acuracia Rotulo Dotacao Orcamentaria |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.5007 | 1.0 | 1000 | 0.8716 | 0.744 | 0.8071 | 0.7604 | 0.7666 | 0.744 | 0.7924 | 0.744 | 0.7473 | 0.9153 | 0.6835 | 0.4527 | 0.6299 | 0.8310 | 0.8808 | 0.7402 | 0.7413 | 0.9064 | 0.5942 | 0.8227 | 0.8899 | 0.7967 |
| 0.2031 | 2.0 | 2000 | 0.5793 | 0.8415 | 0.8590 | 0.8558 | 0.8516 | 0.8415 | 0.8525 | 0.8415 | 0.8398 | 0.9298 | 0.7879 | 0.5788 | 0.8504 | 0.9030 | 0.9885 | 0.8648 | 0.6625 | 0.9360 | 0.8804 | 0.8867 | 0.8899 | 0.9670 |
| 0.1306 | 3.0 | 3000 | 0.4607 | 0.8705 | 0.8683 | 0.8866 | 0.8735 | 0.8705 | 0.8762 | 0.8705 | 0.8708 | 0.8967 | 0.8199 | 0.7794 | 0.8898 | 0.7812 | 0.9962 | 0.9110 | 0.7855 | 0.9655 | 0.8877 | 0.9163 | 0.9358 | 0.9615 |
| 0.1043 | 4.0 | 4000 | 0.4157 | 0.896 | 0.9015 | 0.9058 | 0.9013 | 0.896 | 0.9001 | 0.896 | 0.8955 | 0.9607 | 0.8232 | 0.7536 | 0.9790 | 0.9252 | 0.9846 | 0.8790 | 0.7950 | 0.9064 | 0.9130 | 0.9064 | 0.9541 | 0.9945 |
| 0.0779 | 5.0 | 5000 | 0.4192 | 0.9008 | 0.9004 | 0.9124 | 0.9047 | 0.9008 | 0.9042 | 0.9008 | 0.9009 | 0.9731 | 0.7946 | 0.8711 | 0.9291 | 0.9252 | 0.9962 | 0.8932 | 0.7886 | 0.9557 | 0.8949 | 0.9113 | 0.9450 | 0.9835 |
| 0.0483 | 6.0 | 6000 | 0.5271 | 0.8992 | 0.9081 | 0.9149 | 0.9088 | 0.8992 | 0.9068 | 0.8992 | 0.8990 | 0.9793 | 0.7290 | 0.8567 | 0.9659 | 0.9252 | 0.9923 | 0.8932 | 0.8297 | 0.9951 | 0.9058 | 0.9064 | 0.9266 | 0.9890 |
| 0.0508 | 7.0 | 7000 | 0.4042 | 0.9087 | 0.9079 | 0.9202 | 0.9116 | 0.9087 | 0.9139 | 0.9087 | 0.9093 | 0.9731 | 0.8064 | 0.8596 | 0.9711 | 0.9169 | 0.9808 | 0.9004 | 0.8328 | 0.9606 | 0.8986 | 0.9064 | 0.9725 | 0.9835 |
| 0.0389 | 8.0 | 8000 | 0.3789 | 0.913 | 0.9114 | 0.9241 | 0.9154 | 0.913 | 0.9176 | 0.913 | 0.9133 | 0.9752 | 0.8131 | 0.8768 | 0.9738 | 0.9169 | 0.9923 | 0.8897 | 0.8328 | 0.9606 | 0.9094 | 0.9163 | 0.9725 | 0.9835 |
| 0.0251 | 9.0 | 9000 | 0.3430 | 0.929 | 0.9384 | 0.9312 | 0.9338 | 0.929 | 0.9311 | 0.929 | 0.9289 | 0.9855 | 0.9158 | 0.8768 | 0.9738 | 0.9335 | 0.9923 | 0.8790 | 0.8423 | 0.9754 | 0.8877 | 0.8966 | 0.9633 | 0.9835 |
| 0.0226 | 10.0 | 10000 | 0.4588 | 0.9073 | 0.9126 | 0.9215 | 0.9128 | 0.9073 | 0.9163 | 0.9073 | 0.9078 | 0.9917 | 0.7609 | 0.8711 | 0.9711 | 0.9197 | 0.9923 | 0.8648 | 0.8612 | 0.9655 | 0.9130 | 0.8916 | 0.9817 | 0.9945 |
| 0.021 | 11.0 | 11000 | 0.3318 | 0.9325 | 0.9357 | 0.9298 | 0.9316 | 0.9325 | 0.9341 | 0.9325 | 0.9322 | 0.9855 | 0.9529 | 0.8854 | 0.9711 | 0.9418 | 0.9923 | 0.8897 | 0.8328 | 0.9015 | 0.8949 | 0.9015 | 0.9541 | 0.9835 |
| 0.0138 | 12.0 | 12000 | 0.3269 | 0.939 | 0.9389 | 0.9399 | 0.9384 | 0.939 | 0.9403 | 0.939 | 0.9388 | 0.9835 | 0.9529 | 0.8911 | 0.9685 | 0.9446 | 0.9923 | 0.8897 | 0.8486 | 0.9803 | 0.8913 | 0.9113 | 0.9817 | 0.9835 |
| 0.0144 | 13.0 | 13000 | 0.3691 | 0.9327 | 0.9434 | 0.9340 | 0.9377 | 0.9327 | 0.9359 | 0.9327 | 0.9331 | 0.9897 | 0.9293 | 0.8596 | 0.9633 | 0.9391 | 0.9962 | 0.8897 | 0.8644 | 0.9803 | 0.8877 | 0.9113 | 0.9541 | 0.9780 |
| 0.0062 | 14.0 | 14000 | 0.3847 | 0.9287 | 0.9412 | 0.9271 | 0.9323 | 0.9287 | 0.9334 | 0.9287 | 0.9291 | 0.9897 | 0.9495 | 0.8911 | 0.9580 | 0.9197 | 0.9962 | 0.8754 | 0.8233 | 0.9113 | 0.8949 | 0.9015 | 0.9633 | 0.9780 |
| 0.0051 | 15.0 | 15000 | 0.3604 | 0.9335 | 0.9426 | 0.9313 | 0.9355 | 0.9335 | 0.9357 | 0.9335 | 0.9332 | 0.9835 | 0.9613 | 0.8797 | 0.9764 | 0.9363 | 0.9962 | 0.8826 | 0.8328 | 0.9064 | 0.8986 | 0.8867 | 0.9725 | 0.9945 |
| 0.0059 | 16.0 | 16000 | 0.3538 | 0.935 | 0.9421 | 0.9345 | 0.9372 | 0.935 | 0.9366 | 0.935 | 0.9347 | 0.9814 | 0.9545 | 0.8797 | 0.9711 | 0.9446 | 0.9962 | 0.9004 | 0.8202 | 0.9360 | 0.8949 | 0.9113 | 0.9633 | 0.9945 |
| 0.0047 | 17.0 | 17000 | 0.3679 | 0.9323 | 0.9331 | 0.9310 | 0.9304 | 0.9323 | 0.9343 | 0.9323 | 0.9319 | 0.9793 | 0.9646 | 0.8911 | 0.9711 | 0.9252 | 1.0 | 0.8790 | 0.8076 | 0.9163 | 0.8913 | 0.9163 | 0.9725 | 0.9890 |
| 0.0045 | 18.0 | 18000 | 0.3664 | 0.9363 | 0.9391 | 0.9381 | 0.9372 | 0.9363 | 0.9377 | 0.9363 | 0.9358 | 0.9835 | 0.9579 | 0.8682 | 0.9685 | 0.9335 | 1.0 | 0.8968 | 0.8139 | 0.9951 | 0.8949 | 0.9064 | 0.9817 | 0.9945 |
| 0.004 | 19.0 | 19000 | 0.3635 | 0.937 | 0.9382 | 0.9363 | 0.9359 | 0.937 | 0.9392 | 0.937 | 0.9369 | 0.9897 | 0.9512 | 0.9169 | 0.9659 | 0.9335 | 0.9962 | 0.8754 | 0.8423 | 0.9458 | 0.8804 | 0.9163 | 0.9633 | 0.9945 |
| 0.0024 | 20.0 | 20000 | 0.3885 | 0.9327 | 0.9427 | 0.9309 | 0.9353 | 0.9327 | 0.9363 | 0.9327 | 0.9329 | 0.9876 | 0.9461 | 0.8682 | 0.9738 | 0.9391 | 0.9962 | 0.9075 | 0.8549 | 0.9064 | 0.8768 | 0.8966 | 0.9541 | 0.9945 |
| 0.0017 | 21.0 | 21000 | 0.3883 | 0.936 | 0.9399 | 0.9341 | 0.9351 | 0.936 | 0.9391 | 0.936 | 0.9360 | 0.9876 | 0.9613 | 0.9169 | 0.9685 | 0.9224 | 0.9962 | 0.8968 | 0.8202 | 0.9212 | 0.8841 | 0.9163 | 0.9633 | 0.9890 |
| 0.0016 | 22.0 | 22000 | 0.3651 | 0.9355 | 0.9357 | 0.9337 | 0.9336 | 0.9355 | 0.9373 | 0.9355 | 0.9354 | 0.9814 | 0.9495 | 0.9083 | 0.9738 | 0.9335 | 0.9962 | 0.8897 | 0.8517 | 0.9212 | 0.8877 | 0.8966 | 0.9541 | 0.9945 |
| 0.0027 | 23.0 | 23000 | 0.3749 | 0.9357 | 0.9397 | 0.9344 | 0.9359 | 0.9357 | 0.9377 | 0.9357 | 0.9358 | 0.9814 | 0.9512 | 0.8883 | 0.9711 | 0.9363 | 0.9962 | 0.8932 | 0.8612 | 0.9212 | 0.8877 | 0.9113 | 0.9541 | 0.9945 |
| 0.0044 | 24.0 | 24000 | 0.3880 | 0.9335 | 0.9380 | 0.9325 | 0.9341 | 0.9335 | 0.9353 | 0.9335 | 0.9333 | 0.9814 | 0.9596 | 0.8825 | 0.9738 | 0.9197 | 0.9923 | 0.8968 | 0.8328 | 0.9261 | 0.8841 | 0.9163 | 0.9633 | 0.9945 |
| 0.0028 | 25.0 | 25000 | 0.3880 | 0.935 | 0.9398 | 0.9332 | 0.9352 | 0.935 | 0.9369 | 0.935 | 0.9348 | 0.9835 | 0.9579 | 0.8911 | 0.9711 | 0.9418 | 0.9962 | 0.8897 | 0.8360 | 0.9212 | 0.8841 | 0.9064 | 0.9633 | 0.9890 |
| 0.0024 | 26.0 | 26000 | 0.3963 | 0.9333 | 0.9366 | 0.9330 | 0.9332 | 0.9333 | 0.9362 | 0.9333 | 0.9334 | 0.9897 | 0.9343 | 0.9083 | 0.9580 | 0.9391 | 1.0 | 0.8826 | 0.8454 | 0.9360 | 0.8804 | 0.9015 | 0.9541 | 1.0 |
| 0.0042 | 27.0 | 27000 | 0.4256 | 0.9315 | 0.9319 | 0.9300 | 0.9281 | 0.9315 | 0.9362 | 0.9315 | 0.9318 | 0.9793 | 0.9529 | 0.9284 | 0.9711 | 0.9197 | 0.9962 | 0.8612 | 0.8076 | 0.9261 | 0.8877 | 0.9064 | 0.9541 | 1.0 |
| 0.0032 | 28.0 | 28000 | 0.3806 | 0.9363 | 0.9407 | 0.9345 | 0.9366 | 0.9363 | 0.9379 | 0.9363 | 0.9361 | 0.9835 | 0.9444 | 0.8968 | 0.9790 | 0.9501 | 0.9962 | 0.9004 | 0.8486 | 0.9360 | 0.8768 | 0.8916 | 0.9450 | 1.0 |
| 0.0021 | 29.0 | 29000 | 0.3779 | 0.9395 | 0.9475 | 0.9375 | 0.9414 | 0.9395 | 0.9412 | 0.9395 | 0.9393 | 0.9793 | 0.9596 | 0.9083 | 0.9711 | 0.9612 | 0.9962 | 0.8897 | 0.8328 | 0.9458 | 0.8877 | 0.9113 | 0.9450 | 1.0 |
| 0.002 | 30.0 | 30000 | 0.3918 | 0.9333 | 0.9394 | 0.9315 | 0.9339 | 0.9333 | 0.9356 | 0.9333 | 0.9330 | 0.9814 | 0.9697 | 0.8739 | 0.9711 | 0.9252 | 0.9962 | 0.9004 | 0.8170 | 0.9212 | 0.8841 | 0.9113 | 0.9633 | 0.9945 |
| 0.0022 | 31.0 | 31000 | 0.3868 | 0.9373 | 0.9436 | 0.9349 | 0.9382 | 0.9373 | 0.9392 | 0.9373 | 0.9372 | 0.9855 | 0.9613 | 0.9054 | 0.9711 | 0.9280 | 0.9962 | 0.8932 | 0.8423 | 0.9409 | 0.8841 | 0.9064 | 0.9450 | 0.9945 |
| 0.0017 | 32.0 | 32000 | 0.4050 | 0.9325 | 0.9417 | 0.9309 | 0.9351 | 0.9325 | 0.9344 | 0.9325 | 0.9322 | 0.9814 | 0.9545 | 0.8854 | 0.9738 | 0.9363 | 0.9962 | 0.8861 | 0.8233 | 0.9261 | 0.8877 | 0.8966 | 0.9541 | 1.0 |
| 0.001 | 33.0 | 33000 | 0.3841 | 0.9375 | 0.9461 | 0.9346 | 0.9393 | 0.9375 | 0.9393 | 0.9375 | 0.9373 | 0.9835 | 0.9613 | 0.9054 | 0.9738 | 0.9446 | 0.9962 | 0.8932 | 0.8360 | 0.9212 | 0.8841 | 0.9113 | 0.9450 | 0.9945 |
| 0.0019 | 34.0 | 34000 | 0.4004 | 0.9337 | 0.9401 | 0.9312 | 0.9341 | 0.9337 | 0.9362 | 0.9337 | 0.9335 | 0.9835 | 0.9596 | 0.8997 | 0.9711 | 0.9363 | 0.9962 | 0.8932 | 0.8139 | 0.9212 | 0.8841 | 0.9015 | 0.9450 | 1.0 |
| 0.0027 | 35.0 | 35000 | 0.3941 | 0.935 | 0.9407 | 0.9329 | 0.9353 | 0.935 | 0.9374 | 0.935 | 0.9349 | 0.9814 | 0.9562 | 0.9140 | 0.9711 | 0.9307 | 0.9962 | 0.8897 | 0.8328 | 0.9212 | 0.8841 | 0.9015 | 0.9541 | 0.9945 |
| 0.0 | 36.0 | 36000 | 0.3833 | 0.9395 | 0.9470 | 0.9373 | 0.9411 | 0.9395 | 0.9411 | 0.9395 | 0.9393 | 0.9814 | 0.9630 | 0.9083 | 0.9685 | 0.9501 | 0.9962 | 0.9146 | 0.8265 | 0.9409 | 0.8841 | 0.9113 | 0.9450 | 0.9945 |
| 0.001 | 37.0 | 37000 | 0.3958 | 0.9357 | 0.9441 | 0.9341 | 0.9379 | 0.9357 | 0.9376 | 0.9357 | 0.9355 | 0.9855 | 0.9613 | 0.8883 | 0.9711 | 0.9363 | 0.9962 | 0.9004 | 0.8297 | 0.9212 | 0.8841 | 0.9064 | 0.9633 | 1.0 |
| 0.0012 | 38.0 | 38000 | 0.3790 | 0.938 | 0.9421 | 0.9347 | 0.9372 | 0.938 | 0.9398 | 0.938 | 0.9378 | 0.9876 | 0.9613 | 0.9112 | 0.9685 | 0.9446 | 0.9962 | 0.9004 | 0.8391 | 0.9163 | 0.8804 | 0.9064 | 0.9450 | 0.9945 |
| 0.0008 | 39.0 | 39000 | 0.3849 | 0.9363 | 0.9431 | 0.9348 | 0.9379 | 0.9363 | 0.9378 | 0.9363 | 0.9360 | 0.9835 | 0.9596 | 0.8968 | 0.9711 | 0.9363 | 0.9962 | 0.8932 | 0.8360 | 0.9409 | 0.8841 | 0.8966 | 0.9633 | 0.9945 |
| 0.0023 | 40.0 | 40000 | 0.3834 | 0.9383 | 0.9418 | 0.9364 | 0.9378 | 0.9383 | 0.9399 | 0.9383 | 0.9380 | 0.9855 | 0.9646 | 0.8911 | 0.9711 | 0.9418 | 0.9962 | 0.9181 | 0.8265 | 0.9212 | 0.8877 | 0.9113 | 0.9633 | 0.9945 |
| 0.0013 | 41.0 | 41000 | 0.3837 | 0.9387 | 0.9464 | 0.9364 | 0.9404 | 0.9387 | 0.9404 | 0.9387 | 0.9386 | 0.9835 | 0.9630 | 0.9054 | 0.9685 | 0.9474 | 0.9962 | 0.9039 | 0.8297 | 0.9409 | 0.8841 | 0.9113 | 0.9450 | 0.9945 |
| 0.0007 | 42.0 | 42000 | 0.3959 | 0.9353 | 0.9414 | 0.9341 | 0.9364 | 0.9353 | 0.9372 | 0.9353 | 0.9350 | 0.9814 | 0.9630 | 0.8854 | 0.9711 | 0.9335 | 0.9962 | 0.9004 | 0.8297 | 0.9212 | 0.8877 | 0.9064 | 0.9725 | 0.9945 |
| 0.0006 | 43.0 | 43000 | 0.3876 | 0.938 | 0.9454 | 0.9362 | 0.9397 | 0.938 | 0.9398 | 0.938 | 0.9378 | 0.9835 | 0.9630 | 0.9026 | 0.9711 | 0.9418 | 0.9962 | 0.8968 | 0.8265 | 0.9409 | 0.8877 | 0.9113 | 0.9541 | 0.9945 |
| 0.0016 | 44.0 | 44000 | 0.3886 | 0.936 | 0.9432 | 0.9347 | 0.9378 | 0.936 | 0.9378 | 0.936 | 0.9358 | 0.9814 | 0.9613 | 0.8997 | 0.9711 | 0.9335 | 0.9962 | 0.8932 | 0.8328 | 0.9212 | 0.8877 | 0.9064 | 0.9725 | 0.9945 |
| 0.0016 | 45.0 | 45000 | 0.3989 | 0.9375 | 0.9457 | 0.9354 | 0.9394 | 0.9375 | 0.9395 | 0.9375 | 0.9373 | 0.9855 | 0.9613 | 0.9112 | 0.9685 | 0.9391 | 0.9962 | 0.9004 | 0.8170 | 0.9409 | 0.8841 | 0.9113 | 0.9450 | 1.0 |
| 0.0 | 46.0 | 46000 | 0.3869 | 0.9383 | 0.9458 | 0.9363 | 0.9400 | 0.9383 | 0.9399 | 0.9383 | 0.9381 | 0.9835 | 0.9630 | 0.9054 | 0.9711 | 0.9391 | 0.9962 | 0.9004 | 0.8328 | 0.9409 | 0.8841 | 0.9064 | 0.9541 | 0.9945 |
| 0.0009 | 47.0 | 47000 | 0.3955 | 0.937 | 0.9448 | 0.9353 | 0.9389 | 0.937 | 0.9389 | 0.937 | 0.9368 | 0.9814 | 0.9630 | 0.8997 | 0.9711 | 0.9391 | 0.9962 | 0.8932 | 0.8265 | 0.9409 | 0.8877 | 0.9113 | 0.9541 | 0.9945 |
| 0.0019 | 48.0 | 48000 | 0.3959 | 0.9375 | 0.9452 | 0.9359 | 0.9395 | 0.9375 | 0.9393 | 0.9375 | 0.9373 | 0.9814 | 0.9630 | 0.9026 | 0.9711 | 0.9391 | 0.9962 | 0.8932 | 0.8265 | 0.9409 | 0.8877 | 0.9163 | 0.9541 | 0.9945 |
| 0.0011 | 49.0 | 49000 | 0.3951 | 0.9377 | 0.9454 | 0.9361 | 0.9397 | 0.9377 | 0.9396 | 0.9377 | 0.9376 | 0.9814 | 0.9630 | 0.9026 | 0.9711 | 0.9391 | 0.9962 | 0.8932 | 0.8297 | 0.9409 | 0.8877 | 0.9163 | 0.9541 | 0.9945 |
| 0.0006 | 50.0 | 50000 | 0.3958 | 0.9377 | 0.9454 | 0.9361 | 0.9397 | 0.9377 | 0.9396 | 0.9377 | 0.9376 | 0.9814 | 0.9630 | 0.9026 | 0.9711 | 0.9391 | 0.9962 | 0.8932 | 0.8297 | 0.9409 | 0.8877 | 0.9163 | 0.9541 | 0.9945 |
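
The "Sklearn" columns and the per-label (Acuracia Rotulo) accuracies in the table above could be produced by a compute_metrics callback along these lines; the weighted averaging and the per-class-recall reading of "label accuracy" are assumptions, since the card does not state how they were computed.

```python
import numpy as np
from sklearn.metrics import accuracy_score, confusion_matrix, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Sketch of the 'Sklearn' metrics and per-label accuracies reported above."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Averaging mode is an assumption; the card does not say which average was used.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0
    )
    cm = confusion_matrix(labels, preds)
    # Per-label accuracy interpreted here as per-class recall (diagonal over row totals).
    per_label_accuracy = cm.diagonal() / cm.sum(axis=1)
    return {
        "accuracy_sklearn": accuracy_score(labels, preds),
        "precision_sklearn": precision,
        "recall_sklearn": recall,
        "f1_sklearn": f1,
        "per_label_accuracy": per_label_accuracy.tolist(),
    }
```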

Framework versions

  • Transformers 4.49.0
  • Pytorch 2.6.0+cu124
  • Datasets 3.3.0
  • Tokenizers 0.21.0
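
A quick way to check that a local environment matches the versions pinned above (the expected version strings are the ones listed in this card):

```python
# Print installed versions to compare against the ones pinned above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers", transformers.__version__)  # expected 4.49.0
print("PyTorch", torch.__version__)              # expected 2.6.0+cu124
print("Datasets", datasets.__version__)          # expected 3.3.0
print("Tokenizers", tokenizers.__version__)      # expected 0.21.0
```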