# Log-Analysis-Model-DistilBert

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0453
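
The card does not document the task head or label set, so the snippet below is only a minimal inference sketch: the `text-classification` pipeline task and the example log line are assumptions, not documented facts about this model.

```python
from transformers import pipeline

# Minimal inference sketch. ASSUMPTION: the model exposes a standard
# text-classification head; the actual task is not documented in this card.
classifier = pipeline(
    "text-classification",
    model="teoogherghi/Log-Analysis-Model-DistilBert",
)

# Hypothetical log line, purely for illustration.
print(classifier("ERROR: connection to database timed out after 30s"))
```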
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
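
Since neither the training script nor the dataset is documented, the following is only a sketch of how these hyperparameters map onto `TrainingArguments` in Transformers 4.40.2. The sequence-classification head, label count, and toy dataset are assumptions; the 500-step evaluation cadence is inferred from the results table below. Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the `Trainer` default, so it needs no explicit setting.

```python
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Reproduction sketch only. ASSUMPTIONS: a binary sequence-classification
# head and a toy stand-in dataset; the real training data is not documented.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2  # label count is an assumption
)

# Toy placeholder data, purely illustrative.
raw = Dataset.from_dict({
    "text": ["disk usage at 95%", "user login succeeded"],
    "label": [1, 0],
})
ds = raw.map(lambda ex: tokenizer(ex["text"], truncation=True), batched=True)

args = TrainingArguments(
    output_dir="log-analysis-distilbert",
    learning_rate=5e-5,                # from the list above
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    evaluation_strategy="steps",       # inferred from the 500-step eval cadence
    eval_steps=500,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=ds,
    eval_dataset=ds,   # placeholder; the real eval split is undocumented
    tokenizer=tokenizer,
)
trainer.train()
```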
### Training results

| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 0.0736 | 0.0982 | 500 | 0.0615 |
| 0.0561 | 0.1964 | 1000 | 0.0625 |
| 0.0547 | 0.2946 | 1500 | 0.0549 |
| 0.0655 | 0.3929 | 2000 | 0.0593 |
| 0.0605 | 0.4911 | 2500 | 0.0541 |
| 0.0739 | 0.5893 | 3000 | 0.0547 |
| 0.0474 | 0.6875 | 3500 | 0.0629 |
| 0.051 | 0.7857 | 4000 | 0.0563 |
| 0.0758 | 0.8839 | 4500 | 0.0607 |
| 0.0676 | 0.9821 | 5000 | 0.0509 |
| 0.0645 | 1.0803 | 5500 | 0.0564 |
| 0.0531 | 1.1786 | 6000 | 0.0561 |
| 0.0409 | 1.2768 | 6500 | 0.0596 |
| 0.0297 | 1.3750 | 7000 | 0.0703 |
| 0.058 | 1.4732 | 7500 | 0.0613 |
| 0.0486 | 1.5714 | 8000 | 0.0532 |
| 0.0459 | 1.6696 | 8500 | 0.0599 |
| 0.0846 | 1.7678 | 9000 | 0.0583 |
| 0.0586 | 1.8660 | 9500 | 0.0560 |
| 0.099 | 1.9643 | 10000 | 0.0503 |
| 0.0576 | 2.0625 | 10500 | 0.0573 |
| 0.049 | 2.1607 | 11000 | 0.0505 |
| 0.0489 | 2.2589 | 11500 | 0.0490 |
| 0.0611 | 2.3571 | 12000 | 0.0494 |
| 0.056 | 2.4553 | 12500 | 0.0476 |
| 0.03 | 2.5535 | 13000 | 0.0540 |
| 0.0536 | 2.6517 | 13500 | 0.0478 |
| 0.0752 | 2.7500 | 14000 | 0.0521 |
| 0.0476 | 2.8482 | 14500 | 0.0590 |
| 0.0402 | 2.9464 | 15000 | 0.0601 |
| 0.041 | 3.0446 | 15500 | 0.0520 |
| 0.053 | 3.1428 | 16000 | 0.0480 |
| 0.0315 | 3.2410 | 16500 | 0.0494 |
| 0.0326 | 3.3392 | 17000 | 0.0511 |
| 0.044 | 3.4374 | 17500 | 0.0520 |
| 0.0681 | 3.5357 | 18000 | 0.0467 |
| 0.0406 | 3.6339 | 18500 | 0.0479 |
| 0.0505 | 3.7321 | 19000 | 0.0480 |
| 0.0539 | 3.8303 | 19500 | 0.0453 |
| 0.025 | 3.9285 | 20000 | 0.0504 |
| 0.0598 | 4.0267 | 20500 | 0.0477 |
| 0.039 | 4.1249 | 21000 | 0.0498 |
| 0.0474 | 4.2231 | 21500 | 0.0494 |
| 0.037 | 4.3214 | 22000 | 0.0489 |
| 0.0303 | 4.4196 | 22500 | 0.0503 |
| 0.0545 | 4.5178 | 23000 | 0.0485 |
| 0.0466 | 4.6160 | 23500 | 0.0484 |
| 0.0461 | 4.7142 | 24000 | 0.0478 |
| 0.0478 | 4.8124 | 24500 | 0.0478 |
| 0.0473 | 4.9106 | 25000 | 0.0477 |
### Framework versions
- Transformers 4.40.2
- Pytorch 2.3.0+cpu
- Datasets 2.19.1
- Tokenizers 0.19.1
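
For exact reproduction, it may help to assert these pinned versions at runtime. A small sanity-check sketch follows; since `+cpu` is a local build tag, only the version prefix of the PyTorch release is checked.

```python
import datasets, tokenizers, torch, transformers

# Version pins from this card; relax these if you intentionally upgrade.
assert transformers.__version__ == "4.40.2"
assert torch.__version__.startswith("2.3.0")  # card lists 2.3.0+cpu
assert datasets.__version__ == "2.19.1"
assert tokenizers.__version__ == "0.19.1"
print("environment matches the model card pins")
```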