Chatbot-009-3000
These models were trained on a dataset of 3,000 examples.
This model is a fine-tuned version of mistralai/Ministral-8B-Instruct-2410 on an unknown dataset. Its results on the evaluation set are reported in the training table below.
The following training and validation losses were recorded during fine-tuning:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 1.2576 | 0.8658 | 100 | 1.2716 |
| 1.0967 | 1.7273 | 200 | 1.0722 |
| 0.9321 | 2.5887 | 300 | 0.9199 |
| 0.755 | 3.4502 | 400 | 0.8018 |
| 0.6895 | 4.3117 | 500 | 0.7204 |
| 0.5723 | 5.1732 | 600 | 0.6567 |
| 0.5696 | 6.0346 | 700 | 0.6137 |
| 0.5127 | 6.9004 | 800 | 0.5841 |
| 0.4962 | 7.7619 | 900 | 0.5562 |
| 0.4982 | 8.6234 | 1000 | 0.5444 |
| 0.4259 | 9.4848 | 1100 | 0.5345 |
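The step and epoch columns above imply the training schedule. As a rough sanity check, assuming the collection's "3,000" refers to the number of training examples (an assumption, not stated in the card), the step/epoch ratio gives the effective batch size:

```python
# Rough consistency check of the training schedule in the results table.
# Assumes the "3000" in the collection description is the training-set size.
epoch_at_step_100 = 0.8658   # from the first table row
dataset_size = 3000          # assumed number of training examples

steps_per_epoch = 100 / epoch_at_step_100          # ~115.5 optimizer steps per epoch
effective_batch = dataset_size / steps_per_epoch   # ~26 examples per optimizer step

print(f"steps/epoch ~ {steps_per_epoch:.1f}, effective batch ~ {effective_batch:.0f}")
```

Under that assumption, the run uses roughly 115 optimizer steps per epoch, i.e. an effective batch size of about 26 (per-device batch size times gradient-accumulation steps).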
Base model: mistralai/Ministral-8B-Instruct-2410