Chatbot-009-3000
Part of a collection of 7 models. These models were trained on a dataset of 3000.
This model is a fine-tuned version of mistralai/Mistral-Small-24B-Instruct-2501 on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 0.4463
Model description: More information needed.

Intended uses & limitations: More information needed.

Training and evaluation data: More information needed.
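The card ships without a usage section, so the following is a minimal inference sketch with the Hugging Face transformers library. Two assumptions are made: the Hub repo id is taken from the card title (substitute the real checkpoint id), and the chat template is inherited unchanged from the Instruct base model.

```python
# Assumed repo id, taken from the card title -- replace with the actual Hub id.
MODEL_ID = "Chatbot-009-3000"


def chat(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a reply using the checkpoint's own chat template.

    transformers/torch are imported lazily because loading a 24B-parameter
    model is a heavy operation best kept behind a function call.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer  # lazy import

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, device_map="auto", torch_dtype="auto"
    )
    # Wrap the user turn in the model's chat template and add the
    # generation prompt so the model answers as the assistant.
    messages = [{"role": "user", "content": prompt}]
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    out = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True)


if __name__ == "__main__":
    print(chat("Hola, ¿cómo estás?"))
```

Running `chat()` downloads the full checkpoint, so a GPU with sufficient memory (or quantized loading) is expected for a 24B model.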
The training hyperparameters were not preserved in this card. Training results:
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| 1.2282 | 0.8658 | 100 | 1.2312 |
| 1.0274 | 1.7273 | 200 | 1.0113 |
| 0.7934 | 2.5887 | 300 | 0.7905 |
| 0.6052 | 3.4502 | 400 | 0.6743 |
| 0.5425 | 4.3117 | 500 | 0.5958 |
| 0.4601 | 5.1732 | 600 | 0.5434 |
| 0.4790 | 6.0346 | 700 | 0.5061 |
| 0.4181 | 6.9004 | 800 | 0.4801 |
| 0.4051 | 7.7619 | 900 | 0.4624 |
| 0.3994 | 8.6234 | 1000 | 0.4534 |
| 0.3459 | 9.4848 | 1100 | 0.4463 |
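The validation loss in the table decreases at every logged checkpoint. A short self-contained script re-tabulates the card's own numbers and verifies that trend (no assumptions beyond the values printed above):

```python
# Training log copied from the table above: (epoch, step, train_loss, val_loss).
log = [
    (0.8658, 100, 1.2282, 1.2312),
    (1.7273, 200, 1.0274, 1.0113),
    (2.5887, 300, 0.7934, 0.7905),
    (3.4502, 400, 0.6052, 0.6743),
    (4.3117, 500, 0.5425, 0.5958),
    (5.1732, 600, 0.4601, 0.5434),
    (6.0346, 700, 0.4790, 0.5061),
    (6.9004, 800, 0.4181, 0.4801),
    (7.7619, 900, 0.4051, 0.4624),
    (8.6234, 1000, 0.3994, 0.4534),
    (9.4848, 1100, 0.3459, 0.4463),
]

val = [row[3] for row in log]
# Validation loss improves between every pair of consecutive evaluations.
drops = [round(a - b, 4) for a, b in zip(val, val[1:])]
assert all(d > 0 for d in drops)
print(f"final val loss: {val[-1]}, total improvement: {val[0] - val[-1]:.4f}")
# → final val loss: 0.4463, total improvement: 0.7849
```

The gap between the final training loss (0.3459) and validation loss (0.4463) is modest, which is consistent with the run being stopped before heavy overfitting.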
Base model: mistralai/Mistral-Small-24B-Base-2501