Chatbot-009-3000 Collection
These models were trained on a dataset of 3,000 examples.
This model is a fine-tuned version of meta-llama/Llama-3.1-8B-Instruct (the fine-tuning dataset is not named in the card metadata). It reaches a final validation loss of 0.2833 on the evaluation set (see the training results table below).
Model description: More information needed
Intended uses & limitations: More information needed
Training and evaluation data: More information needed
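Since usage is not documented above, here is a minimal sketch of how a fine-tune like this is typically loaded and queried with the transformers library. The repository id below is a placeholder taken from the collection name, not a confirmed model id, and the dtype and generation settings are illustrative assumptions.

```python
# Minimal usage sketch (assumptions: repo id, bf16 weights, accelerate installed for device_map).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/Chatbot-009-3000"  # placeholder, replace with the actual Hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 8B parameters; bf16 keeps memory use reasonable
    device_map="auto",
)

# Llama 3.1 Instruct fine-tunes expect the chat template applied by the tokenizer.
messages = [{"role": "user", "content": "Hola, ¿puedes ayudarme?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```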
Training hyperparameters: More information needed

Training results:

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.5365 | 0.9927 | 102 | 0.5637 |
| 0.434 | 1.9927 | 204 | 0.4885 |
| 0.4591 | 2.9927 | 306 | 0.4375 |
| 0.4113 | 3.9927 | 408 | 0.3951 |
| 0.3493 | 4.9927 | 510 | 0.3581 |
| 0.3255 | 5.9927 | 612 | 0.3291 |
| 0.2947 | 6.9927 | 714 | 0.3083 |
| 0.2668 | 7.9927 | 816 | 0.2944 |
| 0.2583 | 8.9927 | 918 | 0.2859 |
| 0.2388 | 9.9927 | 1020 | 0.2833 |
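For intuition, the cross-entropy validation losses above can be converted to perplexity (the exponential of the loss); the short sketch below does this for the first and last epoch. The numbers are read directly from the table, nothing else is assumed.

```python
import math

# Validation losses taken from the training results table above.
val_loss_epoch_1 = 0.5637   # after ~1 epoch (step 102)
val_loss_epoch_10 = 0.2833  # after ~10 epochs (step 1020)

# Perplexity = exp(mean cross-entropy loss).
print(f"perplexity after epoch 1:  {math.exp(val_loss_epoch_1):.2f}")   # ≈ 1.76
print(f"perplexity after epoch 10: {math.exp(val_loss_epoch_10):.2f}")  # ≈ 1.33
```

The validation loss decreases at every epoch, with the largest improvements in the first few epochs and a flattening curve by epoch 10.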
Base model: meta-llama/Llama-3.1-8B