The best checkpoint is the one at epoch 160.
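
A minimal usage sketch follows, assuming this repository is a PEFT (LoRA) adapter for `mistralai/Mistral-7B-Instruct-v0.3` and that the published weights correspond to the 160-epoch checkpoint mentioned above; the base model ID and the snippet itself are illustrative, not confirmed by the model card.

```python
# Sketch: load the adapter on top of its (assumed) base model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.3"  # assumed base model
adapter_id = "WHATX/30k-Mistral-7B-Instruct-v0.3-small_301"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Attach the adapter weights to the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)

# Quick generation check with the instruct chat template.
messages = [{"role": "user", "content": "Hello, what can you do?"}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```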
