Model Card for mistral7b-v0.3-ultrachat200k

Fine-tuned from Mistral-7B-v0.3 using LoRA (r=32) for 1 epoch on 208k chat examples from HuggingFaceH4/ultrachat_200k, with max_seq_len 16384. A minimal setup sketch follows below.
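The following sketch illustrates that setup, assuming the base checkpoint is mistralai/Mistral-7B-v0.3 and standard peft/transformers APIs; lora_alpha, dropout, and target modules are illustrative guesses rather than the card's exact recipe, and the training loop itself (1 epoch at max_seq_len 16384, e.g. with a trainer such as trl's SFTTrainer) is omitted.

```python
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "mistralai/Mistral-7B-v0.3"  # assumed base checkpoint

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)

# LoRA adapter with rank 32, as stated above; alpha, dropout, and target
# modules are illustrative defaults, not the exact recipe used for this model.
lora_config = LoraConfig(
    r=32,
    lora_alpha=64,
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable

# The SFT split of ultrachat_200k holds ~208k multi-turn chats in "messages".
dataset = load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft")
print(len(dataset))
```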

Safetensors
Model size: 7.25B params
Tensor type: BF16
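For inference, a minimal loading sketch, assuming the LoRA weights are merged into the published 7.25B-parameter Safetensors checkpoint, that the repo id below points at this model, and that the tokenizer ships a chat template:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistral7b-v0.3-ultrachat200k"  # assumed local path or Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the BF16 tensor type listed above
    device_map="auto",           # requires accelerate
)

messages = [{"role": "user", "content": "Summarize LoRA fine-tuning in two sentences."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```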