Mistral-7B-v0.1 4-bit quantized MLX
