please update to Mistral-Small-3.2-24B-Instruct-2506

#5
by celsowm - opened

Well, it's not yet supported for AWQ. VLMs seem to be a problem for the quantization tooling.
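
For context, this is roughly the AutoAWQ flow I would expect to work (a minimal sketch; the `quant_config` values are the library's usual defaults, not settings from this repo). It breaks on Mistral-Small-3.2 because AutoAWQ only maps text-only causal-LM architectures, not the multimodal one.

```python
# Minimal AWQ sketch (assumed standard AutoAWQ flow, not a working recipe)
from awq import AutoAWQForCausalLM
from transformers import AutoTokenizer

model_path = "mistralai/Mistral-Small-3.2-24B-Instruct-2506"
quant_config = {"zero_point": True, "q_group_size": 128, "w_bit": 4, "version": "GEMM"}

# This load step is where VLMs fail: the multimodal architecture
# (Mistral3ForConditionalGeneration) is not in AutoAWQ's model map.
model = AutoAWQForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)

model.quantize(tokenizer, quant_config=quant_config)
model.save_quantized(model_path.split("/")[-1] + "-AWQ")
```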

The FP8 route also failed for me when I tried llm-compressor.

Mistral support in llm-compressor is still experimental: https://github.com/vllm-project/llm-compressor/tree/main/experimental/mistral
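
For reference, here is a minimal sketch of the FP8 pass that failed for me, assuming llm-compressor's standard oneshot flow (the save path and the `ignore` list are my choices, not tested settings):

```python
# Minimal FP8-dynamic sketch with llm-compressor (untested on this model)
from transformers import AutoModelForCausalLM, AutoTokenizer
from llmcompressor import oneshot
from llmcompressor.modifiers.quantization import QuantizationModifier

MODEL_ID = "mistralai/Mistral-Small-3.2-24B-Instruct-2506"

# Loading a VLM as a plain causal LM is itself questionable;
# the experimental tree linked above exists for this reason.
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# FP8_DYNAMIC needs no calibration data: weights are quantized
# statically and activations dynamically at runtime.
recipe = QuantizationModifier(
    targets="Linear",
    scheme="FP8_DYNAMIC",
    ignore=["lm_head"],  # common practice; a VLM's vision tower may need more entries
)

oneshot(model=model, recipe=recipe)

save_dir = MODEL_ID.split("/")[-1] + "-FP8-Dynamic"
model.save_pretrained(save_dir)
tokenizer.save_pretrained(save_dir)
```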
