---
base_model:
- Nitral-AI/Violet_Magcap-12B
base_model_relation: quantized
tags:
- exl3
license: other
---
## Quantized using the default exllamav3 (0.0.4) quantization process.
- Original model: [Nitral-AI/Violet_Magcap-12B](https://huggingface.co/Nitral-AI/Violet_Magcap-12B) - refer to it for more details on the model.
- exllamav3: https://github.com/turboderp-org/exllamav3
---
EXL3 quants available:
- 3.5bpw, 4.0bpw, 5bpw, 6.0bpw
- Go to "Files and versions" and switch the branch from "main" to the quant you want.
- Download command example: `git clone -b 4.0bpw https://huggingface.co/s1arsky/Violet_Magcap-12B-EXL3`
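
If you prefer not to clone a branch with git, the same quant can also be fetched through huggingface_hub. This is a minimal sketch, not part of the original instructions: it assumes the quant branches are named as listed above, and the local directory name is just an example.

```python
# Minimal sketch: fetch the 4.0bpw quant branch via huggingface_hub.
# The revision must match one of the quant branch names listed above;
# local_dir is only an example path.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="s1arsky/Violet_Magcap-12B-EXL3",
    revision="4.0bpw",  # quant branch to download
    local_dir="Violet_Magcap-12B-EXL3-4.0bpw",
)
```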