This is a 5.0 bpw (bits per weight) / h8 (8-bit head) quantized version of huihui-ai/QwQ-32B-abliterated, produced with exllamav2 with this PR applied.
Base model: huihui-ai/QwQ-32B-abliterated
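As a rough sizing guide, the weight footprint of a quantization like this can be estimated from the parameter count and the bpw figure. The sketch below assumes a round 32e9 parameters for illustration (the exact count differs) and ignores the 8-bit head layers, KV cache, and activations, which add on top:

```python
# Rough weight-memory estimate for a ~32B-parameter model at 5.0 bpw.
# Assumption: 32e9 params is an illustrative round figure, not the exact count.
# The h8 (8-bit) head layers and runtime KV cache are ignored here.
params = 32e9
bpw = 5.0
weight_bytes = params * bpw / 8  # bits -> bytes

print(f"~{weight_bytes / 1e9:.1f} GB for weights")  # ~20.0 GB for weights
```

Actual VRAM use will be somewhat higher once the head layers, context cache, and framework overhead are included.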