---
license: apache-2.0
base_model:
- Qwen/Qwen3-8B
tags:
- autoround
---
This is Qwen/Qwen3-8B quantized to 2-bit with AutoRound (symmetric quantization, GPTQ format). The model was created, tested, and evaluated by The Kaitchup. It is compatible with vLLM and Transformers.
More details in this article: How Well Does Qwen3 Handle 4-bit and 2-bit Quantization?
- Developed by: The Kaitchup
- License: Apache 2.0
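Below is a minimal usage sketch for loading the checkpoint with Transformers. The repository ID, prompt, and generation settings are placeholders rather than part of this card; it assumes `transformers`, `torch`, `accelerate`, and a GPTQ-compatible kernel backend are installed.

```python
# Minimal sketch: loading the 2-bit GPTQ-format checkpoint with Transformers.
# The model ID below is a placeholder for this repository's ID.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kaitchup/Qwen3-8B-AutoRound-GPTQ-2bit"  # hypothetical repository ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # spread layers across available GPUs via accelerate
    torch_dtype="auto",  # let Transformers pick the compute dtype
)

prompt = "Explain 2-bit quantization in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the model is also compatible with vLLM, the same repository ID can be passed to `vllm serve` or to the `LLM(model=...)` Python entry point, which loads GPTQ-format checkpoints directly.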
## How to Support My Work
Subscribe to The Kaitchup. Your subscription helps me continue quantizing and evaluating models for free.