Anticipated Availability of GPTQModel Format Models (W4A16/W8A16)
#22
by X-SZM - opened
While quantization formats such as GGUF and AWQ are already widely available in the community, models in GPTQModel formats like W4A16 and W8A16 remain notably scarce. These formats offer strong quantization quality with comparatively low precision loss. However, producing them requires calibration data, a step that is difficult for most individual users to carry out. It would therefore be valuable if organized community groups shared models in W4A16 and W8A16 formats; such contributions would be very welcome.
X-SZM changed discussion title from "Anticipated Availability of LLM-Compressor Format Models (W4A16/W8A16)" to "Anticipated Availability of GPTQModel Format Models (W4A16/W8A16)"