medgemma-27b-it-fp8-static / quantization_config.json
{
"quantization": "fp8",
"quantization_method": "static",
"original_model": "google/medgemma-27b-text-it",
"quantization_library": "llm-compressor",
"notes": "Quantized using medical-specific calibration data"
}
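
For reference, a minimal sketch of how an FP8 static checkpoint like this can be produced with llm-compressor. This is not the exact script used for this upload: the calibration dataset name, sample count, and sequence length below are placeholders standing in for the medical-specific calibration data mentioned in the notes.

```python
# Sketch: FP8 static quantization of google/medgemma-27b-text-it with llm-compressor.
# Dataset name, num_calibration_samples, and max_seq_length are assumed placeholders.
from transformers import AutoModelForCausalLM, AutoTokenizer
from llmcompressor.modifiers.quantization import QuantizationModifier
from llmcompressor.transformers import oneshot  # newer releases: from llmcompressor import oneshot

MODEL_ID = "google/medgemma-27b-text-it"
SAVE_DIR = "medgemma-27b-it-fp8-static"

model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# scheme="FP8" uses static (per-tensor) activation scales, unlike "FP8_DYNAMIC";
# this is why calibration data is required.
recipe = QuantizationModifier(targets="Linear", scheme="FP8", ignore=["lm_head"])

oneshot(
    model=model,
    dataset="your_medical_calibration_dataset",  # placeholder, not the actual dataset
    recipe=recipe,
    max_seq_length=2048,          # assumed value
    num_calibration_samples=512,  # assumed value
)

# Save weights in compressed (FP8) format alongside the tokenizer.
model.save_pretrained(SAVE_DIR, save_compressed=True)
tokenizer.save_pretrained(SAVE_DIR)
```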