---
base_model: Steelskull/L3.3-Nevoria-R1-70b
library_name: transformers
quantized_by: Sinensis
pipeline_tag: text-generation
tags:
- mergekit
- merge
---

## AWQ quantization of [Steelskull/L3.3-Nevoria-R1-70b](https://huggingface.co/Steelskull/L3.3-Nevoria-R1-70b)

    "quantization_config": {
        "bits": 4,
        "group_size": 128,
        "modules_to_not_convert": null,
        "quant_method": "awq",
        "version": "gemm",
        "zero_point": true
    }
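
A minimal loading sketch, assuming the `autoawq` package is installed alongside `transformers`. The repository id below is a placeholder for this quant and should be replaced with the actual path:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id for this AWQ quant; substitute the real repository path.
model_id = "Sinensis/L3.3-Nevoria-R1-70b-AWQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # AWQ GEMM kernels run in fp16
    device_map="auto",          # spread the 70B weights across available GPUs
)

prompt = "Write a short scene set on a storm-wrecked coast."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```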