This model was converted to the OpenVINO IR format using the following command:

```shell
optimum-cli export openvino -m "input/path" --task text-generation-with-past --weight-format int4 --ratio 1 --group-size 128 --dataset wikitext2 --awq --scale-estimation --sensitivity-metric weight_quantization_error "output/path"
```
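To illustrate what the `--weight-format int4` and `--group-size 128` flags imply, the sketch below shows group-wise asymmetric int4 weight quantization in plain NumPy: each group of 128 weights gets its own scale and zero point, values are rounded to 16 levels, and the round-trip error can be measured. This is a simplified illustration under stated assumptions, not NNCF's actual implementation (which additionally applies AWQ and scale estimation, per the flags above); all function and variable names here are hypothetical.

```python
import numpy as np

def quantize_int4_asym(weights: np.ndarray, group_size: int = 128):
    """Asymmetric int4 quantization with per-group scale/zero-point (illustrative)."""
    # Split the flattened weights into groups of `group_size`.
    w = weights.reshape(-1, group_size)
    w_min = w.min(axis=1, keepdims=True)
    w_max = w.max(axis=1, keepdims=True)
    # int4 asymmetric: 16 levels, quantized range 0..15.
    scale = (w_max - w_min) / 15.0
    scale = np.where(scale == 0, 1.0, scale)  # guard against constant groups
    zero_point = np.round(-w_min / scale)
    q = np.clip(np.round(w / scale + zero_point), 0, 15)
    # Dequantize to measure the round-trip (weight quantization) error.
    deq = (q - zero_point) * scale
    return q.reshape(weights.shape), deq.reshape(weights.shape)

rng = np.random.default_rng(0)
w = rng.standard_normal((256, 128)).astype(np.float32)
q, deq = quantize_int4_asym(w)
max_err = float(np.abs(w - deq).max())
```

The `--sensitivity-metric weight_quantization_error` flag tells the exporter to rank layers by exactly this kind of round-trip error when deciding which layers tolerate int4 compression.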
Model tree for Echo9Zulu/Meta-Llama-3.1-8B-SurviveV3-int4_asym-awq-se-wqe-ov