Based on https://huggingface.co/microsoft/Phi-3-mini-instruct

Converted to ONNX using https://github.com/microsoft/onnxruntime-genai

Command used:

```sh
python -m onnxruntime_genai.models.builder \
  -m microsoft/Phi-3-mini-instruct \
  -o Phi-3-mini-instruct-onnx \
  -e webgpu \
  -c cache-dir \
  -p int4 \
  --extra_options int4_block_size=32 int4_accuracy_level=4
```
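For reference, a minimal sketch of loading the exported folder with the onnxruntime-genai Python API. The directory name matches the `-o` output above; the prompt text and `max_length` are illustrative, the exact `GeneratorParams`/`Generator` calls differ slightly between onnxruntime-genai releases, and running this WebGPU-targeted export locally assumes an execution provider supported by your installed onnxruntime-genai build.

```python
# Sketch only: stream a completion from the exported model with onnxruntime-genai.
# API names follow the published Phi-3 ONNX examples and may vary by release.
import onnxruntime_genai as og

model = og.Model("Phi-3-mini-instruct-onnx")   # folder written by the builder command above
tokenizer = og.Tokenizer(model)
tokenizer_stream = tokenizer.create_stream()

# Phi-3 chat template; the question itself is just an example.
prompt = "<|user|>\nWhat is WebGPU?<|end|>\n<|assistant|>"
input_tokens = tokenizer.encode(prompt)

params = og.GeneratorParams(model)
params.set_search_options(max_length=256)
params.input_ids = input_tokens                # newer releases use generator.append_tokens(...)

generator = og.Generator(model, params)
while not generator.is_done():
    generator.compute_logits()                 # dropped in newer releases
    generator.generate_next_token()
    print(tokenizer_stream.decode(generator.get_next_tokens()[0]), end="", flush=True)
print()
```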
