ONNX model of intfloat/multilingual-e5-large-instruct uses unsupported "ONNX ML opset" 5 - can't load

#22
by davidsitsky - opened

Many thanks for creating an ONNX version of this model. Unfortunately, unlike your other multilingual E5 ONNX models, this one seems to have been created with an unsupported "ONNX ML opset 5". This is the error I get when trying to load the model:

failed:/onnxruntime_src/onnxruntime/core/graph/model_load_utils.h:46 void onnxruntime::model_load_utils::ValidateOpsetForDomain(const std::unordered_map<std::__cxx11::basic_string<char>, int>&, const onnxruntime::logging::Logger&, bool, const std::string&, int) ONNX Runtime only *guarantees* support for models stamped with official released onnx opset versions. Opset 5 is under development and support for this is limited. The operator schemas and or other functionality may change before next ONNX release and in this case ONNX Runtime will not guarantee backward compatibility. Current official support for domain ai.onnx.ml is till opset 4.
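For context, the error is raised as soon as an inference session is created over the file. A minimal Python sketch that reproduces the same validation step with onnxruntime (the local file name "model.onnx" is an assumption):

```python
# Minimal sketch: creating the session triggers the opset validation
# that produces the error above. "model.onnx" is an assumed local path.
import onnxruntime as ort

session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
```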

According to https://onnxruntime.ai/docs/reference/compatibility.html (at the time of writing), the latest version of ONNX Runtime is 1.18, and the highest ai.onnx.ml opset it supports is 4.
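One way to confirm what a given file is stamped with, without running it, is to read its opset imports with the onnx Python package. A small sketch, again assuming the file has been downloaded locally as "model.onnx":

```python
# Sketch: print the opset version declared for each domain in the model.
# Assumes the ONNX file has been downloaded locally as "model.onnx".
import onnx

model = onnx.load("model.onnx")
for opset in model.opset_import:
    # An empty domain string means the default "ai.onnx" domain.
    print(opset.domain or "ai.onnx", opset.version)
```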

Could the model be re-uploaded with the correct opset levels, please? Otherwise we can't use it in production.

Thank you - and apologies for the inconvenience.

@intfloat - any ideas on this?

Owner

Hi, sorry, I am not an expert on ONNX; this export was done automatically by https://huggingface.co/spaces/onnx/export.

Could you fork this repository and replace it with your own ONNX version?
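For anyone taking the fork-and-replace route, one possible way to produce a fresh export is Hugging Face Optimum. This is only a hedged sketch: the output directory, task name, and opset value are assumptions, and the opset argument pins the default ai.onnx opset rather than the ai.onnx.ml domain specifically.

```python
# Hedged sketch of re-exporting the model with Hugging Face Optimum.
# Output directory, task, and opset value are assumptions, not what
# the original repository used.
from optimum.exporters.onnx import main_export

main_export(
    "intfloat/multilingual-e5-large-instruct",
    output="e5-onnx-export",      # assumed local output directory
    task="feature-extraction",    # embedding model, so feature extraction
    opset=17,                     # an officially released ai.onnx opset
)
```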

My apologies... I think I got my wires crossed here. The DJL model zoo is actually at fault: it is not using these files directly, but its own conversion, which seems to have this issue.

davidsitsky changed discussion status to closed
