---
license: apache-2.0
license_link: https://huggingface.co/Qwen/Qwen2.5-7B-Instruct-1M/blob/main/LICENSE
language:
- en
pipeline_tag: text-generation
base_model: Qwen/Qwen2.5-7B-Instruct-1M
tags:
- chat
- openvino
- openvino-export
library_name: transformers
---
This model was converted to the OpenVINO format from [`Qwen/Qwen2.5-7B-Instruct-1M`](https://huggingface.co/Qwen/Qwen2.5-7B-Instruct-1M) using [optimum-intel](https://github.com/huggingface/optimum-intel) via the export space.
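
For reference, a comparable conversion can be reproduced locally through optimum-intel's Python API. This is a minimal sketch, not the exact export-space procedure; the output directory name below is an arbitrary placeholder:

```python
from optimum.intel import OVModelForCausalLM

# export=True converts the original PyTorch checkpoint to OpenVINO IR on the fly
ov_model = OVModelForCausalLM.from_pretrained("Qwen/Qwen2.5-7B-Instruct-1M", export=True)

# Save the converted model to a local directory (placeholder name)
ov_model.save_pretrained("qwen2.5-7b-instruct-1m-openvino")
```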
First make sure you have optimum-intel installed:

```bash
pip install optimum[openvino]
```
To load your model, you can do as follows:

```python
from optimum.intel import OVModelForCausalLM

# Load the OpenVINO-converted model from the Hub
model_id = "lordpsarris/Qwen2.5-7B-Instruct-1M-openvino"
model = OVModelForCausalLM.from_pretrained(model_id)
```
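
After loading, the model works with the standard transformers generation workflow. Below is a minimal sketch of chat-style inference; the prompt text and `max_new_tokens` value are illustrative choices, not part of the original card:

```python
from transformers import AutoTokenizer
from optimum.intel import OVModelForCausalLM

model_id = "lordpsarris/Qwen2.5-7B-Instruct-1M-openvino"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)

# Build a chat-style prompt using the tokenizer's chat template
messages = [{"role": "user", "content": "Give me a short introduction to OpenVINO."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Generate a completion and decode only the newly generated tokens
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```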