Update README.md
README.md CHANGED
@@ -17,7 +17,8 @@ The provided OpenVINO™ IR model is compatible with:
 * OpenVINO version 2024.1.0 and higher
 * Optimum Intel 1.16.0 and higher
 
-## Running Model Inference
+## Running Model Inference with [Optimum Intel](https://huggingface.co/docs/optimum/intel/index)
+
 
 1. Install packages required for using [Optimum Intel](https://huggingface.co/docs/optimum/intel/index) integration with the OpenVINO backend:
 
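The pip command and Python snippet for this step live in unchanged lines the diff does not show; the `print(text)` context in the next hunk is the tail end of that snippet. As a minimal sketch of running this model through Optimum Intel, assuming the standard `OVModelForCausalLM` API rather than the README's exact code:

```
pip install optimum[openvino]
```

```
from transformers import AutoTokenizer
from optimum.intel.openvino import OVModelForCausalLM

model_id = "OpenVINO/Mistral-7B-Instruct-v0.2-int8-ov"

# Load the tokenizer and the pre-converted OpenVINO IR model from the Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)

# Tokenize a prompt, generate a bounded completion, and decode it.
inputs = tokenizer("What is OpenVINO?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
text = tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]
print(text)
```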
@@ -44,6 +45,37 @@ print(text)
 
 For more examples and possible optimizations, refer to the [OpenVINO Large Language Model Inference Guide](https://docs.openvino.ai/2024/learn-openvino/llm_inference_guide.html).
 
+## Running Model Inference with [OpenVINO GenAI](https://github.com/openvinotoolkit/openvino.genai)
+
+1. Install packages required for using OpenVINO GenAI:
+```
+pip install openvino-genai huggingface_hub
+```
+
+2. Download the model from the HuggingFace Hub:
+
+```
+import huggingface_hub as hf_hub
+
+model_id = "OpenVINO/Mistral-7B-Instruct-v0.2-int8-ov"
+model_path = "Mistral-7B-Instruct-v0.2-int8-ov"
+
+hf_hub.snapshot_download(model_id, local_dir=model_path)
+
+```
+
+3. Run model inference:
+
+```
+import openvino_genai as ov_genai
+
+device = "CPU"
+pipe = ov_genai.LLMPipeline(model_path, device)
+print(pipe.generate("What is OpenVINO?"))
+```
+
+More GenAI usage examples can be found in the OpenVINO GenAI library [docs](https://github.com/openvinotoolkit/openvino.genai/blob/master/src/README.md) and [samples](https://github.com/openvinotoolkit/openvino.genai?tab=readme-ov-file#openvino-genai-samples).
+
 ## Limitations
 
 Check the original model card for [limitations](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2#limitations).
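One usage note on the new GenAI section: `pipe.generate` is called there with default settings. Generation parameters can also be passed as keyword arguments, as in the upstream OpenVINO GenAI samples; a brief sketch, where the `max_new_tokens` value and the device comment are illustrative assumptions rather than content from this README:

```
import openvino_genai as ov_genai

# Same pipeline as in the diff; "CPU" is the portable default device,
# and "GPU" can be used instead when an Intel GPU is available.
pipe = ov_genai.LLMPipeline("Mistral-7B-Instruct-v0.2-int8-ov", "CPU")

# Cap the completion length rather than generating until EOS.
print(pipe.generate("What is OpenVINO?", max_new_tokens=100))
```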

