kvaishnavi committed
Commit d70a247 · Parent(s): 081faae
Update README.md

README.md CHANGED
@@ -5,8 +5,8 @@ tags: [ONNX, DML, ONNXRuntime, phi3, nlp, conversational, custom_code]
 inference: false
 ---
 
-# Phi-3.5-
-This repository hosts the optimized versions of [Phi-3.5-mini-
+# Phi-3.5-Mini-Instruct ONNX models
+This repository hosts the optimized versions of [Phi-3.5-mini-instruct](https://huggingface.co/microsoft/Phi-3.5-mini-instruct) to accelerate inference with ONNX Runtime.
 Optimized Phi-3.5 Mini models are published here in [ONNX](https://onnx.ai) format to run with [ONNX Runtime](https://onnxruntime.ai/) on CPU and GPU across devices, including server platforms, Windows, Linux and Mac desktops, and mobile CPUs, with the precision best suited to each of these targets.
 
 To easily get started with Phi-3.5, you can use our newly introduced ONNX Runtime Generate() API. See [here](https://aka.ms/generate-tutorial) for instructions on how to run it.
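The Generate() API mentioned in the README text can be driven from Python via the `onnxruntime-genai` package. The sketch below is a hedged outline, not the linked tutorial's exact code: the model directory and prompt are placeholders, the chat template is assumed to match Phi-3's, and the exact API surface may differ across package versions.

```python
def run_phi35(model_dir: str, prompt: str, max_length: int = 256) -> str:
    """Generate a completion from a local Phi-3.5 ONNX model.

    Assumes the `onnxruntime-genai` package is installed and that
    `model_dir` contains the downloaded ONNX model files. API names
    follow recent onnxruntime-genai releases and may vary by version.
    """
    import onnxruntime_genai as og  # deferred so the sketch imports cleanly

    model = og.Model(model_dir)
    tokenizer = og.Tokenizer(model)

    # Configure decoding; set_search_options takes keyword search parameters.
    params = og.GeneratorParams(model)
    params.set_search_options(max_length=max_length)

    generator = og.Generator(model, params)
    # Assumed Phi-3-style chat template wrapping the user prompt.
    generator.append_tokens(
        tokenizer.encode(f"<|user|>\n{prompt}<|end|>\n<|assistant|>\n")
    )

    # Token-by-token decode loop until EOS or max_length is reached.
    while not generator.is_done():
        generator.generate_next_token()

    return tokenizer.decode(generator.get_sequence(0))
```

A caller would point `model_dir` at one of the precision-specific model folders in this repository (for example, a CPU int4 variant) after downloading it locally.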