Commit 00a243a
kvaishnavi committed
Parent(s): 12ac3cd

Upload Phi-3-vision-128k-instruct ONNX models
Files changed:
- README.md +94 -0
- config.json +148 -0
- directml-int4-rtn-block-32/genai_config.json +77 -0
- directml-int4-rtn-block-32/model.onnx +3 -0
- directml-int4-rtn-block-32/model.onnx.data +3 -0
- directml-int4-rtn-block-32/phi-3-v-128k-instruct-text-embedding.onnx +3 -0
- directml-int4-rtn-block-32/phi-3-v-128k-instruct-text-embedding.onnx.data +3 -0
- directml-int4-rtn-block-32/phi-3-v-128k-instruct-vision.onnx +3 -0
- directml-int4-rtn-block-32/phi-3-v-128k-instruct-vision.onnx.data +3 -0
- directml-int4-rtn-block-32/processor_config.json +35 -0
- directml-int4-rtn-block-32/special_tokens_map.json +36 -0
- directml-int4-rtn-block-32/tokenizer.json +0 -0
- directml-int4-rtn-block-32/tokenizer_config.json +407 -0
README.md
ADDED
@@ -0,0 +1,94 @@
---
license: mit
tags:
- ONNX
- DML
- ONNXRuntime
- phi3
- custom_code
---

# Phi-3 Vision-128k-Instruct ONNX DirectML models

<!-- Provide a quick summary of what the model is/does. -->
This repository hosts the optimized versions of [Phi-3-vision-128k-instruct](https://aka.ms/phi3-vision-128k-instruct) to accelerate inference with DirectML and ONNX Runtime on your machines with GPUs.

Phi-3 Vision is a lightweight, state-of-the-art open multimodal model built upon datasets that include synthetic data and filtered publicly available web data, with a focus on very high-quality, reasoning-dense data covering both text and vision. The model belongs to the Phi-3 model family, and the multimodal version supports up to 128K context length (in tokens). The base model has undergone a rigorous enhancement process, incorporating both supervised fine-tuning and direct preference optimization, to ensure precise instruction adherence and robust safety measures.

Optimized variants of the Phi-3 Vision models are published here in [ONNX](https://onnx.ai) format to run with [ONNX Runtime](https://onnxruntime.ai/) on CPU and GPU across devices, including server platforms, Windows, Linux and Mac desktops, and mobile CPUs, with the precision best suited to each of these targets.

DirectML support lets developers bring hardware acceleration to Windows devices at scale across AMD, Intel, and NVIDIA GPUs. Along with DirectML, ONNX Runtime provides cross-platform support for Phi-3 models across a range of devices for CPU and GPU.

## ONNX Models

Here are some of the optimized configurations we have added:

ONNX model for INT4 DML: ONNX model optimized to run with DirectML and quantized to int4 precision using RTN (round-to-nearest).
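If you only need the DirectML INT4 folder from this repository, the sketch below shows one way to fetch it with the `huggingface_hub` Python client; the local directory name is a placeholder, so adjust it to your setup.

```python
# Minimal sketch: download only the DirectML INT4 model folder from this repo.
# Assumes `pip install huggingface_hub`; local_dir is a placeholder path.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="microsoft/Phi-3-vision-128k-instruct-onnx-directml",
    allow_patterns=["directml-int4-rtn-block-32/*"],  # only the INT4 RTN files
    local_dir="Phi-3-vision-128k-instruct-onnx-directml",
)
```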

How do you know which is the best ONNX model for you:

- Are you on a Windows machine with GPU?
  - I don't know → Review this [guide](https://www.microsoft.com/en-us/windows/learning-center/how-to-check-gpu) to see whether you have a GPU in your Windows machine.
  - Yes → Access the Hugging Face DirectML ONNX models and instructions at [Phi-3-vision-128k-instruct-onnx-directml](https://huggingface.co/microsoft/Phi-3-vision-128k-instruct-onnx-directml).
  - No → Do you have an NVIDIA GPU?
    - I don't know → Review this [guide](https://docs.nvidia.com/cuda/cuda-installation-guide-microsoft-windows/index.html#verify-you-have-a-cuda-capable-gpu) to see whether you have a CUDA-capable GPU.
    - Yes → Access the Hugging Face CUDA ONNX models and instructions at [Phi-3-vision-128k-instruct-onnx-cuda](https://huggingface.co/microsoft/Phi-3-vision-128k-instruct-onnx-cuda) for NVIDIA GPUs.
    - No → Access the Hugging Face ONNX models for CPU devices and instructions at [Phi-3-vision-128k-instruct-onnx-cpu](https://huggingface.co/microsoft/Phi-3-vision-128k-instruct-onnx-cpu).

## How to Get Started with the Model
To support the Phi-3 models across a range of devices, platforms, and execution provider (EP) backends, we introduce a new API that wraps several aspects of generative AI inferencing. This API makes it easy to drag and drop LLMs straight into your app. To run the early version of these models with ONNX, follow the steps [here](https://aka.ms/run-phi3-v-onnx). You can also test this with a chat app.
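As an illustration of what those steps look like in Python with the `onnxruntime-genai` DirectML package, the sketch below loads the INT4 folder from this repository and describes an image; method names can differ between onnxruntime-genai versions, so treat the linked instructions above as authoritative.

```python
# Illustrative sketch only; the authoritative steps are at https://aka.ms/run-phi3-v-onnx.
# Assumes `pip install onnxruntime-genai-directml` and a local copy of the
# directml-int4-rtn-block-32 folder from this repository.
import onnxruntime_genai as og

model = og.Model("directml-int4-rtn-block-32")
processor = model.create_multimodal_processor()
tokenizer_stream = processor.create_stream()

# Prompt format follows the chat template shipped with this repo.
image = og.Images.open("example_image.png")  # placeholder image path
prompt = "<|user|>\n<|image_1|>\nWhat is shown in this image?<|end|>\n<|assistant|>\n"
inputs = processor(prompt, images=image)

params = og.GeneratorParams(model)
params.set_inputs(inputs)
params.set_search_options(max_length=3072)

generator = og.Generator(model, params)
while not generator.is_done():
    generator.compute_logits()
    generator.generate_next_token()
    print(tokenizer_stream.decode(generator.get_next_tokens()[0]), end="", flush=True)
```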

## Hardware Supported

The model has been tested on:
- GPU SKU: RTX 4080 (DirectML)

Minimum Configuration Required:
- Windows: DirectX 12-capable GPU and a minimum of 10 GB of combined RAM

### Model Description

- **Developed by:** Microsoft
- **Model type:** ONNX
- **Language(s) (NLP):** Python, C, C++
- **License:** MIT
- **Model Description:** This is a conversion of the Phi-3 Vision-128K-Instruct model for ONNX Runtime inference.

## Additional Details
- [**Phi-3 Small, Medium, and Vision Blog**](https://aka.ms/phi3_ONNXBuild24) and [**Phi-3 Mini Blog**](https://aka.ms/phi3-optimizations)
- [**Phi-3 Model Blog Link**](https://aka.ms/phi3blog-april)
- [**Phi-3 Model Card**](https://aka.ms/phi3-vision-128k-instruct)
- [**Phi-3 Technical Report**](https://aka.ms/phi3-tech-report)
- [**Phi-3 on Azure AI Studio**](https://aka.ms/phi3-azure-ai)

## Performance Metrics
Token-generation performance of the ONNX vision model is similar to that of [Phi-3-mini-128k-instruct-onnx](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct-onnx).

## Base Model Usage and Considerations
**Primary use cases**

The model is intended for broad commercial and research use in English. It is suited to general-purpose AI systems and applications with visual and text input capabilities that require:

1) memory/compute constrained environments;
2) latency bound scenarios;
3) general image understanding;
4) OCR;
5) chart and table understanding.

Our model is designed to accelerate research on efficient language and multimodal models, for use as a building block for generative AI powered features.

**Use case considerations**

Our models are not specifically designed or evaluated for all downstream purposes. Developers should consider common limitations of language models as they select use cases, and evaluate and mitigate for accuracy, safety, and fairness before using within a specific downstream use case, particularly for high-risk scenarios.

Developers should be aware of and adhere to applicable laws or regulations (including privacy, trade compliance laws, etc.) that are relevant to their use case.

Nothing contained in this Model Card should be interpreted as or deemed a restriction or modification to the license the model is released under.

## Appendix

## Model Card Contact
parinitarahi, kvaishnavi, natke

## Contributors
Kunal Vaishnavi, Sunghoon Choi, Yufeng Li, Baiju Meswani, Sheetal Arun Kadam, Rui Ren, Natalie Kershaw, Parinita Rahi, Patrice Vignola, Xiang Zhang, Chai Chaoweeraprasit, Logan Iyer, Vicente Rivera, Jacques Van Rhyn
config.json
ADDED
@@ -0,0 +1,148 @@
{
    "_name_or_path": "Phi-3-vision-128k-instruct",
    "architectures": [
        "Phi3VForCausalLM"
    ],
    "attention_dropout": 0.0,
    "auto_map": {
        "AutoConfig": "configuration_phi3_v.Phi3VConfig",
        "AutoModelForCausalLM": "modeling_phi3_v.Phi3VForCausalLM"
    },
    "bos_token_id": 1,
    "embd_layer": {
        "embedding_cls": "image",
        "hd_transform_order": "sub_glb",
        "projection_cls": "mlp",
        "use_hd_transform": true,
        "with_learnable_separator": true
    },
    "eos_token_id": 2,
    "hidden_act": "silu",
    "hidden_size": 3072,
    "img_processor": {
        "image_dim_out": 1024,
        "model_name": "openai/clip-vit-large-patch14-336",
        "name": "clip_vision_model",
        "num_img_tokens": 144
    },
    "initializer_range": 0.02,
    "intermediate_size": 8192,
    "max_position_embeddings": 131072,
    "model_type": "phi3_v",
    "num_attention_heads": 32,
    "num_hidden_layers": 32,
    "num_key_value_heads": 32,
    "original_max_position_embeddings": 4096,
    "rms_norm_eps": 1e-05,
    "rope_scaling": {
        "long_factor": [
            1.0299999713897705, 1.0499999523162842, 1.0499999523162842, 1.0799999237060547,
            1.2299998998641968, 1.2299998998641968, 1.2999999523162842, 1.4499999284744263,
            1.5999999046325684, 1.6499998569488525, 1.8999998569488525, 2.859999895095825,
            3.68999981880188, 5.419999599456787, 5.489999771118164, 5.489999771118164,
            9.09000015258789, 11.579999923706055, 15.65999984741211, 15.769999504089355,
            15.789999961853027, 18.360000610351562, 21.989999771118164, 23.079999923706055,
            30.009998321533203, 32.35000228881836, 32.590003967285156, 35.56000518798828,
            39.95000457763672, 53.840003967285156, 56.20000457763672, 57.95000457763672,
            59.29000473022461, 59.77000427246094, 59.920005798339844, 61.190006256103516,
            61.96000671386719, 62.50000762939453, 63.3700065612793, 63.48000717163086,
            63.48000717163086, 63.66000747680664, 63.850006103515625, 64.08000946044922,
            64.760009765625, 64.80001068115234, 64.81001281738281, 64.81001281738281
        ],
        "short_factor": [
            1.05, 1.05, 1.05, 1.1, 1.1, 1.1,
            1.2500000000000002, 1.2500000000000002, 1.4000000000000004, 1.4500000000000004,
            1.5500000000000005, 1.8500000000000008, 1.9000000000000008,
            2.000000000000001, 2.000000000000001, 2.000000000000001, 2.000000000000001,
            2.000000000000001, 2.000000000000001, 2.000000000000001, 2.000000000000001,
            2.000000000000001, 2.000000000000001, 2.000000000000001, 2.000000000000001,
            2.000000000000001, 2.000000000000001, 2.000000000000001, 2.000000000000001,
            2.000000000000001, 2.000000000000001, 2.000000000000001,
            2.1000000000000005, 2.1000000000000005, 2.2,
            2.3499999999999996, 2.3499999999999996, 2.3499999999999996, 2.3499999999999996,
            2.3999999999999995, 2.3999999999999995, 2.6499999999999986, 2.6999999999999984,
            2.8999999999999977, 2.9499999999999975,
            3.049999999999997, 3.049999999999997, 3.049999999999997
        ],
        "type": "su"
    },
    "rope_theta": 10000.0,
    "sliding_window": 131072,
    "tie_word_embeddings": false,
    "torch_dtype": "bfloat16",
    "transformers_version": "4.38.1",
    "use_cache": true,
    "vocab_size": 32064,
    "_attn_implementation": "flash_attention_2"
}
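A few of these fields can be cross-checked directly: head size is hidden_size / num_attention_heads = 3072 / 32 = 96, so each rope_scaling factor list has 96 / 2 = 48 entries (one per rotary dimension), and the long-context factors extend original_max_position_embeddings (4096) to max_position_embeddings (131072). A small sketch, assuming config.json is in the working directory:

```python
# Sketch: sanity-check a few derived quantities in config.json.
import json

with open("config.json") as f:
    cfg = json.load(f)

head_size = cfg["hidden_size"] // cfg["num_attention_heads"]        # 3072 // 32 = 96
assert head_size == 96
assert len(cfg["rope_scaling"]["long_factor"]) == head_size // 2    # 48 rotary dims
assert len(cfg["rope_scaling"]["short_factor"]) == head_size // 2
assert cfg["max_position_embeddings"] == 131072                     # 128K token context
assert cfg["original_max_position_embeddings"] == 4096              # pre-extension length
print("config.json is internally consistent")
```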
directml-int4-rtn-block-32/genai_config.json
ADDED
@@ -0,0 +1,77 @@
{
    "model": {
        "bos_token_id": 1,
        "context_length": 131072,
        "decoder": {
            "session_options": {
                "log_id": "onnxruntime-genai",
                "provider_options": [
                    { "dml": {} }
                ]
            },
            "filename": "model.onnx",
            "head_size": 96,
            "hidden_size": 3072,
            "inputs": {
                "inputs_embeds": "inputs_embeds",
                "attention_mask": "attention_mask",
                "past_key_names": "past_key_values.%d.key",
                "past_value_names": "past_key_values.%d.value"
            },
            "outputs": {
                "logits": "logits",
                "present_key_names": "present.%d.key",
                "present_value_names": "present.%d.value"
            },
            "num_attention_heads": 32,
            "num_hidden_layers": 32,
            "num_key_value_heads": 32
        },
        "embedding": {
            "filename": "phi-3-v-128k-instruct-text-embedding.onnx",
            "inputs": { "input_ids": "input_ids" },
            "outputs": { "inputs_embeds": "inputs_embeds" }
        },
        "vision": {
            "filename": "phi-3-v-128k-instruct-vision.onnx",
            "inputs": { "pixel_values": "pixel_values", "image_sizes": "image_sizes" },
            "outputs": { "visual_features": "visual_features" }
        },
        "eos_token_id": [ 2, 32000, 32001, 32007 ],
        "pad_token_id": 32000,
        "type": "phi3v",
        "vocab_size": 32064
    },
    "search": {
        "diversity_penalty": 0.0,
        "do_sample": false,
        "early_stopping": true,
        "length_penalty": 1.0,
        "max_length": 131072,
        "min_length": 0,
        "no_repeat_ngram_size": 0,
        "num_beams": 1,
        "num_return_sequences": 1,
        "past_present_share_buffer": true,
        "repetition_penalty": 1.0,
        "temperature": 1.0,
        "top_k": 1,
        "top_p": 1.0
    }
}
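The "search" block above sets generation defaults (greedy decoding, max_length equal to the full 131072-token context). When running through onnxruntime-genai these can usually be overridden per request; a sketch under that assumption follows, with option names taken from the "search" keys above (check your onnxruntime-genai version for the exact supported set):

```python
# Sketch: override the generation defaults from genai_config.json at runtime.
# Assumes the onnxruntime-genai DirectML package and this model folder are available.
import onnxruntime_genai as og

model = og.Model("directml-int4-rtn-block-32")
params = og.GeneratorParams(model)
params.set_search_options(
    max_length=3072,     # far smaller than the 131072 default
    do_sample=True,      # genai_config.json defaults to greedy (do_sample: false)
    temperature=0.7,
    top_p=0.95,
)
```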
directml-int4-rtn-block-32/model.onnx
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5ad6888c1368a6e246b1559859979c38afada8958a3251afeffa22fca9e22ca3
size 277393
directml-int4-rtn-block-32/model.onnx.data
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:36156f3337594e6f015c8984d6a12eb21c996b669729ce02601b761864bc495e
size 2119403520
directml-int4-rtn-block-32/phi-3-v-128k-instruct-text-embedding.onnx
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:1c7c1cfa5b18a03f0334b02bf774d0e069472ae110aaa786a3b2cc8f37ad15fa
size 411
directml-int4-rtn-block-32/phi-3-v-128k-instruct-text-embedding.onnx.data
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:bbb5071dffdeee95cf3edda83cbd651f066fea4d13f537f3287d0abb4af20e1c
size 197001216
directml-int4-rtn-block-32/phi-3-v-128k-instruct-vision.onnx
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d4b01c2d70bd95692f95402cd88e201565fb73a37fa39a123e92df627df1a84c
size 415525
directml-int4-rtn-block-32/phi-3-v-128k-instruct-vision.onnx.data
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:5cc72c56fd9fe5bd76bb1deef5757c1b42a37dc695c8f7ff59174d007a296938
size 282216448
directml-int4-rtn-block-32/processor_config.json
ADDED
@@ -0,0 +1,35 @@
{
    "processor": {
        "name": "image_processing",
        "transforms": [
            {
                "operation": {
                    "name": "decode_image",
                    "domain": "com.microsoft.extensions",
                    "type": "DecodeImage",
                    "attrs": {
                        "color_space": "BGR"
                    }
                }
            },
            {
                "operation": {
                    "name": "convert_to_rgb",
                    "domain": "com.microsoft.extensions",
                    "type": "ConvertRGB"
                }
            },
            {
                "operation": {
                    "name": "phi3_image_transform",
                    "domain": "com.microsoft.extensions",
                    "type": "Phi3ImageTransform",
                    "attrs": {
                        "num_crops": 16,
                        "num_img_tokens": 144
                    }
                }
            }
        ]
    }
}
directml-int4-rtn-block-32/special_tokens_map.json
ADDED
@@ -0,0 +1,36 @@
{
    "additional_special_tokens": [
        "<|system|>",
        "<|end|>",
        "<|user|>",
        "<|end|>"
    ],
    "bos_token": {
        "content": "<s>",
        "lstrip": false,
        "normalized": false,
        "rstrip": false,
        "single_word": false
    },
    "eos_token": {
        "content": "<|endoftext|>",
        "lstrip": false,
        "normalized": false,
        "rstrip": false,
        "single_word": false
    },
    "pad_token": {
        "content": "<|endoftext|>",
        "lstrip": false,
        "normalized": false,
        "rstrip": false,
        "single_word": false
    },
    "unk_token": {
        "content": "<unk>",
        "lstrip": false,
        "normalized": false,
        "rstrip": false,
        "single_word": false
    }
}
directml-int4-rtn-block-32/tokenizer.json
ADDED
The diff for this file is too large to render.
See raw diff
directml-int4-rtn-block-32/tokenizer_config.json
ADDED
@@ -0,0 +1,407 @@
{
    "add_bos_token": true,
    "add_eos_token": false,
    "added_tokens_decoder": {
        "0":     { "content": "<unk>",              "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
        "1":     { "content": "<s>",                "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
        "2":     { "content": "</s>",               "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": false },
        "32000": { "content": "<|endoftext|>",      "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
        "32001": { "content": "<|assistant|>",      "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32002": { "content": "<|placeholder1|>",   "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32003": { "content": "<|placeholder2|>",   "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32004": { "content": "<|placeholder3|>",   "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32005": { "content": "<|placeholder4|>",   "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32006": { "content": "<|system|>",         "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
        "32007": { "content": "<|end|>",            "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
        "32008": { "content": "<|placeholder5|>",   "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32009": { "content": "<|placeholder6|>",   "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32010": { "content": "<|user|>",           "lstrip": false, "normalized": false, "rstrip": false, "single_word": false, "special": true },
        "32011": { "content": "<|step|>",           "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32012": { "content": "<|function_output|>","lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32013": { "content": "<|tag|>",            "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32014": { "content": "<|function_call|>",  "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32015": { "content": "<|raw|>",            "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32016": { "content": "<|continue|>",       "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32017": { "content": "<|function_list|>",  "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32018": { "content": "<|calc|>",           "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32019": { "content": "<|code|>",           "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32020": { "content": "<|/code|>",          "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32021": { "content": "<|summary|>",        "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32022": { "content": "<|resource|>",       "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32023": { "content": "<|assistant_mask|>", "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32024": { "content": "<|start|>",          "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32025": { "content": "<|message|>",        "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32026": { "content": "<|fim_prefix|>",     "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32027": { "content": "<|fim_middle|>",     "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32028": { "content": "<|fim_suffix|>",     "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32029": { "content": "<|meta_start|>",     "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32030": { "content": "<|ipynb_marker|>",   "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32031": { "content": "<|diff_marker|>",    "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32032": { "content": "<|ghissue|>",        "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32033": { "content": "<|ghreview|>",       "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32034": { "content": "<|disc_start|>",     "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32035": { "content": "<|disc_sep|>",       "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32036": { "content": "<|disc_thread|><|query|>", "lstrip": false, "normalized": false, "rstrip": true, "single_word": false, "special": true },
        "32037": { "content": "<|/query|>",         "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32038": { "content": "<|data|>",           "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32039": { "content": "<|/data|>",          "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32040": { "content": "<|sys|>",            "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32041": { "content": "<|/sys|>",           "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32042": { "content": "<|inst|>",           "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32043": { "content": "<|/inst|>",          "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true },
        "32044": { "content": "<|image|>",          "lstrip": false, "normalized": false, "rstrip": true,  "single_word": false, "special": true }
    },
    "additional_special_tokens": [
        "<|system|>",
        "<|end|>",
        "<|user|>",
        "<|end|>"
    ],
    "bos_token": "<s>",
    "chat_template": "{% for message in messages %}{{'<|' + message['role'] + '|>' + '\n' + message['content'] + '<|end|>\n' }}{% endfor %}{% if add_generation_prompt and messages[-1]['role'] != 'assistant' %}{{- '<|assistant|>\n' -}}{% endif %}",
    "clean_up_tokenization_spaces": false,
    "eos_token": "<|endoftext|>",
    "model_max_length": 131072,
    "pad_token": "<|endoftext|>",
    "padding_side": "right",
    "sp_model_kwargs": {},
    "tokenizer_class": "LlamaTokenizer",
    "unk_token": "<unk>",
    "use_default_system_prompt": false
}
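The chat_template above is what produces the <|user|> / <|assistant|> / <|end|> prompt format used in the README example. A small sketch, assuming the Hugging Face transformers tokenizer can be loaded from the tokenizer files in this folder:

```python
# Sketch: render the chat template and check the stop-token ids listed in genai_config.json.
# Assumes `pip install transformers` and a local copy of the directml-int4-rtn-block-32 folder.
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("directml-int4-rtn-block-32")

messages = [{"role": "user", "content": "<|image_1|>\nDescribe this image."}]
prompt = tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
# <|user|>
# <|image_1|>
# Describe this image.<|end|>
# <|assistant|>

# The extra eos ids in genai_config.json (32000, 32001, 32007) map to these tokens:
print(tok.convert_tokens_to_ids(["<|endoftext|>", "<|assistant|>", "<|end|>"]))
# [32000, 32001, 32007]
```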