LoneStriker committed
Commit 2185b48
1 Parent(s): 4a2bfdc

Upload folder using huggingface_hub
LICENSE ADDED
@@ -0,0 +1,71 @@
+ MICROSOFT RESEARCH LICENSE TERMS
+
+ IF YOU LIVE IN THE UNITED STATES, PLEASE READ THE “BINDING ARBITRATION AND CLASS ACTION WAIVER” SECTION BELOW. IT AFFECTS HOW DISPUTES ARE RESOLVED.
+
+ These license terms are an agreement between you and Microsoft Corporation (or one of its affiliates). They apply to the source code, object code, machine learning models, or data (collectively “Materials”) that accompany this license. IF YOU COMPLY WITH THESE LICENSE TERMS, YOU HAVE THE RIGHTS BELOW. BY USING THE MATERIALS, YOU ACCEPT THESE TERMS.
+
+ 1) INSTALLATION AND USE RIGHTS TO THE MATERIALS.
+
+ Subject to the terms of this agreement, you have the below rights, if applicable, to use the Materials solely for non-commercial, non-revenue generating, research purposes:
+
+ a) Source Code. If source code is included, you may use and modify the source code, but you may not distribute the source code.
+
+ b) Object Code. If object code is included, you may use the object code, but you may not distribute the object code.
+
+ c) Models. If machine learning model(s) are included, you may use the model(s), but you may not distribute the models.
+
+ d) Data. If data is included, you may use and modify the data, but your use and modification must be consistent with the consent under which the data was provided and/or gathered and you may not distribute the data or your modifications to the data.
+
+ 2) SCOPE OF LICENSE. The Materials are licensed, not sold. Microsoft reserves all other rights. Unless applicable law gives you more rights despite this limitation, you will not (and have no right to):
+
+ a) work around any technical limitations in the Materials that only allow you to use it in certain ways;
+
+ b) reverse engineer, decompile or disassemble the Materials;
+
+ c) remove, minimize, block, or modify any notices of Microsoft or its suppliers in the Materials;
+
+ d) use the Materials in any way that is against the law or to create or propagate malware; or
+
+ e) share, publish, distribute or lend the Materials, provide the Materials as a stand-alone hosted solution for others to use, or transfer the Materials or this agreement to any third party.
+
+ 3) PERSONAL DATA. If the data (set forth in Section 1(c) above) includes or is found to include any data that enables any ability to identify an individual (“Personal Data”), you will not use such Personal Data for any purpose other than was authorized and consented to by the data subject/research participant. You will not use Personal Data to contact any person. You will keep Personal Data in strict confidence. You will not share any Personal Data that is collected or in your possession with any third party for any reason and as required under the original consent agreement. Further, you will destroy the Personal Data and any backup or copies, immediately upon the completion of your research.
+
+ 4) LICENSE TO MICROSOFT. Notwithstanding the limitations in Section 1, you may distribute your modifications back to Microsoft, and if you do provide Microsoft with modifications of the Materials, you hereby grant Microsoft, without any restrictions or limitations, a non-exclusive, perpetual, irrevocable, royalty-free, assignable and sub-licensable license, to reproduce, publicly perform or display, install, use, modify, post, distribute, make and have made, sell and transfer such modifications and derivatives for any purpose.
+
+ 5) PUBLICATION. You may publish (or present papers or articles) on your results from using the Materials provided that no material or substantial portion of the Materials is included in any such publication or presentation.
+
+ 6) FEEDBACK. Any feedback about the Materials provided by you to us is voluntarily given, and Microsoft shall be free to use the feedback as it sees fit without obligation or restriction of any kind, even if the feedback is designated by you as confidential. Such feedback shall be considered a contribution and licensed to Microsoft under the terms of Section 4 above.
+
+ 7) EXPORT RESTRICTIONS. You must comply with all domestic and international export laws and regulations that apply to the Materials, which include restrictions on destinations, end users, and end use. For further information on export restrictions, visit (aka.ms/exporting).
+
+ 8) SUPPORT SERVICES. Microsoft is not obligated under this agreement to provide any support services for the Materials. Any support provided is “as is”, “with all faults”, and without warranty of any kind.
+
+ 9) BINDING ARBITRATION AND CLASS ACTION WAIVER. This Section applies if you live in (or, if a business, your principal place of business is in) the United States. If you and Microsoft have a dispute, you and Microsoft agree to try for 60 days to resolve it informally. If you and Microsoft can’t, you and Microsoft agree to binding individual arbitration before the American Arbitration Association under the Federal Arbitration Act (“FAA”), and not to sue in court in front of a judge or jury. Instead, a neutral arbitrator will decide. Class action lawsuits, class-wide arbitrations, private attorney-general actions, and any other proceeding where someone acts in a representative capacity are not allowed; nor is combining individual proceedings without the consent of all parties. The complete Arbitration Agreement contains more terms and is at aka.ms/arb-agreement-1. You and Microsoft agree to these terms.
+
+ 10) ENTIRE AGREEMENT. This agreement, and any other terms Microsoft may provide for supplements, updates, or third-party applications, is the entire agreement for the Materials.
+
+ 11) APPLICABLE LAW AND PLACE TO RESOLVE DISPUTES. If you acquired the Materials in the United States or Canada, the laws of the state or province where you live (or, if a business, where your principal place of business is located) govern the interpretation of this agreement, claims for its breach, and all other claims (including consumer protection, unfair competition, and tort claims), regardless of conflict of laws principles, except that the FAA governs everything related to arbitration. If you acquired the Materials in any other country, its laws apply, except that the FAA governs everything related to arbitration. If U.S. federal jurisdiction exists, you and Microsoft consent to exclusive jurisdiction and venue in the federal court in King County, Washington for all disputes heard in court (excluding arbitration). If not, you and Microsoft consent to exclusive jurisdiction and venue in the Superior Court of King County, Washington for all disputes heard in court (excluding arbitration).
+
+ 12) CONSUMER RIGHTS; REGIONAL VARIATIONS. This agreement describes certain legal rights. You may have other rights, including consumer rights, under the laws of your state, province, or country. Separate and apart from your relationship with Microsoft, you may also have rights with respect to the party from which you acquired the Materials. This agreement does not change those other rights if the laws of your state, province, or country do not permit it to do so. For example, if you acquired the Materials in one of the below regions, or mandatory country law applies, then the following provisions apply to you:
+
+ a) Australia. You have statutory guarantees under the Australian Consumer Law and nothing in this agreement is intended to affect those rights.
+
+ b) Canada. If you acquired this software in Canada, you may stop receiving updates by turning off the automatic update feature, disconnecting your device from the Internet (if and when you re-connect to the Internet, however, the Materials will resume checking for and installing updates), or uninstalling the Materials. The product documentation, if any, may also specify how to turn off updates for your specific device or software.
+
+ c) Germany and Austria.
+
+ i. Warranty. The properly licensed software will perform substantially as described in any Microsoft materials that accompany the Materials. However, Microsoft gives no contractual guarantee in relation to the licensed software.
+
+ ii. Limitation of Liability. In case of intentional conduct, gross negligence, claims based on the Product Liability Act, as well as, in case of death or personal or physical injury, Microsoft is liable according to the statutory law.
+
+ Subject to the foregoing clause (ii), Microsoft will only be liable for slight negligence if Microsoft is in breach of such material contractual obligations, the fulfillment of which facilitate the due performance of this agreement, the breach of which would endanger the purpose of this agreement and the compliance with which a party may constantly trust in (so-called "cardinal obligations"). In other cases of slight negligence, Microsoft will not be liable for slight negligence.
+
+ 13) DISCLAIMER OF WARRANTY. THE MATERIALS ARE LICENSED “AS IS.” YOU BEAR THE RISK OF USING THEM. MICROSOFT GIVES NO EXPRESS WARRANTIES, GUARANTEES, OR CONDITIONS. TO THE EXTENT PERMITTED UNDER APPLICABLE LAWS, MICROSOFT EXCLUDES ALL IMPLIED WARRANTIES, INCLUDING MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NON-INFRINGEMENT.
+
+ 14) LIMITATION ON AND EXCLUSION OF DAMAGES. IF YOU HAVE ANY BASIS FOR RECOVERING DAMAGES DESPITE THE PRECEDING DISCLAIMER OF WARRANTY, YOU CAN RECOVER FROM MICROSOFT AND ITS SUPPLIERS ONLY DIRECT DAMAGES UP TO U.S. $5.00. YOU CANNOT RECOVER ANY OTHER DAMAGES, INCLUDING CONSEQUENTIAL, LOST PROFITS, SPECIAL, INDIRECT OR INCIDENTAL DAMAGES.
+
+ This limitation applies to (a) anything related to the Materials, services, content (including code) on third party Internet sites, or third party applications; and (b) claims for breach of contract, warranty, guarantee, or condition; strict liability, negligence, or other tort; or any other claim; in each case to the extent permitted by applicable law.
+
+ It also applies even if Microsoft knew or should have known about the possibility of the damages. The above limitation or exclusion may not apply to you because your state, province, or country may not allow the exclusion or limitation of incidental, consequential, or other damages.
README.md ADDED
@@ -0,0 +1,251 @@
+ ---
+ metrics:
+ - code_eval
+ library_name: transformers
+ tags:
+ - code
+ model-index:
+ - name: WizardCoder
+   results:
+   - task:
+       type: text-generation
+     dataset:
+       type: openai_humaneval
+       name: HumanEval
+     metrics:
+     - name: pass@1
+       type: pass@1
+       value: 0.799
+       verified: false
+ ---
+
+ ## WizardCoder: Empowering Code Large Language Models with Evol-Instruct
+
+ <p style="font-size:28px;" align="center">
+ 🏠 <a href="https://wizardlm.github.io/" target="_blank">Home Page</a> </p>
+ <p align="center">
+ 🤗 <a href="https://huggingface.co/WizardLM" target="_blank">HF Repo</a> • 🐱 <a href="https://github.com/nlpxucan/WizardLM" target="_blank">Github Repo</a> • 🐦 <a href="https://twitter.com/WizardLM_AI" target="_blank">Twitter</a> </p>
+ <p align="center">
+ 📃 <a href="https://arxiv.org/abs/2304.12244" target="_blank">[WizardLM]</a> • 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> • 📃 <a href="https://arxiv.org/abs/2308.09583" target="_blank">[WizardMath]</a> <br>
+ </p>
+ <p align="center">
+ 👋 Join our <a href="https://discord.gg/VZjjHtWrKs" target="_blank">Discord</a>
+ </p>
+
+ ## News
+
+ [2024/01/04] 🔥 We released **WizardCoder-33B-V1.1**, trained from deepseek-coder-33b-base, the **SOTA OSS Code LLM** on the [EvalPlus Leaderboard](https://evalplus.github.io/leaderboard.html). It achieves **79.9 pass@1** on HumanEval, **73.2 pass@1** on HumanEval-Plus, **78.9 pass@1** on MBPP, and **66.9 pass@1** on MBPP-Plus.
+
+ [2024/01/04] 🔥 **WizardCoder-33B-V1.1** outperforms **ChatGPT 3.5**, **Gemini Pro**, and **DeepSeek-Coder-33B-instruct** on HumanEval and HumanEval-Plus pass@1.
+
+ [2024/01/04] 🔥 **WizardCoder-33B-V1.1** is comparable with **ChatGPT 3.5** and surpasses **Gemini Pro** on MBPP and MBPP-Plus pass@1.
+
+ | Model | Checkpoint | Paper | HumanEval | HumanEval+ | MBPP | MBPP+ | License |
+ | ----- | ------ | ---- | ------ | ------- | ----- | ----- | ----- |
+ | GPT-4-Turbo (Nov 2023) | - | - | 85.4 | 81.7 | 83.0 | 70.7 | - |
+ | GPT-4 (May 2023) | - | - | 88.4 | 76.8 | - | - | - |
+ | GPT-3.5-Turbo (Nov 2023) | - | - | 72.6 | 65.9 | 81.7 | 69.4 | - |
+ | Gemini Pro | - | - | 63.4 | 55.5 | 72.9 | 57.9 | - |
+ | DeepSeek-Coder-33B-instruct | - | - | 78.7 | 72.6 | 78.7 | 66.7 | - |
+ | **WizardCoder-33B-V1.1** | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-33B-V1.1" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 79.9 | 73.2 | 78.9 | 66.9 | <a href="https://huggingface.co/WizardLM/WizardMath-7B-V1.1/resolve/main/LICENSE" target="_blank">MSFTResearch</a> |
+ | WizardCoder-Python-34B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-34B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 73.2 | 64.6 | 73.2 | 59.9 | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> |
+ | WizardCoder-15B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-15B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 59.8 | 52.4 | -- | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> |
+ | WizardCoder-Python-13B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-13B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 64.0 | -- | -- | -- | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> |
+ | WizardCoder-Python-7B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-Python-7B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 55.5 | -- | -- | -- | <a href="https://ai.meta.com/resources/models-and-libraries/llama-downloads/" target="_blank">Llama2</a> |
+ | WizardCoder-3B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-3B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 34.8 | -- | -- | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> |
+ | WizardCoder-1B-V1.0 | 🤗 <a href="https://huggingface.co/WizardLM/WizardCoder-1B-V1.0" target="_blank">HF Link</a> | 📃 <a href="https://arxiv.org/abs/2306.08568" target="_blank">[WizardCoder]</a> | 23.8 | -- | -- | -- | <a href="https://huggingface.co/spaces/bigcode/bigcode-model-license-agreement" target="_blank">OpenRAIL-M</a> |
+
+
+ ## ❗ Data Contamination Check:
+
+ Before model training, we carefully and rigorously checked all of the training data and used multiple deduplication methods to verify and prevent data leakage on the HumanEval and MBPP test sets.
+
+ 🔥
+ ❗<b>Note for model system prompts usage:</b>
+
+ Please use **exactly the same system prompts** as ours; we do not guarantee the accuracy of **quantized versions**.
+
+ **Default version:**
+
+ ```
+ "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n### Instruction:\n{instruction}\n\n### Response:"
+ ```
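
The default template is plain string substitution. A minimal sketch of filling it before tokenization (`build_prompt` is a hypothetical helper name; model loading and the generation call are omitted):

```python
# Default WizardCoder prompt template, as quoted above.
PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request."
    "\n\n### Instruction:\n{instruction}\n\n### Response:"
)

def build_prompt(instruction: str) -> str:
    """Fill the template with a user instruction before tokenization."""
    return PROMPT_TEMPLATE.format(instruction=instruction)

prompt = build_prompt("Write a Python function that reverses a string.")
```

The resulting string is what should be tokenized and fed to the model verbatim; the model's completion is everything generated after the trailing `### Response:` header.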
+
+
+ ## How to Reproduce the Performance of WizardCoder-33B-V1.1
+
+ We provide all code [here](https://github.com/nlpxucan/WizardLM/tree/main/WizardCoder/src).
+
+ We also provide all generated [results](https://github.com/nlpxucan/WizardLM/blob/main/WizardCoder/data/humaneval_mbpp_wizardcoder33b_v1.1_results.zip).
+
+ ```
+ transformers==4.36.2
+ vllm==0.2.5
+ ```
+
+ (1) HumanEval and HumanEval-Plus
+
+ - Step 1
+
+ Code Generation (w/o accelerate)
+ ```bash
+ model="WizardLM/WizardCoder-33B-V1.1"
+ temp=0.0
+ max_len=2048
+ pred_num=1
+ num_seqs_per_iter=1
+
+ output_path=preds/T${temp}_N${pred_num}_WizardCoder-33B-V1.1_Greedy_Decode
+
+ mkdir -p ${output_path}
+ echo 'Output path: '$output_path
+ echo 'Model to eval: '$model
+
+ # 164 problems, 21 per GPU if GPU=8
+ index=0
+ gpu_num=8
+ for ((i = 0; i < $gpu_num; i++)); do
+   start_index=$((i * 21))
+   end_index=$(((i + 1) * 21))
+
+   gpu=$((i))
+   echo 'Running process #' ${i} 'from' $start_index 'to' $end_index 'on GPU' ${gpu}
+   ((index++))
+   (
+     CUDA_VISIBLE_DEVICES=$gpu python humaneval_gen.py --model ${model} \
+       --start_index ${start_index} --end_index ${end_index} --temperature ${temp} \
+       --num_seqs_per_iter ${num_seqs_per_iter} --N ${pred_num} --max_len ${max_len} --output_path ${output_path} --greedy_decode
+   ) &
+   if (($index % $gpu_num == 0)); then wait; fi
+ done
+ ```
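
The shard arithmetic in the loop (164 problems over 8 GPUs, 21 apiece) can be sketched generically; `shard_ranges` is a hypothetical helper, and unlike the bash above it clips the final shard to the problem count rather than letting the last `end_index` run to 168:

```python
def shard_ranges(total: int, num_shards: int):
    """Split `total` items into contiguous chunks of ceil(total / num_shards),
    clipping the final chunk so no range exceeds `total`."""
    per_shard = -(-total // num_shards)  # ceiling division, e.g. 164 / 8 -> 21
    return [(i * per_shard, min((i + 1) * per_shard, total))
            for i in range(num_shards)]

ranges = shard_ranges(164, 8)  # first shard (0, 21), last shard (147, 164)
```

The same helper covers the MBPP split further down (399 problems, 50 per GPU).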
+
+ Code Generation (w/ vllm accelerate)
+ ```bash
+ model="WizardLM/WizardCoder-33B-V1.1"
+ temp=0.0
+ max_len=2048
+ pred_num=1
+ num_seqs_per_iter=1
+
+ output_path=preds/T${temp}_N${pred_num}_WizardCoder-33B-V1.1_Greedy_Decode_vllm
+
+ mkdir -p ${output_path}
+ echo 'Output path: '$output_path
+ echo 'Model to eval: '$model
+
+ CUDA_VISIBLE_DEVICES=0,1,2,3 python humaneval_gen_vllm.py --model ${model} \
+   --start_index 0 --end_index 164 --temperature ${temp} \
+   --num_seqs_per_iter ${num_seqs_per_iter} --N ${pred_num} --max_len ${max_len} --output_path ${output_path} --num_gpus 4 --overwrite
+ ```
+
+ - Step 2: Get the score
+
+ Install the [Eval-Plus](https://github.com/evalplus/evalplus) benchmark.
+ ```bash
+ git clone https://github.com/evalplus/evalplus.git
+ cd evalplus
+ export PYTHONPATH=$PYTHONPATH:$(pwd)
+ pip install -r requirements.txt
+ ```
+ Get the HumanEval and HumanEval-Plus scores.
+ ```bash
+ output_path=preds/T0.0_N1_WizardCoder-33B-V1.1_Greedy_Decode
+
+ echo 'Output path: '$output_path
+ python process_humaneval.py --path ${output_path} --out_path ${output_path}.jsonl --add_prompt
+
+ evalplus.evaluate --dataset humaneval --samples ${output_path}.jsonl
+ ```
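
The pass@1 numbers reported above use the standard unbiased pass@k estimator from the HumanEval paper (Chen et al., 2021); with greedy decoding, n = 1 sample per problem, so pass@1 reduces to the plain fraction of problems solved. A minimal sketch of the general estimator:

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k for one problem: n samples drawn, c of them correct."""
    if n - c < k:
        # Every size-k subset contains at least one correct sample.
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)
```

Averaging `pass_at_k` over all problems gives the benchmark score; EvalPlus computes this for you.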
+
+ (2) MBPP and MBPP-Plus
+
+ The preprocessed questions are provided in [mbppplus.json](https://github.com/nlpxucan/WizardLM/blob/main/WizardCoder/data/mbppplus.json).
+
+ - Step 1
+
+ Code Generation (w/o accelerate)
+ ```bash
+ model="WizardLM/WizardCoder-33B-V1.1"
+ temp=0.0
+ max_len=2048
+ pred_num=1
+ num_seqs_per_iter=1
+
+ output_path=preds/MBPP_T${temp}_N${pred_num}_WizardCoder-33B-V1.1_Greedy_Decode
+
+ mkdir -p ${output_path}
+ echo 'Output path: '$output_path
+ echo 'Model to eval: '$model
+
+ # 399 problems, 50 per GPU if GPU=8
+ index=0
+ gpu_num=8
+ for ((i = 0; i < $gpu_num; i++)); do
+   start_index=$((i * 50))
+   end_index=$(((i + 1) * 50))
+
+   gpu=$((i))
+   echo 'Running process #' ${i} 'from' $start_index 'to' $end_index 'on GPU' ${gpu}
+   ((index++))
+   (
+     CUDA_VISIBLE_DEVICES=$gpu python mbppplus_gen.py --model ${model} \
+       --start_index ${start_index} --end_index ${end_index} --temperature ${temp} \
+       --num_seqs_per_iter ${num_seqs_per_iter} --N ${pred_num} --max_len ${max_len} --output_path ${output_path} --mbpp_path "mbppplus.json" --greedy_decode
+   ) &
+   if (($index % $gpu_num == 0)); then wait; fi
+ done
+ ```
+
+ Code Generation (w/ vllm accelerate)
+ ```bash
+ model="WizardLM/WizardCoder-33B-V1.1"
+ temp=0.0
+ max_len=2048
+ pred_num=1
+ num_seqs_per_iter=1
+
+ output_path=preds/MBPP_T${temp}_N${pred_num}_WizardCoder-33B-V1.1_Greedy_Decode_vllm
+
+ mkdir -p ${output_path}
+ echo 'Output path: '$output_path
+ echo 'Model to eval: '$model
+
+ CUDA_VISIBLE_DEVICES=0,1,2,3 python mbppplus_gen_vllm.py --model ${model} \
+   --start_index 0 --end_index 399 --temperature ${temp} \
+   --num_seqs_per_iter ${num_seqs_per_iter} --N ${pred_num} --max_len ${max_len} --output_path ${output_path} --mbpp_path "mbppplus.json" --num_gpus 4
+ ```
+
+ - Step 2: Get the score
+
+ Install the [Eval-Plus](https://github.com/evalplus/evalplus) benchmark.
+ ```bash
+ git clone https://github.com/evalplus/evalplus.git
+ cd evalplus
+ export PYTHONPATH=$PYTHONPATH:$(pwd)
+ pip install -r requirements.txt
+ ```
+ Get the MBPP and MBPP-Plus scores.
+ ```bash
+ output_path=preds/MBPP_T0.0_N1_WizardCoder-33B-V1.1_Greedy_Decode
+
+ echo 'Output path: '$output_path
+ python mbppplus_process_preds.py --path ${output_path} --out_path ${output_path}.jsonl --add_prompt
+
+ evalplus.evaluate --dataset mbpp --samples ${output_path}.jsonl
+ ```
+
+
+ ## Citation
+
+ Please cite the repo if you use its data, methods, or code.
+
+ ```
+ @article{luo2023wizardcoder,
+   title={WizardCoder: Empowering Code Large Language Models with Evol-Instruct},
+   author={Luo, Ziyang and Xu, Can and Zhao, Pu and Sun, Qingfeng and Geng, Xiubo and Hu, Wenxiang and Tao, Chongyang and Ma, Jing and Lin, Qingwei and Jiang, Daxin},
+   journal={arXiv preprint arXiv:2306.08568},
+   year={2023}
+ }
+ ```
config.json ADDED
@@ -0,0 +1,30 @@
+ {
+   "architectures": [
+     "LlamaForCausalLM"
+   ],
+   "attention_bias": false,
+   "attention_dropout": 0.0,
+   "bos_token_id": 32013,
+   "eos_token_id": 32014,
+   "hidden_act": "silu",
+   "hidden_size": 7168,
+   "initializer_range": 0.02,
+   "intermediate_size": 19200,
+   "max_position_embeddings": 16384,
+   "model_type": "llama",
+   "num_attention_heads": 56,
+   "num_hidden_layers": 62,
+   "num_key_value_heads": 8,
+   "pretraining_tp": 1,
+   "rms_norm_eps": 1e-06,
+   "rope_scaling": {
+     "factor": 4.0,
+     "type": "linear"
+   },
+   "rope_theta": 100000,
+   "tie_word_embeddings": false,
+   "torch_dtype": "bfloat16",
+   "transformers_version": "4.36.2",
+   "use_cache": false,
+   "vocab_size": 32256
+ }
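
A few shape facts follow directly from the config above: `num_key_value_heads` (8) being smaller than `num_attention_heads` (56) means the model uses grouped-query attention, with each KV head shared by a group of query heads. A small sketch of the arithmetic:

```python
# Values copied from the config.json above.
hidden_size = 7168
num_attention_heads = 56
num_key_value_heads = 8

head_dim = hidden_size // num_attention_heads                # per-head dimension
queries_per_kv = num_attention_heads // num_key_value_heads  # GQA group size
```

This yields a 128-dim head and 7 query heads per KV head, which shrinks the KV cache by 7x versus full multi-head attention at the same hidden size.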
generation_config.json ADDED
@@ -0,0 +1,6 @@
+ {
+   "_from_model_config": true,
+   "bos_token_id": 32013,
+   "eos_token_id": 32014,
+   "transformers_version": "4.36.2"
+ }
model.safetensors.index.json ADDED
@@ -0,0 +1,568 @@
1
+ {
2
+ "metadata": {
3
+ "total_size": 66685982720
4
+ },
5
+ "weight_map": {
6
+ "lm_head.weight": "model-00014-of-00014.safetensors",
7
+ "model.embed_tokens.weight": "model-00001-of-00014.safetensors",
8
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00014.safetensors",
9
+ "model.layers.0.mlp.down_proj.weight": "model-00001-of-00014.safetensors",
10
+ "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00014.safetensors",
11
+ "model.layers.0.mlp.up_proj.weight": "model-00001-of-00014.safetensors",
12
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00014.safetensors",
13
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00014.safetensors",
14
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00014.safetensors",
15
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00014.safetensors",
16
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00014.safetensors",
17
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00014.safetensors",
18
+ "model.layers.1.mlp.down_proj.weight": "model-00001-of-00014.safetensors",
19
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00014.safetensors",
20
+ "model.layers.1.mlp.up_proj.weight": "model-00001-of-00014.safetensors",
21
+ "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00014.safetensors",
22
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00014.safetensors",
23
+ "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00014.safetensors",
24
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00014.safetensors",
25
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00014.safetensors",
26
+ "model.layers.10.input_layernorm.weight": "model-00003-of-00014.safetensors",
27
+ "model.layers.10.mlp.down_proj.weight": "model-00003-of-00014.safetensors",
28
+ "model.layers.10.mlp.gate_proj.weight": "model-00003-of-00014.safetensors",
29
+ "model.layers.10.mlp.up_proj.weight": "model-00003-of-00014.safetensors",
30
+ "model.layers.10.post_attention_layernorm.weight": "model-00003-of-00014.safetensors",
31
+ "model.layers.10.self_attn.k_proj.weight": "model-00003-of-00014.safetensors",
32
+ "model.layers.10.self_attn.o_proj.weight": "model-00003-of-00014.safetensors",
33
+ "model.layers.10.self_attn.q_proj.weight": "model-00003-of-00014.safetensors",
34
+ "model.layers.10.self_attn.v_proj.weight": "model-00003-of-00014.safetensors",
35
+ "model.layers.11.input_layernorm.weight": "model-00003-of-00014.safetensors",
36
+ "model.layers.11.mlp.down_proj.weight": "model-00003-of-00014.safetensors",
37
+ "model.layers.11.mlp.gate_proj.weight": "model-00003-of-00014.safetensors",
38
+ "model.layers.11.mlp.up_proj.weight": "model-00003-of-00014.safetensors",
39
+ "model.layers.11.post_attention_layernorm.weight": "model-00003-of-00014.safetensors",
40
+ "model.layers.11.self_attn.k_proj.weight": "model-00003-of-00014.safetensors",
41
+ "model.layers.11.self_attn.o_proj.weight": "model-00003-of-00014.safetensors",
42
+ "model.layers.11.self_attn.q_proj.weight": "model-00003-of-00014.safetensors",
43
+ "model.layers.11.self_attn.v_proj.weight": "model-00003-of-00014.safetensors",
44
+ "model.layers.12.input_layernorm.weight": "model-00003-of-00014.safetensors",
45
+ "model.layers.12.mlp.down_proj.weight": "model-00003-of-00014.safetensors",
46
+ "model.layers.12.mlp.gate_proj.weight": "model-00003-of-00014.safetensors",
47
+ "model.layers.12.mlp.up_proj.weight": "model-00003-of-00014.safetensors",
48
+ "model.layers.12.post_attention_layernorm.weight": "model-00003-of-00014.safetensors",
49
+ "model.layers.12.self_attn.k_proj.weight": "model-00003-of-00014.safetensors",
50
+ "model.layers.12.self_attn.o_proj.weight": "model-00003-of-00014.safetensors",
51
+ "model.layers.12.self_attn.q_proj.weight": "model-00003-of-00014.safetensors",
52
+ "model.layers.12.self_attn.v_proj.weight": "model-00003-of-00014.safetensors",
53
+ "model.layers.13.input_layernorm.weight": "model-00004-of-00014.safetensors",
54
+ "model.layers.13.mlp.down_proj.weight": "model-00004-of-00014.safetensors",
55
+ "model.layers.13.mlp.gate_proj.weight": "model-00004-of-00014.safetensors",
56
+ "model.layers.13.mlp.up_proj.weight": "model-00004-of-00014.safetensors",
57
+ "model.layers.13.post_attention_layernorm.weight": "model-00004-of-00014.safetensors",
58
+ "model.layers.13.self_attn.k_proj.weight": "model-00003-of-00014.safetensors",
59
+ "model.layers.13.self_attn.o_proj.weight": "model-00003-of-00014.safetensors",
60
+ "model.layers.13.self_attn.q_proj.weight": "model-00003-of-00014.safetensors",
61
+ "model.layers.13.self_attn.v_proj.weight": "model-00003-of-00014.safetensors",
62
+ "model.layers.14.input_layernorm.weight": "model-00004-of-00014.safetensors",
63
+ "model.layers.14.mlp.down_proj.weight": "model-00004-of-00014.safetensors",
64
+ "model.layers.14.mlp.gate_proj.weight": "model-00004-of-00014.safetensors",
65
+ "model.layers.14.mlp.up_proj.weight": "model-00004-of-00014.safetensors",
66
+ "model.layers.14.post_attention_layernorm.weight": "model-00004-of-00014.safetensors",
+ "model.layers.14.self_attn.k_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.14.self_attn.o_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.14.self_attn.q_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.14.self_attn.v_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.15.input_layernorm.weight": "model-00004-of-00014.safetensors",
+ "model.layers.15.mlp.down_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.15.mlp.gate_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.15.mlp.up_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.15.post_attention_layernorm.weight": "model-00004-of-00014.safetensors",
+ "model.layers.15.self_attn.k_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.15.self_attn.o_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.15.self_attn.q_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.15.self_attn.v_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.16.input_layernorm.weight": "model-00004-of-00014.safetensors",
+ "model.layers.16.mlp.down_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.16.mlp.gate_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.16.mlp.up_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.16.post_attention_layernorm.weight": "model-00004-of-00014.safetensors",
+ "model.layers.16.self_attn.k_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.16.self_attn.o_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.16.self_attn.q_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.16.self_attn.v_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.17.input_layernorm.weight": "model-00005-of-00014.safetensors",
+ "model.layers.17.mlp.down_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.17.mlp.gate_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.17.mlp.up_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.17.post_attention_layernorm.weight": "model-00005-of-00014.safetensors",
+ "model.layers.17.self_attn.k_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.17.self_attn.o_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.17.self_attn.q_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.17.self_attn.v_proj.weight": "model-00004-of-00014.safetensors",
+ "model.layers.18.input_layernorm.weight": "model-00005-of-00014.safetensors",
+ "model.layers.18.mlp.down_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.18.mlp.gate_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.18.mlp.up_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.18.post_attention_layernorm.weight": "model-00005-of-00014.safetensors",
+ "model.layers.18.self_attn.k_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.18.self_attn.o_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.18.self_attn.q_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.18.self_attn.v_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.19.input_layernorm.weight": "model-00005-of-00014.safetensors",
+ "model.layers.19.mlp.down_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.19.mlp.gate_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.19.mlp.up_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.19.post_attention_layernorm.weight": "model-00005-of-00014.safetensors",
+ "model.layers.19.self_attn.k_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.19.self_attn.o_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.19.self_attn.q_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.19.self_attn.v_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.2.input_layernorm.weight": "model-00001-of-00014.safetensors",
+ "model.layers.2.mlp.down_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.2.mlp.up_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00014.safetensors",
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.20.input_layernorm.weight": "model-00005-of-00014.safetensors",
+ "model.layers.20.mlp.down_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.20.mlp.gate_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.20.mlp.up_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.20.post_attention_layernorm.weight": "model-00005-of-00014.safetensors",
+ "model.layers.20.self_attn.k_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.20.self_attn.o_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.20.self_attn.q_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.20.self_attn.v_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.21.input_layernorm.weight": "model-00005-of-00014.safetensors",
+ "model.layers.21.mlp.down_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.21.mlp.gate_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.21.mlp.up_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.21.post_attention_layernorm.weight": "model-00005-of-00014.safetensors",
+ "model.layers.21.self_attn.k_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.21.self_attn.o_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.21.self_attn.q_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.21.self_attn.v_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.22.input_layernorm.weight": "model-00006-of-00014.safetensors",
+ "model.layers.22.mlp.down_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.22.mlp.gate_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.22.mlp.up_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.22.post_attention_layernorm.weight": "model-00006-of-00014.safetensors",
+ "model.layers.22.self_attn.k_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.22.self_attn.o_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.22.self_attn.q_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.22.self_attn.v_proj.weight": "model-00005-of-00014.safetensors",
+ "model.layers.23.input_layernorm.weight": "model-00006-of-00014.safetensors",
+ "model.layers.23.mlp.down_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.23.mlp.gate_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.23.mlp.up_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.23.post_attention_layernorm.weight": "model-00006-of-00014.safetensors",
+ "model.layers.23.self_attn.k_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.23.self_attn.o_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.23.self_attn.q_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.23.self_attn.v_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.24.input_layernorm.weight": "model-00006-of-00014.safetensors",
+ "model.layers.24.mlp.down_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.24.mlp.gate_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.24.mlp.up_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.24.post_attention_layernorm.weight": "model-00006-of-00014.safetensors",
+ "model.layers.24.self_attn.k_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.24.self_attn.o_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.24.self_attn.q_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.24.self_attn.v_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.25.input_layernorm.weight": "model-00006-of-00014.safetensors",
+ "model.layers.25.mlp.down_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.25.mlp.gate_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.25.mlp.up_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.25.post_attention_layernorm.weight": "model-00006-of-00014.safetensors",
+ "model.layers.25.self_attn.k_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.25.self_attn.o_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.25.self_attn.q_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.25.self_attn.v_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.26.input_layernorm.weight": "model-00007-of-00014.safetensors",
+ "model.layers.26.mlp.down_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.26.mlp.gate_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.26.mlp.up_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.26.post_attention_layernorm.weight": "model-00007-of-00014.safetensors",
+ "model.layers.26.self_attn.k_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.26.self_attn.o_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.26.self_attn.q_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.26.self_attn.v_proj.weight": "model-00006-of-00014.safetensors",
+ "model.layers.27.input_layernorm.weight": "model-00007-of-00014.safetensors",
+ "model.layers.27.mlp.down_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.27.mlp.gate_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.27.mlp.up_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.27.post_attention_layernorm.weight": "model-00007-of-00014.safetensors",
+ "model.layers.27.self_attn.k_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.27.self_attn.o_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.27.self_attn.q_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.27.self_attn.v_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.28.input_layernorm.weight": "model-00007-of-00014.safetensors",
+ "model.layers.28.mlp.down_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.28.mlp.gate_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.28.mlp.up_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.28.post_attention_layernorm.weight": "model-00007-of-00014.safetensors",
+ "model.layers.28.self_attn.k_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.28.self_attn.o_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.28.self_attn.q_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.28.self_attn.v_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.29.input_layernorm.weight": "model-00007-of-00014.safetensors",
+ "model.layers.29.mlp.down_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.29.mlp.gate_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.29.mlp.up_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.29.post_attention_layernorm.weight": "model-00007-of-00014.safetensors",
+ "model.layers.29.self_attn.k_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.29.self_attn.o_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.29.self_attn.q_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.29.self_attn.v_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00014.safetensors",
+ "model.layers.3.mlp.down_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.3.mlp.up_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00014.safetensors",
+ "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.30.input_layernorm.weight": "model-00007-of-00014.safetensors",
+ "model.layers.30.mlp.down_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.30.mlp.gate_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.30.mlp.up_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.30.post_attention_layernorm.weight": "model-00007-of-00014.safetensors",
+ "model.layers.30.self_attn.k_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.30.self_attn.o_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.30.self_attn.q_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.30.self_attn.v_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.31.input_layernorm.weight": "model-00008-of-00014.safetensors",
+ "model.layers.31.mlp.down_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.31.mlp.gate_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.31.mlp.up_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.31.post_attention_layernorm.weight": "model-00008-of-00014.safetensors",
+ "model.layers.31.self_attn.k_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.31.self_attn.o_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.31.self_attn.q_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.31.self_attn.v_proj.weight": "model-00007-of-00014.safetensors",
+ "model.layers.32.input_layernorm.weight": "model-00008-of-00014.safetensors",
+ "model.layers.32.mlp.down_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.32.mlp.gate_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.32.mlp.up_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.32.post_attention_layernorm.weight": "model-00008-of-00014.safetensors",
+ "model.layers.32.self_attn.k_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.32.self_attn.o_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.32.self_attn.q_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.32.self_attn.v_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.33.input_layernorm.weight": "model-00008-of-00014.safetensors",
+ "model.layers.33.mlp.down_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.33.mlp.gate_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.33.mlp.up_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.33.post_attention_layernorm.weight": "model-00008-of-00014.safetensors",
+ "model.layers.33.self_attn.k_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.33.self_attn.o_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.33.self_attn.q_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.33.self_attn.v_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.34.input_layernorm.weight": "model-00008-of-00014.safetensors",
+ "model.layers.34.mlp.down_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.34.mlp.gate_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.34.mlp.up_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.34.post_attention_layernorm.weight": "model-00008-of-00014.safetensors",
+ "model.layers.34.self_attn.k_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.34.self_attn.o_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.34.self_attn.q_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.34.self_attn.v_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.35.input_layernorm.weight": "model-00009-of-00014.safetensors",
+ "model.layers.35.mlp.down_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.35.mlp.gate_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.35.mlp.up_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.35.post_attention_layernorm.weight": "model-00009-of-00014.safetensors",
+ "model.layers.35.self_attn.k_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.35.self_attn.o_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.35.self_attn.q_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.35.self_attn.v_proj.weight": "model-00008-of-00014.safetensors",
+ "model.layers.36.input_layernorm.weight": "model-00009-of-00014.safetensors",
+ "model.layers.36.mlp.down_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.36.mlp.gate_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.36.mlp.up_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.36.post_attention_layernorm.weight": "model-00009-of-00014.safetensors",
+ "model.layers.36.self_attn.k_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.36.self_attn.o_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.36.self_attn.q_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.36.self_attn.v_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.37.input_layernorm.weight": "model-00009-of-00014.safetensors",
+ "model.layers.37.mlp.down_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.37.mlp.gate_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.37.mlp.up_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.37.post_attention_layernorm.weight": "model-00009-of-00014.safetensors",
+ "model.layers.37.self_attn.k_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.37.self_attn.o_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.37.self_attn.q_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.37.self_attn.v_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.38.input_layernorm.weight": "model-00009-of-00014.safetensors",
+ "model.layers.38.mlp.down_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.38.mlp.gate_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.38.mlp.up_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.38.post_attention_layernorm.weight": "model-00009-of-00014.safetensors",
+ "model.layers.38.self_attn.k_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.38.self_attn.o_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.38.self_attn.q_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.38.self_attn.v_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.39.input_layernorm.weight": "model-00009-of-00014.safetensors",
+ "model.layers.39.mlp.down_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.39.mlp.gate_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.39.mlp.up_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.39.post_attention_layernorm.weight": "model-00009-of-00014.safetensors",
+ "model.layers.39.self_attn.k_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.39.self_attn.o_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.39.self_attn.q_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.39.self_attn.v_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.4.input_layernorm.weight": "model-00002-of-00014.safetensors",
+ "model.layers.4.mlp.down_proj.weight": "model-00002-of-00014.safetensors",
+ "model.layers.4.mlp.gate_proj.weight": "model-00002-of-00014.safetensors",
+ "model.layers.4.mlp.up_proj.weight": "model-00002-of-00014.safetensors",
+ "model.layers.4.post_attention_layernorm.weight": "model-00002-of-00014.safetensors",
+ "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00014.safetensors",
+ "model.layers.40.input_layernorm.weight": "model-00010-of-00014.safetensors",
+ "model.layers.40.mlp.down_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.40.mlp.gate_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.40.mlp.up_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.40.post_attention_layernorm.weight": "model-00010-of-00014.safetensors",
+ "model.layers.40.self_attn.k_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.40.self_attn.o_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.40.self_attn.q_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.40.self_attn.v_proj.weight": "model-00009-of-00014.safetensors",
+ "model.layers.41.input_layernorm.weight": "model-00010-of-00014.safetensors",
+ "model.layers.41.mlp.down_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.41.mlp.gate_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.41.mlp.up_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.41.post_attention_layernorm.weight": "model-00010-of-00014.safetensors",
+ "model.layers.41.self_attn.k_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.41.self_attn.o_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.41.self_attn.q_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.41.self_attn.v_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.42.input_layernorm.weight": "model-00010-of-00014.safetensors",
+ "model.layers.42.mlp.down_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.42.mlp.gate_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.42.mlp.up_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.42.post_attention_layernorm.weight": "model-00010-of-00014.safetensors",
+ "model.layers.42.self_attn.k_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.42.self_attn.o_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.42.self_attn.q_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.42.self_attn.v_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.43.input_layernorm.weight": "model-00010-of-00014.safetensors",
+ "model.layers.43.mlp.down_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.43.mlp.gate_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.43.mlp.up_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.43.post_attention_layernorm.weight": "model-00010-of-00014.safetensors",
+ "model.layers.43.self_attn.k_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.43.self_attn.o_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.43.self_attn.q_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.43.self_attn.v_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.44.input_layernorm.weight": "model-00011-of-00014.safetensors",
+ "model.layers.44.mlp.down_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.44.mlp.gate_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.44.mlp.up_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.44.post_attention_layernorm.weight": "model-00011-of-00014.safetensors",
+ "model.layers.44.self_attn.k_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.44.self_attn.o_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.44.self_attn.q_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.44.self_attn.v_proj.weight": "model-00010-of-00014.safetensors",
+ "model.layers.45.input_layernorm.weight": "model-00011-of-00014.safetensors",
+ "model.layers.45.mlp.down_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.45.mlp.gate_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.45.mlp.up_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.45.post_attention_layernorm.weight": "model-00011-of-00014.safetensors",
+ "model.layers.45.self_attn.k_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.45.self_attn.o_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.45.self_attn.q_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.45.self_attn.v_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.46.input_layernorm.weight": "model-00011-of-00014.safetensors",
+ "model.layers.46.mlp.down_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.46.mlp.gate_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.46.mlp.up_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.46.post_attention_layernorm.weight": "model-00011-of-00014.safetensors",
+ "model.layers.46.self_attn.k_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.46.self_attn.o_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.46.self_attn.q_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.46.self_attn.v_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.47.input_layernorm.weight": "model-00011-of-00014.safetensors",
+ "model.layers.47.mlp.down_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.47.mlp.gate_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.47.mlp.up_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.47.post_attention_layernorm.weight": "model-00011-of-00014.safetensors",
+ "model.layers.47.self_attn.k_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.47.self_attn.o_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.47.self_attn.q_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.47.self_attn.v_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.48.input_layernorm.weight": "model-00011-of-00014.safetensors",
+ "model.layers.48.mlp.down_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.48.mlp.gate_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.48.mlp.up_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.48.post_attention_layernorm.weight": "model-00011-of-00014.safetensors",
+ "model.layers.48.self_attn.k_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.48.self_attn.o_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.48.self_attn.q_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.48.self_attn.v_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.49.input_layernorm.weight": "model-00012-of-00014.safetensors",
+ "model.layers.49.mlp.down_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.49.mlp.gate_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.49.mlp.up_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.49.post_attention_layernorm.weight": "model-00012-of-00014.safetensors",
+ "model.layers.49.self_attn.k_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.49.self_attn.o_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.49.self_attn.q_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.49.self_attn.v_proj.weight": "model-00011-of-00014.safetensors",
+ "model.layers.5.input_layernorm.weight": "model-00002-of-00014.safetensors",
+ "model.layers.5.mlp.down_proj.weight": "model-00002-of-00014.safetensors",
+ "model.layers.5.mlp.gate_proj.weight": "model-00002-of-00014.safetensors",
+ "model.layers.5.mlp.up_proj.weight": "model-00002-of-00014.safetensors",
+ "model.layers.5.post_attention_layernorm.weight": "model-00002-of-00014.safetensors",
+ "model.layers.5.self_attn.k_proj.weight": "model-00002-of-00014.safetensors",
+ "model.layers.5.self_attn.o_proj.weight": "model-00002-of-00014.safetensors",
+ "model.layers.5.self_attn.q_proj.weight": "model-00002-of-00014.safetensors",
+ "model.layers.5.self_attn.v_proj.weight": "model-00002-of-00014.safetensors",
+ "model.layers.50.input_layernorm.weight": "model-00012-of-00014.safetensors",
+ "model.layers.50.mlp.down_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.50.mlp.gate_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.50.mlp.up_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.50.post_attention_layernorm.weight": "model-00012-of-00014.safetensors",
+ "model.layers.50.self_attn.k_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.50.self_attn.o_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.50.self_attn.q_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.50.self_attn.v_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.51.input_layernorm.weight": "model-00012-of-00014.safetensors",
+ "model.layers.51.mlp.down_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.51.mlp.gate_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.51.mlp.up_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.51.post_attention_layernorm.weight": "model-00012-of-00014.safetensors",
+ "model.layers.51.self_attn.k_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.51.self_attn.o_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.51.self_attn.q_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.51.self_attn.v_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.52.input_layernorm.weight": "model-00012-of-00014.safetensors",
+ "model.layers.52.mlp.down_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.52.mlp.gate_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.52.mlp.up_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.52.post_attention_layernorm.weight": "model-00012-of-00014.safetensors",
+ "model.layers.52.self_attn.k_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.52.self_attn.o_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.52.self_attn.q_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.52.self_attn.v_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.53.input_layernorm.weight": "model-00013-of-00014.safetensors",
+ "model.layers.53.mlp.down_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.53.mlp.gate_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.53.mlp.up_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.53.post_attention_layernorm.weight": "model-00013-of-00014.safetensors",
+ "model.layers.53.self_attn.k_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.53.self_attn.o_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.53.self_attn.q_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.53.self_attn.v_proj.weight": "model-00012-of-00014.safetensors",
+ "model.layers.54.input_layernorm.weight": "model-00013-of-00014.safetensors",
+ "model.layers.54.mlp.down_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.54.mlp.gate_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.54.mlp.up_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.54.post_attention_layernorm.weight": "model-00013-of-00014.safetensors",
+ "model.layers.54.self_attn.k_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.54.self_attn.o_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.54.self_attn.q_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.54.self_attn.v_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.55.input_layernorm.weight": "model-00013-of-00014.safetensors",
+ "model.layers.55.mlp.down_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.55.mlp.gate_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.55.mlp.up_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.55.post_attention_layernorm.weight": "model-00013-of-00014.safetensors",
+ "model.layers.55.self_attn.k_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.55.self_attn.o_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.55.self_attn.q_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.55.self_attn.v_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.56.input_layernorm.weight": "model-00013-of-00014.safetensors",
+ "model.layers.56.mlp.down_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.56.mlp.gate_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.56.mlp.up_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.56.post_attention_layernorm.weight": "model-00013-of-00014.safetensors",
+ "model.layers.56.self_attn.k_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.56.self_attn.o_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.56.self_attn.q_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.56.self_attn.v_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.57.input_layernorm.weight": "model-00013-of-00014.safetensors",
+ "model.layers.57.mlp.down_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.57.mlp.gate_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.57.mlp.up_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.57.post_attention_layernorm.weight": "model-00013-of-00014.safetensors",
+ "model.layers.57.self_attn.k_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.57.self_attn.o_proj.weight": "model-00013-of-00014.safetensors",
+ "model.layers.57.self_attn.q_proj.weight": "model-00013-of-00014.safetensors",
493
+ "model.layers.57.self_attn.v_proj.weight": "model-00013-of-00014.safetensors",
494
+ "model.layers.58.input_layernorm.weight": "model-00014-of-00014.safetensors",
495
+ "model.layers.58.mlp.down_proj.weight": "model-00014-of-00014.safetensors",
496
+ "model.layers.58.mlp.gate_proj.weight": "model-00014-of-00014.safetensors",
497
+ "model.layers.58.mlp.up_proj.weight": "model-00014-of-00014.safetensors",
498
+ "model.layers.58.post_attention_layernorm.weight": "model-00014-of-00014.safetensors",
499
+ "model.layers.58.self_attn.k_proj.weight": "model-00013-of-00014.safetensors",
500
+ "model.layers.58.self_attn.o_proj.weight": "model-00013-of-00014.safetensors",
501
+ "model.layers.58.self_attn.q_proj.weight": "model-00013-of-00014.safetensors",
502
+ "model.layers.58.self_attn.v_proj.weight": "model-00013-of-00014.safetensors",
503
+ "model.layers.59.input_layernorm.weight": "model-00014-of-00014.safetensors",
504
+ "model.layers.59.mlp.down_proj.weight": "model-00014-of-00014.safetensors",
505
+ "model.layers.59.mlp.gate_proj.weight": "model-00014-of-00014.safetensors",
506
+ "model.layers.59.mlp.up_proj.weight": "model-00014-of-00014.safetensors",
507
+ "model.layers.59.post_attention_layernorm.weight": "model-00014-of-00014.safetensors",
508
+ "model.layers.59.self_attn.k_proj.weight": "model-00014-of-00014.safetensors",
509
+ "model.layers.59.self_attn.o_proj.weight": "model-00014-of-00014.safetensors",
510
+ "model.layers.59.self_attn.q_proj.weight": "model-00014-of-00014.safetensors",
511
+ "model.layers.59.self_attn.v_proj.weight": "model-00014-of-00014.safetensors",
512
+ "model.layers.6.input_layernorm.weight": "model-00002-of-00014.safetensors",
513
+ "model.layers.6.mlp.down_proj.weight": "model-00002-of-00014.safetensors",
514
+ "model.layers.6.mlp.gate_proj.weight": "model-00002-of-00014.safetensors",
515
+ "model.layers.6.mlp.up_proj.weight": "model-00002-of-00014.safetensors",
516
+ "model.layers.6.post_attention_layernorm.weight": "model-00002-of-00014.safetensors",
517
+ "model.layers.6.self_attn.k_proj.weight": "model-00002-of-00014.safetensors",
518
+ "model.layers.6.self_attn.o_proj.weight": "model-00002-of-00014.safetensors",
519
+ "model.layers.6.self_attn.q_proj.weight": "model-00002-of-00014.safetensors",
520
+ "model.layers.6.self_attn.v_proj.weight": "model-00002-of-00014.safetensors",
521
+ "model.layers.60.input_layernorm.weight": "model-00014-of-00014.safetensors",
522
+ "model.layers.60.mlp.down_proj.weight": "model-00014-of-00014.safetensors",
523
+ "model.layers.60.mlp.gate_proj.weight": "model-00014-of-00014.safetensors",
524
+ "model.layers.60.mlp.up_proj.weight": "model-00014-of-00014.safetensors",
525
+ "model.layers.60.post_attention_layernorm.weight": "model-00014-of-00014.safetensors",
526
+ "model.layers.60.self_attn.k_proj.weight": "model-00014-of-00014.safetensors",
527
+ "model.layers.60.self_attn.o_proj.weight": "model-00014-of-00014.safetensors",
528
+ "model.layers.60.self_attn.q_proj.weight": "model-00014-of-00014.safetensors",
529
+ "model.layers.60.self_attn.v_proj.weight": "model-00014-of-00014.safetensors",
530
+ "model.layers.61.input_layernorm.weight": "model-00014-of-00014.safetensors",
531
+ "model.layers.61.mlp.down_proj.weight": "model-00014-of-00014.safetensors",
532
+ "model.layers.61.mlp.gate_proj.weight": "model-00014-of-00014.safetensors",
533
+ "model.layers.61.mlp.up_proj.weight": "model-00014-of-00014.safetensors",
534
+ "model.layers.61.post_attention_layernorm.weight": "model-00014-of-00014.safetensors",
535
+ "model.layers.61.self_attn.k_proj.weight": "model-00014-of-00014.safetensors",
536
+ "model.layers.61.self_attn.o_proj.weight": "model-00014-of-00014.safetensors",
537
+ "model.layers.61.self_attn.q_proj.weight": "model-00014-of-00014.safetensors",
538
+ "model.layers.61.self_attn.v_proj.weight": "model-00014-of-00014.safetensors",
539
+ "model.layers.7.input_layernorm.weight": "model-00002-of-00014.safetensors",
540
+ "model.layers.7.mlp.down_proj.weight": "model-00002-of-00014.safetensors",
541
+ "model.layers.7.mlp.gate_proj.weight": "model-00002-of-00014.safetensors",
542
+ "model.layers.7.mlp.up_proj.weight": "model-00002-of-00014.safetensors",
543
+ "model.layers.7.post_attention_layernorm.weight": "model-00002-of-00014.safetensors",
544
+ "model.layers.7.self_attn.k_proj.weight": "model-00002-of-00014.safetensors",
545
+ "model.layers.7.self_attn.o_proj.weight": "model-00002-of-00014.safetensors",
546
+ "model.layers.7.self_attn.q_proj.weight": "model-00002-of-00014.safetensors",
547
+ "model.layers.7.self_attn.v_proj.weight": "model-00002-of-00014.safetensors",
548
+ "model.layers.8.input_layernorm.weight": "model-00003-of-00014.safetensors",
549
+ "model.layers.8.mlp.down_proj.weight": "model-00003-of-00014.safetensors",
550
+ "model.layers.8.mlp.gate_proj.weight": "model-00002-of-00014.safetensors",
551
+ "model.layers.8.mlp.up_proj.weight": "model-00002-of-00014.safetensors",
552
+ "model.layers.8.post_attention_layernorm.weight": "model-00003-of-00014.safetensors",
553
+ "model.layers.8.self_attn.k_proj.weight": "model-00002-of-00014.safetensors",
554
+ "model.layers.8.self_attn.o_proj.weight": "model-00002-of-00014.safetensors",
555
+ "model.layers.8.self_attn.q_proj.weight": "model-00002-of-00014.safetensors",
556
+ "model.layers.8.self_attn.v_proj.weight": "model-00002-of-00014.safetensors",
557
+ "model.layers.9.input_layernorm.weight": "model-00003-of-00014.safetensors",
558
+ "model.layers.9.mlp.down_proj.weight": "model-00003-of-00014.safetensors",
559
+ "model.layers.9.mlp.gate_proj.weight": "model-00003-of-00014.safetensors",
560
+ "model.layers.9.mlp.up_proj.weight": "model-00003-of-00014.safetensors",
561
+ "model.layers.9.post_attention_layernorm.weight": "model-00003-of-00014.safetensors",
562
+ "model.layers.9.self_attn.k_proj.weight": "model-00003-of-00014.safetensors",
563
+ "model.layers.9.self_attn.o_proj.weight": "model-00003-of-00014.safetensors",
564
+ "model.layers.9.self_attn.q_proj.weight": "model-00003-of-00014.safetensors",
565
+ "model.layers.9.self_attn.v_proj.weight": "model-00003-of-00014.safetensors",
566
+ "model.norm.weight": "model-00014-of-00014.safetensors"
567
+ }
568
+ }
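The `weight_map` above (from the model's `model.safetensors.index.json`) routes every tensor name to the shard file that stores it. A minimal sketch of how a loader can use such an index, with a trimmed-down index dict reusing entries from the map above; `shard_for` and `tensors_by_shard` are illustrative helpers, not a real library API:

```python
import json

# Trimmed-down index in the standard safetensors index layout, reusing
# three entries from the weight_map above (assumption: this layout).
index_json = """
{
  "metadata": {"total_size": 0},
  "weight_map": {
    "model.layers.53.mlp.gate_proj.weight": "model-00012-of-00014.safetensors",
    "model.layers.53.mlp.down_proj.weight": "model-00013-of-00014.safetensors",
    "model.norm.weight": "model-00014-of-00014.safetensors"
  }
}
"""

def shard_for(index: dict, tensor_name: str) -> str:
    """Return the shard filename that stores `tensor_name`."""
    return index["weight_map"][tensor_name]

def tensors_by_shard(index: dict) -> dict:
    """Group tensor names by shard so each shard file is opened once
    and all of its tensors are read together."""
    groups: dict = {}
    for name, shard in index["weight_map"].items():
        groups.setdefault(shard, []).append(name)
    return groups

index = json.loads(index_json)
print(shard_for(index, "model.norm.weight"))  # model-00014-of-00014.safetensors
```

Grouping by shard is why neighboring layers above tend to share a file: the converter fills one ~8 GB shard before starting the next.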
output-00001-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b3216a320248c1924b545bca4c917a221c940333ad2dc009cc73b4a3cccb2da8
+ size 8588800098
output-00002-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dd460e7d8a4403a5c2e5414346968107c28455bc4cb8c5604e31b5e06ea50ad9
+ size 8575479736
output-00003-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:36794f75de7c88438ccb58f496701c524c55b739e198c7309e9ee14105fb26c3
+ size 4046979814
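The three `.safetensors` files above are stored as git-LFS pointer files: three `key value` lines giving the spec version, the blob's sha256, and its byte size. A minimal sketch of parsing such a pointer and checking a downloaded blob against it (the pointer text reuses the first shard's `oid` and `size`; the helper names are illustrative):

```python
import hashlib

# git-LFS pointer text, copied from the first output shard above.
pointer_text = """version https://git-lfs.github.com/spec/v1
oid sha256:b3216a320248c1924b545bca4c917a221c940333ad2dc009cc73b4a3cccb2da8
size 8588800098
"""

def parse_pointer(text: str) -> dict:
    """Split each non-empty line at the first space into key/value."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

def blob_matches(pointer: dict, blob: bytes) -> bool:
    """True if the blob's length and sha256 digest match the pointer."""
    algo, _, digest = pointer["oid"].partition(":")
    assert algo == "sha256"
    return (len(blob) == int(pointer["size"])
            and hashlib.sha256(blob).hexdigest() == digest)

ptr = parse_pointer(pointer_text)
print(ptr["size"])  # 8588800098
```

Checking the length first makes the common failure mode (a truncated download) cheap to detect before hashing ~8 GB.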
special_tokens_map.json ADDED
@@ -0,0 +1,30 @@
+ {
+ "bos_token": {
+ "content": "<|begin▁of▁sentence|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|end▁of▁sentence|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<|begin▁of▁sentence|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,201 @@
+ {
+ "add_bos_token": true,
+ "add_eos_token": false,
+ "added_tokens_decoder": {
+ "32000": {
+ "content": "õ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32001": {
+ "content": "÷",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32002": {
+ "content": "Á",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32003": {
+ "content": "ý",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32004": {
+ "content": "À",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32005": {
+ "content": "ÿ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32006": {
+ "content": "ø",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32007": {
+ "content": "ú",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32008": {
+ "content": "þ",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32009": {
+ "content": "ü",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32010": {
+ "content": "ù",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32011": {
+ "content": "ö",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32012": {
+ "content": "û",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32013": {
+ "content": "<|begin▁of▁sentence|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "32014": {
+ "content": "<|end▁of▁sentence|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "32015": {
+ "content": "<|fim▁hole|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32016": {
+ "content": "<|fim▁begin|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32017": {
+ "content": "<|fim▁end|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32018": {
+ "content": "<pad>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32019": {
+ "content": "<|User|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32020": {
+ "content": "<|Assistant|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32021": {
+ "content": "<|EOT|>",
+ "lstrip": false,
+ "normalized": true,
+ "rstrip": false,
+ "single_word": false,
+ "special": false
+ },
+ "32022": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<|begin▁of▁sentence|>",
+ "clean_up_tokenization_spaces": false,
+ "eos_token": "<|end▁of▁sentence|>",
+ "legacy": true,
+ "model_max_length": 16384,
+ "pad_token": "<|begin▁of▁sentence|>",
+ "padding_side": "right",
+ "sp_model_kwargs": {},
+ "tokenizer_class": "LlamaTokenizer",
+ "unk_token": "<unk>",
+ "use_default_system_prompt": false
+ }
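In the `added_tokens_decoder` above, each id maps to a token entry, and `"special": true` marks tokens (like the BOS/EOS markers) that a decoder drops when `skip_special_tokens` is set. A minimal sketch of that behavior using a few ids and contents taken from the config above (`decode_added` is an illustrative helper, not the tokenizer's real API):

```python
# Subset of the added_tokens_decoder above: id -> (content, special flag).
added_tokens_decoder = {
    32013: {"content": "<|begin▁of▁sentence|>", "special": True},
    32014: {"content": "<|end▁of▁sentence|>", "special": True},
    32021: {"content": "<|EOT|>", "special": False},
    32022: {"content": "<unk>", "special": True},
}

def decode_added(ids, skip_special_tokens=True):
    """Map added-token ids back to text, optionally dropping tokens
    whose entry is flagged "special": true."""
    out = []
    for i in ids:
        entry = added_tokens_decoder[i]
        if skip_special_tokens and entry["special"]:
            continue
        out.append(entry["content"])
    return "".join(out)

print(decode_added([32013, 32021, 32014]))  # <|EOT|>
```

Note that `<|EOT|>` is flagged `"special": false` in this config, so it survives decoding even when special tokens are skipped.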