Update README.md
---
license: mit
datasets:
- TokenBender/code_instructions_122k_alpaca_style
language:
- en
tags:
- code
- nlp
---

## Model Summary

CodePhi2 is a fine-tune of Microsoft's Phi-2 LLM, a **2.7 billion** parameter model. It was fine-tuned on TokenBender's [code_instructions_122k_alpaca_style](https://huggingface.co/datasets/TokenBender/code_instructions_122k_alpaca_style) dataset. The goal was to improve Phi-2's coding ability while teaching it the Alpaca instruction format.
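The dataset follows the Alpaca instruction layout, so prompts in that style should work best. The template below is an assumption based on the dataset's Alpaca-style formatting, not something verified from the training setup:

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{instruction}

### Response:
```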

## Benchmarks

Coming Soon.

#### Notes

If you are using `transformers>=4.36.0`, always load the model with `trust_remote_code=True` to prevent side effects.
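A minimal loading sketch for that note, using the standard `transformers` auto classes. The repository id is a placeholder (this card does not state the final repo name), and the example prompt follows the Alpaca template shown above:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- replace with the actual CodePhi2 repository name.
model_id = "<your-namespace>/CodePhi2"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",      # use the checkpoint's native dtype
    trust_remote_code=True,  # per the note above for transformers>=4.36.0
)

# Alpaca-style prompt, matching the fine-tuning data's format.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite a Python function that reverses a string.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```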