---
license: cc-by-sa-4.0
datasets:
- izumi-lab/llm-japanese-dataset
language:
- ja
tags:
- llama
- causal-lm
---

This repo contains a low-rank adapter for LLaMA-7b, fitted on the [llm-japanese-dataset](https://github.com/masanorihirano/llm-japanese-dataset).

This version of the weights was trained with the following hyperparameters:

- Epochs: 1
- Batch size: 128
- Cutoff length: 256
- Learning rate: 3e-4
- Lora _r_: 4
- Lora target modules: q_proj, v_proj
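
A LoRA configuration matching these hyperparameters might look like the sketch below. This is illustrative only: `lora_alpha` and `lora_dropout` are assumptions, since this card does not state them.

```python
from peft import LoraConfig

# Hypothetical reconstruction of the adapter configuration.
# r and target_modules come from the list above; the rest are assumptions.
config = LoraConfig(
    r=4,
    target_modules=["q_proj", "v_proj"],
    lora_alpha=16,      # assumption: not stated in this card
    lora_dropout=0.05,  # assumption: not stated in this card
    task_type="CAUSAL_LM",
)
```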

The adapter can then be loaded on top of the base model as follows:

```python
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base_model = "decapoda-research/llama-7b-hf"
# Please note that the special license of decapoda-research/llama-7b-hf applies.
model = LlamaForCausalLM.from_pretrained(base_model, torch_dtype=torch.float16)
tokenizer = LlamaTokenizer.from_pretrained(base_model)
# Load the LoRA adapter weights on top of the base model.
model = PeftModel.from_pretrained(
    model,
    "izumi-lab/llama-7b-japanese-lora-v0",
    torch_dtype=torch.float16,
)
```
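
As a usage sketch, generation might then look like the following; the prompt and generation settings are illustrative, not from the original card.

```python
# Hypothetical usage example: assumes a CUDA device, since the model was
# loaded in float16 (half precision is poorly supported on CPU).
model = model.to("cuda")
model.eval()

prompt = "日本で一番高い山は何ですか?"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```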

For the latest information, please visit [llm.msuzuki.me](https://llm.msuzuki.me).

## Details

- Japanese Paper:
- English Paper:
- GitHub:
- Website: [llm.msuzuki.me](https://llm.msuzuki.me)

Citation:

If you have any inquiries, such as joint research, data provision, or various types of support, please email [email protected].