Improve model card by adding the appropriate tags, link to code and paper (#4)
Co-authored-by: Niels Rogge <[email protected]>
README.md CHANGED:
```diff
@@ -1,5 +1,6 @@
 ---
-
+base_model:
+- BlinkDL/rwkv-7-world
 language:
 - en
 - zh
@@ -9,11 +10,11 @@ language:
 - ar
 - es
 - pt
+license: apache-2.0
 metrics:
 - accuracy
-base_model:
-- BlinkDL/rwkv-7-world
 pipeline_tag: text-generation
+library_name: transformers
 ---
 
 # rwkv7-1.5B-world
@@ -32,7 +33,7 @@ This is RWKV-7 model under flash-linear attention format.
 - **Developed by:** Bo Peng, Yu Zhang, Songlin Yang, Ruichong Zhang
 - **Funded by:** RWKV Project (Under LF AI & Data Foundation)
 - **Model type:** RWKV7
-- **Language(s) (NLP):** English
+- **Language(s) (NLP):** English, Chinese, Japanese, Korean, French, Arabic, Spanish, Portuguese
 - **License:** Apache-2.0
 - **Parameter count:** 1.52B
 - **Tokenizer:** RWKV World tokenizer
@@ -43,7 +44,7 @@ This is RWKV-7 model under flash-linear attention format.
 <!-- Provide the basic links for the model. -->
 
 - **Repository:** https://github.com/fla-org/flash-linear-attention ; https://github.com/BlinkDL/RWKV-LM
-- **Paper:** https://
+- **Paper:** [https://huggingface.co/papers/2503.14456](https://huggingface.co/papers/2503.14456)
 
 ## Uses
```
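For context, the `library_name: transformers` and `pipeline_tag: text-generation` tags added above are what let the Hub route this model through the transformers text-generation loading path, and `base_model: BlinkDL/rwkv-7-world` links the card back to the original weights. A minimal usage sketch of what the new metadata enables, assuming a hypothetical repo id `fla-hub/rwkv7-1.5B-world` (substitute the actual Hub repo id) and that the repo ships its own RWKV-7 modeling code:

```python
# Minimal sketch of loading the model via the tags declared in the card.
# Assumptions: "fla-hub/rwkv7-1.5B-world" is a hypothetical repo id, and
# trust_remote_code=True is needed if the repo ships custom RWKV-7
# modeling code rather than relying on a built-in transformers class.
from transformers import pipeline

generator = pipeline(
    "text-generation",                 # matches pipeline_tag in the card
    model="fla-hub/rwkv7-1.5B-world",  # hypothetical repo id
    trust_remote_code=True,
)

result = generator("The capital of France is", max_new_tokens=16)
print(result[0]["generated_text"])
```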