NeoZ123 committed
Commit 5b8b6b8 · verified · 1 Parent(s): d0a9d87

Update README.md

Files changed (1)
  1. README.md +2 -2
README.md CHANGED
@@ -14,7 +14,7 @@ pipeline_tag: text-generation
  # LongReward-glm4-9b-SFT
 
  <p align="center">
- 🤗 <a href="https://huggingface.co/datasets/THUDM/LongReward-10k" target="_blank">[LongReward Dataset] </a> • 💻 <a href="https://github.com/THUDM/LongReward" target="_blank">[Github Repo]</a> • 📃 <a href="https://arxiv.org/abs/" target="_blank">[LongReward Paper]</a>
+ 🤗 <a href="https://huggingface.co/datasets/THUDM/LongReward-10k" target="_blank">[LongReward Dataset] </a> • 💻 <a href="https://github.com/THUDM/LongReward" target="_blank">[Github Repo]</a> • 📃 <a href="https://arxiv.org/abs/2410.21252" target="_blank">[LongReward Paper]</a>
  </p>
 
  LongReward-glm4-9b-SFT is supervised fine-tuned from [glm-4-9b](https://huggingface.co/THUDM/glm-4-9b) using the `sft` split of the [LongReward-10k](https://huggingface.co/datasets/THUDM/LongReward-45) dataset, and supports a maximum context window of up to 64K tokens.
@@ -85,7 +85,7 @@ If you find our work useful, please consider citing LongReward:
  title = {LongReward: Improving Long-context Large Language Models
  with AI Feedback},
  author={Jiajie Zhang and Zhongni Hou and Xin Lv and Shulin Cao and Zhenyu Hou and Yilin Niu and Lei Hou and Yuxiao Dong and Ling Feng and Juanzi Li},
- journal={arXiv preprint arXiv:},
+ journal={arXiv preprint arXiv:2410.21252},
  year={2024}
  }
  ```
 
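For readers landing on this commit, here is a minimal usage sketch of the model the updated README describes. This is not code from the commit: it assumes the model loads through the same 🤗 Transformers path as its base model glm-4-9b (custom modeling code, hence `trust_remote_code=True`), and the repo id `THUDM/LongReward-glm4-9b-SFT` and generation settings are illustrative assumptions; only the 64K-token context window comes from the README text above.

```python
# Hypothetical usage sketch; assumes glm-4-9b-style loading with trust_remote_code.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "THUDM/LongReward-glm4-9b-SFT"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
).eval()

# Long-context QA: put a long document plus a question in a single user turn.
# The README states the context window supports up to 64K tokens.
long_document = "..."  # placeholder for a long input document
question = "What are the key findings of the document?"
messages = [{"role": "user", "content": f"{long_document}\n\n{question}"}]

inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_tensors="pt",
    return_dict=True,
).to(model.device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=512)

# Drop the prompt tokens before decoding the generated answer.
answer = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(answer)
```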