Add link to paper

#3
by nielsr - opened
Files changed (1)
  1. README.md +2 -0
README.md CHANGED
@@ -15,6 +15,8 @@ library_name: transformers
 <img alt="Chat" src="https://img.shields.io/badge/%F0%9F%92%9C%EF%B8%8F%20Qwen%20Chat%20-536af5" style="display: inline-block; vertical-align: middle;"/>
 </a>
 
+This repository contains the model of the paper [Qwen2.5-1M Technical Report](https://huggingface.co/papers/2501.15383).
+
 ## Introduction
 
 Qwen2.5-1M is the long-context version of the Qwen2.5 series models, supporting a context length of up to 1M tokens. Compared to the Qwen2.5 128K version, Qwen2.5-1M demonstrates significantly improved performance in handling long-context tasks while maintaining its capability in short tasks.
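
For context, here is a minimal sketch (not part of this PR) of loading such a checkpoint with the `transformers` library declared in the README front matter. The repo ID `Qwen/Qwen2.5-7B-Instruct-1M` and the prompt are illustrative assumptions, not taken from the diff above:

```python
# Minimal sketch, assuming a Qwen2.5-1M instruct checkpoint on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-7B-Instruct-1M"  # assumed repo ID, for illustration

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",  # let transformers pick a suitable dtype
    device_map="auto",   # spread layers across available devices
)

# Encode a prompt and generate a short continuation.
inputs = tokenizer(
    "Summarize the Qwen2.5-1M Technical Report in one sentence.",
    return_tensors="pt",
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```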