manaestras committed · verified
Commit 2c5dc93 · Parent(s): 4e8d410

Upload README.md with huggingface_hub

Files changed (1): README.md (+14 −13)

README.md CHANGED
@@ -1,3 +1,6 @@
+---
+library_name: transformers
+---
 
 <p align="left">
         <a href="README_CN.md">中文</a>&nbsp | English</a>
@@ -10,21 +13,19 @@
 
 
 <p align="center">
- 🤗&nbsp;<a href="https://huggingface.co/tencent/"><b>Hugging Face</b></a>&nbsp;&nbsp;|&nbsp;&nbsp;
- <img src="https://avatars.githubusercontent.com/u/109945100?s=200&v=4" width="16"/>&nbsp;<a href="https://modelscope.cn/models/Tencent-Hunyuan/Hunyuan-A13B-Instruct"><b>ModelScope</b></a>&nbsp;&nbsp;|&nbsp;&nbsp;
- <img src="https://cdn-avatars.huggingface.co/v1/production/uploads/6594d0c6c5f1cd69a48b261d/04ZNQlAfs08Bfg4B1o3XO.png" width="14"/>&nbsp;<a href="https://github.com/Tencent/AngelSlim/tree/main"><b>AngelSlim</b></a>
- </p>
-
- <p align="center">
- 🖥️&nbsp;<a href="https://hunyuan.tencent.com" style="color: red;"><b>Official Website</b></a>&nbsp;&nbsp;|&nbsp;&nbsp;
+ 🤗&nbsp;<a href="https://huggingface.co/tencent/Hunyuan-1.8B-Pretrain"><b>Hugging Face</b></a>&nbsp;&nbsp;|&nbsp;&nbsp;
+ 🖥️&nbsp;<a href="https://hunyuan.tencent.com" style="color: red;"><b>Official Website</b></a>&nbsp;&nbsp;|&nbsp;&nbsp;
 🕖&nbsp;<a href="https://cloud.tencent.com/product/hunyuan"><b>HunyuanAPI</b></a>&nbsp;&nbsp;|&nbsp;&nbsp;
- 🕹️&nbsp;<a href="https://hunyuan.tencent.com/"><b>Demo</b></a>&nbsp;&nbsp;&nbsp;&nbsp;
+ 🕹️&nbsp;<a href="https://hunyuan.tencent.com/"><b>Demo</b></a>&nbsp;&nbsp;|&nbsp;&nbsp;
+ 🤖&nbsp;<a href="https://www.modelscope.cn/models/Tencent-Hunyuan/Hunyuan-1.8B-Pretrain"><b>ModelScope</b></a>
 </p>
 
+
 <p align="center">
- <a href="https://github.com/Tencent-Hunyuan/Hunyuan-7B"><b>GITHUB</b></a> |
- <a href="https://cnb.cool/tencent/hunyuan/Hunyuan-7B"><b>cnb.cool</b></a> |
- <a href="https://github.com/Tencent-Hunyuan/Hunyuan-7B/blob/main/LICENSE"><b>LICENSE</b></a>
+ <a href="https://github.com/Tencent-Hunyuan/Hunyuan-1.8B"><b>GITHUB</b></a> |
+ <a href="https://github.com/Tencent-Hunyuan/Hunyuan-1.8B/blob/main/LICENSE.txt"><b>LICENSE</b></a> |
+ <a href="https://raw.githubusercontent.com/Tencent-Hunyuan/Hunyuan-A13B/main/assets/1751881231452.jpg"><b>WeChat</b></a> |
+ <a href="https://discord.gg/bsPcMEtV7v"><b>Discord</b></a>
 </p>
 
 
@@ -42,7 +43,7 @@ We have released a series of Hunyuan dense models, comprising both pre-trained a
 - **Efficient Inference**: Utilizes Grouped Query Attention (GQA) and supports multiple quantization formats, enabling highly efficient inference.
 
 ## Related News
-* 2025.7.30 We have open-sourced **Hunyuan-0.5B-Pretrain** , **Hunyuan-1.8B-Pretrain** , **Hunyuan-4B-Pretrain** , **Hunyuan-7B-Pretrain** , **Hunyuan-0.5B-Instruct** , **Hunyuan-1.8B-Instruct** , **Hunyuan-4B-Instruct** , **Hunyuan-7B-Instruct** on Hugging Face.
+* 2025.7.30 We have open-sourced **Hunyuan-0.5B-Pretrain** , **Hunyuan-0.5B-Instruct** , **Hunyuan-1.8B-Pretrain** , **Hunyuan-1.8B-Instruct** , **Hunyuan-4B-Pretrain** , **Hunyuan-4B-Instruct** , **Hunyuan-7B-Pretrain** , **Hunyuan-7B-Instruct** on Hugging Face.
 <br>
 
 
@@ -500,4 +501,4 @@ docker run --entrypoint="python3" --gpus all \
 
 ## Contact Us
 
-If you would like to leave a message for our R&D and product teams, Welcome to contact our open-source team . You can also contact us via email ([email protected]).
+If you would like to leave a message for our R&D and product teams, Welcome to contact our open-source team . You can also contact us via email ([email protected]).
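
The substantive change in this commit is the new YAML front matter block (`---` / `library_name: transformers` / `---`) at the top of README.md; the Hugging Face Hub reads this block as model-card metadata to decide, among other things, which library the model loads with. As a rough illustration of the mechanism, here is a minimal sketch of parsing that block with only the standard library; `read_front_matter` is a hypothetical helper for this example, not part of `huggingface_hub`, and it handles only flat `key: value` pairs, not full YAML.

```python
import re


def read_front_matter(text: str) -> dict:
    """Extract the leading front matter block (--- ... ---) from a README.

    Returns a dict of flat key: value pairs, or {} if no block is present.
    """
    m = re.match(r"^---\n(.*?)\n---\n", text, re.DOTALL)
    if not m:
        return {}
    meta = {}
    for line in m.group(1).splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta


readme = '---\nlibrary_name: transformers\n---\n\n<p align="left">...</p>\n'
print(read_front_matter(readme))  # → {'library_name': 'transformers'}
```

In practice you would not parse this yourself: `huggingface_hub` exposes model-card utilities that read the same block, and the Hub uses `library_name` to render the "Use this model" snippet for `transformers`.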