nielsr (HF Staff) committed (verified)
Commit f33c951 · Parent(s): 4d08491

Add pipeline tag, remove dummy blog url


This PR improves the model card by:

- making sure the model can be found at https://huggingface.co/models?pipeline_tag=text-generation&sort=trending
- removing the link to a non-existent blog post

Files changed (1): README.md (+5 -4)
README.md CHANGED
@@ -1,14 +1,15 @@
 ---
-thumbnail: https://cdn-uploads.huggingface.co/production/uploads/633e85093a17ab61de8d9073/dM-i7n313mUnY-fbmElVM.png
+library_name: transformers
 license: other
 license_name: tongyi-qianwen
-library_name: transformers
+thumbnail: https://cdn-uploads.huggingface.co/production/uploads/633e85093a17ab61de8d9073/dM-i7n313mUnY-fbmElVM.png
+pipeline_tag: text-generation
 ---
 
 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/633e85093a17ab61de8d9073/dM-i7n313mUnY-fbmElVM.png)
 
 - Try out the model on [![Featherless](https://img.shields.io/badge/featherless--ai%2FQwerky--72B-Dummy?style=flat&label=Featherless&color=facc15)](https://featherless.ai/models/featherless-ai/Qwerky-72B)
-- Model details from our blog post here! [![Substack](https://img.shields.io/badge/Substack-Dummy?style=flat&color=facc15)](https://substack.recursal.ai/p/qwerky-72b-and-32b-training-large)
+- Model details from our blog post here!
 
 Benchmarks is as follows for both Qwerky-QwQ-32B and Qwerky-72B models:
 
@@ -82,4 +83,4 @@ As demonstrated with our Qwerky-72B-Preview and prior models such as QRWKV6-32B
 
 As with our previous models, the model's inherent knowledge and dataset training are inherited from its "parent" model. Consequently, unlike previous RWKV models trained on over 100+ languages, the QRWKV model is limited to approximately 30 languages supported by the Qwen line of models.
 
-You may find our details of the process from our previous release, [here](https://huggingface.co/recursal/QRWKV6-32B-Instruct-Preview-v0.1).
+You may find our details of the process from our previous release, [here](https://huggingface.co/recursal/QRWKV6-32B-Instruct-Preview-v0.1).
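
For context, the `library_name: transformers` and `pipeline_tag: text-generation` metadata added above are what map the model to the transformers text-generation pipeline on the Hub. The snippet below is a minimal sketch under that assumption only: the repo id is a placeholder taken from the Featherless link in the card, and the `trust_remote_code` flag is a guess for this custom RWKV-based architecture, not something confirmed by this PR.

```python
# Minimal sketch: load a checkpoint tagged with
# library_name: transformers and pipeline_tag: text-generation.
# Repo id and trust_remote_code are assumptions, not taken from this diff.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="featherless-ai/Qwerky-72B",  # placeholder repo id; use the actual Hub id
    device_map="auto",                  # a 72B model needs multiple GPUs or offloading
    torch_dtype="auto",
    trust_remote_code=True,             # may be needed for the custom RWKV-based architecture
)

out = generator("Qwerky-72B is", max_new_tokens=32)
print(out[0]["generated_text"])
```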