nielsr (HF Staff) committed · verified
Commit 0f66460 · 1 Parent(s): a66ed6f

Improve model card: Add library_name and GitHub link

This PR improves the model card by:
- Adding `library_name: transformers` to the metadata, which enables the "Use in Transformers" widget on the Hub and improves model discoverability (see the usage sketch below).
- Adding a direct link to the associated GitHub repository for easier access to the code and related project resources.
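
For context, here is a minimal sketch of the kind of usage the `library_name: transformers` metadata advertises: loading the checkpoint through the standard `AutoTokenizer`/`AutoModelForCausalLM` classes. The repo id `Bochkov/abs-bvv-4` and the `trust_remote_code=True` flag are assumptions for illustration only; the usage snippet in the model card itself (see the diff below) remains the reference.

```python
# Hypothetical usage enabled by `library_name: transformers`.
# NOTE: the repo id and the trust_remote_code flag are assumptions,
# not taken from this commit.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Bochkov/abs-bvv-4"  # assumed Hub repo id, for illustration

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

inputs = tokenizer("The Progressive Growth Transformers series", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```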

Files changed (1)
  1. README.md +6 -2
README.md CHANGED
```diff
@@ -1,5 +1,7 @@
 ---
 license: apache-2.0
+pipeline_tag: text-generation
+library_name: transformers
 tags:
 - transformer
 - causal-lm
@@ -7,11 +9,12 @@ tags:
 - constructive-learning
 - frozen-embeddings
 - bvv
-pipeline_tag: text-generation
 ---
 
 # Model Card for abs-bvv-4
 
+**GitHub Repository**: [https://github.com/Bochkov/BVV241-Tokenizers-Embeddings-Benchmarks](https://github.com/Bochkov/BVV241-Tokenizers-Embeddings-Benchmarks)
+
 ## Model Description
 
 `abs-bvv-4` is a 1.9 billion parameter decoder-only Transformer model. It is the 4th model in the **Progressive Growth Transformers (PGT)** series, designed to explore how linguistic and reasoning capabilities emerge as a function of model depth.
@@ -103,4 +106,5 @@ outputs = model.generate(
     do_sample=True
 )
 
-print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+```
```