---
license: apache-2.0
---

This dataset contains the logs of models trained for Open-sci-ref 0.01.
Open-sci-ref 0.01 is a research dense-transformer model family that includes all intermediate checkpoints, trained on 8 different reference open datasets (C4, Pile, SlimPajama, FineWeb-Edu-1.4T (v1.0.0), DCLM-baseline, Nemotron-CC-HQ, HPLT-2.0 (English subset), and CommonCorpus) across various model (0.13B, 0.4B, 1.3B, 1.7B) and token (50B, 300B, 1T) scales. It is intended to serve as a set of baselines for comparison and for studies of training dynamics. All artifacts are released under the permissive Apache 2.0 license.
See the Open-sci-ref 0.01 research release [blog post](https://laion.ai/blog/open-sci-ref-001/) and [github](https://github.com/LAION-AI/open-sci-ref-0.01) for more details.