---
license: apache-2.0
datasets:
  - bigcode/starcoderdata
  - tiiuae/falcon-refinedweb
language:
  - en
---

A reproduction of OpenLLaMA, trained on 128 H100 GPUs in bfloat16.

The pretraining data consists of Falcon RefinedWeb, StarCoder, and the Wikipedia, arXiv, Books, and StackExchange subsets of RedPajama, totaling nearly 1 trillion tokens.
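For reference, here is a minimal sketch of how such a mixture could be streamed and interleaved with the `datasets` library. The sampling probabilities and the `content` column name are illustrative assumptions, not the actual recipe used for this model, and some of these datasets may require accepting their terms of use.

```python
from datasets import load_dataset, interleave_datasets

# Illustrative only: sampling weights and column selection are placeholders,
# not the actual pretraining mixture used for this model.
falcon = load_dataset("tiiuae/falcon-refinedweb", split="train", streaming=True)
starcoder = load_dataset("bigcode/starcoderdata", split="train", streaming=True)

# Keep only the text column (assumed to be named "content" in both datasets)
# so the two streams share a common schema.
falcon = falcon.select_columns(["content"])
starcoder = starcoder.select_columns(["content"])

mixture = interleave_datasets(
    [falcon, starcoder],
    probabilities=[0.7, 0.3],  # placeholder sampling ratios
    seed=42,
)

for example in mixture.take(3):
    print(example["content"][:80])
```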

The model was trained for a single epoch with 2,000 warm-up steps and a cosine learning-rate schedule starting at 3e-5, using a batch size of 4M tokens.
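As a rough illustration of that schedule, here is a minimal sketch of linear warm-up followed by cosine decay. Only the 2,000 warm-up steps and the 3e-5 starting rate come from this card; the total step count is an assumption (roughly 1T tokens / 4M tokens per step ≈ 250K steps) and the minimum learning rate is a placeholder.

```python
import math

def lr_at_step(step, peak_lr=3e-5, warmup_steps=2000, total_steps=250_000, min_lr=0.0):
    """Linear warm-up to peak_lr, then cosine decay to min_lr.

    total_steps and min_lr are assumptions for illustration; only the
    2000 warm-up steps and the 3e-5 rate come from the model card.
    """
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    progress = min(1.0, (step - warmup_steps) / max(1, total_steps - warmup_steps))
    return min_lr + 0.5 * (peak_lr - min_lr) * (1 + math.cos(math.pi * progress))

print(lr_at_step(1_000))    # mid warm-up: 1.5e-05
print(lr_at_step(2_000))    # peak: 3e-05
print(lr_at_step(250_000))  # end of schedule: ~0.0
```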

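A minimal usage sketch with `transformers` is shown below; the repository id is assumed from this model card's location and may need adjusting.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id based on this model card; adjust if the model lives elsewhere.
model_id = "itsliupeng/openllama-7b-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

inputs = tokenizer("The capital of France is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```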