
Model Card

This repository contains checkpoints (split for 512 GPUs) in DeepSpeed format for the Lucie-7B model, which was trained using this code repository, based on a fork of Megatron-DeepSpeed.

Each checkpoint is stored in a separate branch (revision) whose name specifies the number of training steps. For instance, step0400000 corresponds to the checkpoint after 400,000 training steps.

These checkpoints are provided so that training can be resumed from a given point.
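As a minimal sketch of fetching a specific revision, assuming the revision names are zero-padded to seven digits as in the step0400000 example above (the `revision_name` helper is hypothetical, not part of this repository):

```python
# from huggingface_hub import snapshot_download  # requires `pip install huggingface_hub`

def revision_name(steps: int) -> str:
    # Assumption: revision names are "step" + the step count zero-padded
    # to 7 digits, matching the step0400000 example in this card.
    return f"step{steps:07d}"

print(revision_name(400_000))  # step0400000

# Hypothetical usage (requires network access and huggingface_hub):
# checkpoint_dir = snapshot_download(
#     repo_id="OpenLLM-France/Lucie-7B-optimizer-states-512GPU",
#     revision=revision_name(400_000),
# )
```

The downloaded directory can then be passed to DeepSpeed as the checkpoint load path to resume training from that step.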

Contact

[email protected]
