Model Card for AICrossSim/bitflip-clm-600m

A 600M-parameter bitflip-aware language model trained on 13.2B tokens (22 × 600M) from the FineWeb-Edu dataset.

Model Details

bitflip-aixsim-600M is a transformer-based language model with approximately 600 million parameters (excluding embedding-layer parameters). It uses RMSNorm for normalization and is trained on the FineWeb-Edu dataset.
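To make the "bitflip-aware" framing concrete, the sketch below shows what a single-bit fault in an F32 weight looks like. This is an illustrative helper, not code from the training pipeline: it reinterprets a float32's IEEE-754 encoding as a 32-bit integer, flips one chosen bit, and reinterprets back. Flips in high-order exponent bits can change a weight by orders of magnitude, which is the failure mode a bitflip-aware model is trained to tolerate.

```python
import struct

def flip_bit(value: float, bit: int) -> float:
    """Flip one bit in the IEEE-754 float32 encoding of `value`.

    Bit 31 is the sign, bits 30-23 the exponent, bits 22-0 the mantissa.
    """
    # Pack as little-endian float32, reinterpret the bytes as uint32.
    (raw,) = struct.unpack("<I", struct.pack("<f", value))
    # XOR toggles exactly the requested bit, then reinterpret as float32.
    (out,) = struct.unpack("<f", struct.pack("<I", raw ^ (1 << bit)))
    return out
```

For example, flipping the exponent's least-significant bit (bit 23) of the weight 1.0 halves it to 0.5, while flipping the sign bit (bit 31) negates it; a low mantissa flip perturbs it only in the 7th decimal place.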

Training Details

The experiment setup and training logs are available in the associated Weights & Biases (wandb) run.

The released checkpoint is in Safetensors format with 680M parameters stored as F32 tensors.
