Tags: Safetensors · English · gpt2

GPT2 trained on the BabyLM 2024 training set (in IPA) using a BPE tokenizer with word boundaries removed.

This model was trained for the paper From Babble to Words: Pre-Training Language Models on Continuous Streams of Phonemes.
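
A minimal usage sketch, assuming the checkpoint loads through the standard transformers AutoModelForCausalLM / AutoTokenizer interface; the IPA prompt is illustrative only, and the exact tokenizer behaviour for the spaceless phoneme vocabulary is an assumption:

```python
# Sketch: load the checkpoint and generate a continuation of an IPA phoneme stream.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "phonemetransformers/GPT2-85M-BPE-PHON-SPACELESS"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompts are continuous IPA phoneme streams with no word boundaries.
prompt = "ðɪsɪzə"  # hypothetical example input
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```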

Model size: 97.5M params
Tensor type: F32


