CharLLama-1.3B Pretrained Language Model

This language model shares the training process and usage of CharLLama-2.6B but has a smaller capacity: 1,369,634,304 parameters (roughly 1.37B).

For detailed information, including pretraining data, tokenization, and usage examples, please refer to the CharLLama-2.6B documentation.
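As a brief illustration, the sketch below loads the model with the standard Hugging Face transformers causal-LM API. The repository ID is a placeholder assumption, and the authoritative usage examples remain those in the CharLLama-2.6B documentation.

```python
# Minimal usage sketch. Assumes the model is published on the Hugging Face
# Hub and loads with the standard transformers causal-LM API; the repository
# ID below is hypothetical -- substitute the actual path for CharLLama-1.3B.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "CharLLama-1.3B"  # hypothetical hub ID, replace with the real one

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

# Generate a short continuation from a prompt.
inputs = tokenizer("Hello, world", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```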
