Training procedure
The TSOTSALLM-beta Large Language Model (LLM) is a fine-tuned generative text model based on Llama 2, with 7 billion parameters.
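The snippet below is a minimal usage sketch showing how such a causal language model is typically loaded and queried with the Hugging Face transformers library; the repository id `TsotsaAI/TSOTSALLM-beta` is a placeholder assumption, not a confirmed path.

```python
# Minimal usage sketch with Hugging Face transformers.
# NOTE: "TsotsaAI/TSOTSALLM-beta" is a hypothetical repository id used for illustration.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TsotsaAI/TSOTSALLM-beta"  # hypothetical repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain what a large language model is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```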
Model Architecture
TSOTSALLM-beta is a transformer model with the following architecture choices (see the sketch after this list):
- Grouped-Query Attention
- Sliding-Window Attention
- Byte-fallback BPE tokenizer
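The sketch below illustrates the first two of these choices in plain PyTorch: several query heads share one key/value head (grouped-query attention), and a causal sliding-window mask restricts each position to a fixed span of recent tokens. It is a conceptual illustration only, not the model's actual implementation; the shapes, the `window` value, and the function names are assumptions.

```python
import torch
import torch.nn.functional as F

def sliding_window_causal_mask(seq_len: int, window: int) -> torch.Tensor:
    """True where attention is allowed: causal, and at most `window` tokens back."""
    i = torch.arange(seq_len).unsqueeze(1)   # query positions
    j = torch.arange(seq_len).unsqueeze(0)   # key positions
    return (j <= i) & (j > i - window)

def grouped_query_attention(q, k, v, window: int) -> torch.Tensor:
    """Grouped-query attention with a sliding-window causal mask.

    q:    (batch, n_q_heads, seq, head_dim)
    k, v: (batch, n_kv_heads, seq, head_dim), with n_q_heads % n_kv_heads == 0
    """
    n_q_heads, n_kv_heads = q.shape[1], k.shape[1]
    group_size = n_q_heads // n_kv_heads
    # Each key/value head serves a whole group of query heads.
    k = k.repeat_interleave(group_size, dim=1)
    v = v.repeat_interleave(group_size, dim=1)

    scores = q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5
    mask = sliding_window_causal_mask(q.shape[-2], window).to(q.device)
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

# Toy shapes: 32 query heads sharing 8 key/value heads, window of 4 tokens.
q = torch.randn(1, 32, 16, 64)
k = torch.randn(1, 8, 16, 64)
v = torch.randn(1, 8, 16, 64)
out = grouped_query_attention(q, k, v, window=4)
print(out.shape)  # torch.Size([1, 32, 16, 64])
```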
The TSOTSALLM AI Team
- Jean Petit BIKIM
- Fidel Jiomekong
- Martins Folefack
- Brice