A Llama Chat Model of 68M Parameters

Recommended Prompt Format

<|im_start|>system
{system_message}<|im_end|>
<|im_start|>user
{user_message}<|im_end|>
<|im_start|>assistant
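
Below is a minimal Python sketch of filling this template by hand; the system and user strings are placeholder examples chosen for illustration, not part of this card.

```python
# Assemble the recommended ChatML-style prompt.
# The example system/user messages below are hypothetical placeholders.
system_message = "You are a helpful assistant."
user_message = "Explain what a language model is in one sentence."

prompt = (
    "<|im_start|>system\n"
    f"{system_message}<|im_end|>\n"
    "<|im_start|>user\n"
    f"{user_message}<|im_end|>\n"
    "<|im_start|>assistant\n"
)
```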

Recommended Inference Parameters

penalty_alpha: 0.5
top_k: 4
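
These settings correspond to contrastive search in Hugging Face transformers: top_k sets the candidate pool and penalty_alpha the degeneration penalty. A hedged sketch of applying them to the prompt built above; max_new_tokens is an arbitrary illustrative choice.

```python
# Sketch: load the model and generate with the recommended
# contrastive-search settings (penalty_alpha=0.5, top_k=4).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Felladrin/Llama-68M-Chat-v1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# `prompt` is the ChatML-style string assembled in the earlier sketch.
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    penalty_alpha=0.5,   # contrastive-search degeneration penalty
    top_k=4,             # contrastive-search candidate pool size
    max_new_tokens=256,  # illustrative value, not from this card
)

# Print only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```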

Open LLM Leaderboard Evaluation Results

Detailed results can be found on the Open LLM Leaderboard.

| Metric                            | Value |
|-----------------------------------|------:|
| Avg.                              | 29.72 |
| AI2 Reasoning Challenge (25-shot) | 23.29 |
| HellaSwag (10-shot)               | 28.27 |
| MMLU (5-shot)                     | 25.18 |
| TruthfulQA (0-shot)               | 47.27 |
| Winogrande (5-shot)               | 54.30 |
| GSM8K (5-shot)                    |  0.00 |

Model tree for Felladrin/Llama-68M-Chat-v1

Base model: JackFram/llama-68m (this model is a fine-tune of it)
Quantizations: 5 models