Model Description

This model is based on LilRg/ECE-1B-merge-PRYMMAL. It was fine-tuned on the instruction dataset mosaicml/instruct-v3 to improve its instruction-following and multi-step reasoning abilities.

  • Developed by: Youri Lalain (@Youlln)
  • Organization: ECE engineering school

Open LLM Leaderboard Evaluation Results

Detailed results can be found here

| Metric              | Value |
|---------------------|-------|
| Avg.                | 11.80 |
| IFEval (0-Shot)     | 21.44 |
| BBH (3-Shot)        | 16.19 |
| MATH Lvl 5 (4-Shot) | 6.12  |
| GPQA (0-Shot)       | 3.80  |
| MuSR (0-Shot)       | 3.87  |
| MMLU-PRO (5-Shot)   | 19.36 |
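As a quick sanity check, the reported average can be reproduced from the six benchmark scores, assuming "Avg." is the unweighted mean (the convention used by the Open LLM Leaderboard):

```python
# Benchmark scores from the table above.
scores = {
    "IFEval (0-Shot)": 21.44,
    "BBH (3-Shot)": 16.19,
    "MATH Lvl 5 (4-Shot)": 6.12,
    "GPQA (0-Shot)": 3.80,
    "MuSR (0-Shot)": 3.87,
    "MMLU-PRO (5-Shot)": 19.36,
}

# Unweighted mean of the six benchmarks.
avg = sum(scores.values()) / len(scores)
print(round(avg, 2))  # → 11.8
```

Rounded to two decimals this matches the reported 11.80.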
  • Model size: 1.54B params
  • Tensor type: F32
