This is an upscaled/merged version of microsoft/Phi-4-mini-instruct. It has 6.25B parameters and 56 layers. The additional layers give the model more capacity for learning, so it is well suited to further fine-tuning.

Layers 18-22 are merged from microsoft/Phi-4-mini-reasoning, others are from microsoft/Phi-4-mini-instruct.
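The layer-wise merge described above can be sketched as follows. This is a hypothetical illustration, not the actual merge recipe used for this model: it assumes weight keys in the usual `model.layers.<index>.…` naming scheme, and plain dicts stand in for the real safetensors state dicts. The helper names (`merge_state_dicts`, `_layer_index`) are made up for this example.

```python
# Hypothetical sketch of a layer-wise merge: weights for transformer layers
# whose index falls in [start, end] are taken from the "reasoning" checkpoint,
# and all other weights from the "instruct" checkpoint.

def _layer_index(key):
    """Extract the layer number from keys like 'model.layers.19.mlp.weight',
    or return None for non-layer keys (embeddings, final norm, ...)."""
    parts = key.split(".")
    for i, part in enumerate(parts):
        if part == "layers" and i + 1 < len(parts) and parts[i + 1].isdigit():
            return int(parts[i + 1])
    return None

def merge_state_dicts(instruct, reasoning, start=18, end=22):
    """Return a merged state dict that prefers `reasoning` weights for
    layers start..end (inclusive) and `instruct` weights everywhere else."""
    merged = {}
    for key, value in instruct.items():
        layer = _layer_index(key)
        if layer is not None and start <= layer <= end and key in reasoning:
            merged[key] = reasoning[key]
        else:
            merged[key] = value
    return merged
```

In practice, merges like this are usually produced with a tool such as mergekit rather than by hand-editing state dicts.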

Safetensors · Model size: 6.25B params · Tensor type: FP16

Model: Pinkstack/Phi-4-mini-6b-merge