This is a 230M-parameter small Llama model distilled from the original Llama. The distillation was performed on OpenOrca's FLAN dataset, using 160,000 randomly sampled examples. The model is free to download. It is a work in progress, so please use it at your own risk.
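The card does not include usage code, so here is a minimal sketch of loading the model with Hugging Face `transformers`. The repo ID `Sayan01/Llama-Flan-XL2base` comes from this card; the use of the causal-LM auto classes is an assumption based on the model being Llama-style.

```python
# Hedged sketch: the auto classes below assume a Llama-style causal LM;
# only the repo ID is taken from the model card itself.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Sayan01/Llama-Flan-XL2base"  # repo ID from the card

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Download the model/tokenizer and greedily generate a completion."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Summarize: The quick brown fox jumps over the lazy dog."))
```

Since the card flags the model as a work in progress, output quality may vary; treat this as a starting point rather than a supported inference recipe.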
