PHI2 SFT OASST1
📈
This model is PHI-2 fine-tuned on the OASST1 dataset.
It generates text based on user prompts.
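A minimal sketch of querying the model with the Hugging Face transformers library. The checkpoint id below is the *base* PHI-2 model and the "Instruct/Output" prompt template is an assumption — the exact SFT checkpoint name and template used for this space are not documented here, so substitute the actual repo id.

```python
# Sketch: prompt the (fine-tuned) PHI-2 model with transformers.
# NOTE: "microsoft/phi-2" is the base model, used here as a placeholder
# for the actual SFT checkpoint id, which is an assumption.
from transformers import AutoModelForCausalLM, AutoTokenizer


def build_prompt(user_message: str) -> str:
    # Plain Instruct/Output template; the template used during SFT
    # is an assumption, adjust if the training format differed.
    return f"Instruct: {user_message}\nOutput:"


def generate(user_message: str, model_id: str = "microsoft/phi-2") -> str:
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate("Explain supervised fine-tuning in one sentence."))
```

Loading the full model downloads several gigabytes of weights, so the generation call is kept behind the `__main__` guard.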
This is a first implementation built on the transformers library.