---
license: mit
datasets:
- mlabonne/OpenThoughts-79k-filtered
language:
- en
base_model:
- microsoft/phi-4
pipeline_tag: text-generation
library_name: transformers
---

Barcenas 14b phi-4 v2

Based on pankajmathur/orca_mini_phi-4 and trained with the dataset mlabonne/OpenThoughts-79k-filtered.

The goal of this new model is to work around the bugs of the first version by using a better base and a much larger dataset of quality data covering math, science, code, and puzzles.

This new version is expected to perform noticeably better than the first and to achieve stronger benchmark results.

Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽
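
Since the card declares `library_name: transformers` and `pipeline_tag: text-generation`, a minimal usage sketch with the `transformers` text-generation pipeline is shown below. The repo id is a placeholder assumption; replace it with this model's actual Hub path.

```python
from transformers import pipeline

# NOTE: the repo id below is an assumption for illustration; replace it with
# the actual Hugging Face Hub path of this model.
model_id = "Danielbrdz/Barcenas-14b-Phi-4-v2"

generator = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype="auto",   # load weights in the checkpoint's native precision
    device_map="auto",    # requires `accelerate`; places layers on available devices
)

# Chat-style input: the pipeline applies the model's chat template automatically.
messages = [
    {"role": "user", "content": "A train travels 120 km in 1.5 hours. What is its average speed?"},
]

result = generator(messages, max_new_tokens=256, do_sample=False)
# With chat input, generated_text is the conversation; the last message is the reply.
print(result[0]["generated_text"][-1]["content"])
```

This is a sketch under the assumptions above, not an official snippet for this checkpoint; adjust generation parameters (sampling, max tokens) to your use case.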