You can now run Llama 4 on your own local device! 🦙

Run our Dynamic 1.78-bit and 2.71-bit Llama 4 GGUFs: unsloth/Llama-4-Scout-17B-16E-Instruct-GGUF

You can run them on llama.cpp and other inference engines. See our guide here: https://docs.unsloth.ai/basics/tutorial-how-to-run-and-fine-tune-llama-4
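As a rough sketch of the llama.cpp route: recent llama.cpp builds let `llama-cli` pull a GGUF straight from a Hugging Face repo with the `-hf` flag. The `:Q2_K_XL` quant tag below is an assumption for illustration; check the repo's file list (or the guide linked above) for the exact names of the 1.78-bit and 2.71-bit files.

```shell
# Sketch only: run one of the dynamic Llama 4 Scout quants with llama.cpp.
# ':Q2_K_XL' is an assumed quant tag -- pick the actual one from the repo.
llama-cli -hf unsloth/Llama-4-Scout-17B-16E-Instruct-GGUF:Q2_K_XL \
  -p "Why is the sky blue?" \
  -n 256
```

This streams the chosen GGUF into llama.cpp's local cache on first use, then runs inference on your machine; other engines that read GGUF files can load the same download.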