So we can run it with llama.cpp
Also waiting for one
Same here
Waiting