
Ollama-ready Coding Models

For inference with Ollama. A CPU is sufficient for both quantization and inference; no GPU is required.
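
A minimal sketch of running one of these Ollama-ready models on CPU from Python, using the official `ollama` client library (`pip install ollama`). The model tag `qwen2.5-coder:1.5b` is a placeholder, not necessarily a member of this collection; substitute the tag of whichever model you import into Ollama.

```python
# Sketch only: assumes the Ollama server is running locally and the
# placeholder model tag below has been replaced with one you actually use.
import ollama

MODEL = "qwen2.5-coder:1.5b"  # hypothetical tag, swap in a model from this collection

# Pull the model if it is not already present locally (CPU-only is fine).
ollama.pull(MODEL)

# Ask the coding model for a small snippet; inference runs on CPU
# when no GPU is available.
response = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Write a Python function that reverses a string."}],
)
print(response["message"]["content"])
```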