Please provide ggml variants for local execution

#3
by TobDeBer - opened

TheBloke and the other regular providers of GGML/GGUF conversions don't seem to have picked up Granite.

You can check my repos, since I've already done an 8-bit quant. If you need a smaller quant, you can use the Hugging Face Space "gguf-my-repo" to create one yourself these days, and it's free.

At 8-bit it seems to work surprisingly well, but you need roughly 1 GB of RAM or VRAM for every 1K tokens of context.
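As a rough sanity check of that rule of thumb, the KV cache grows linearly with context length. A minimal sketch, assuming hypothetical Granite-like hyperparameters (the layer count, head count, and head dimension below are illustrative guesses, not the real model config):

```python
def kv_cache_bytes(ctx_len, n_layers=40, n_kv_heads=40, head_dim=128, bytes_per_elem=2):
    """Estimate KV-cache size in bytes for a given context length.

    Assumes fp16 cache entries (2 bytes) and no grouped-query attention;
    the factor of 2 accounts for storing both keys and values.
    All hyperparameters here are placeholder values for illustration.
    """
    return 2 * n_layers * n_kv_heads * head_dim * ctx_len * bytes_per_elem

# Estimate for 1K tokens of context
gb = kv_cache_bytes(1024) / 2**30
print(f"~{gb:.2f} GiB of KV cache per 1K tokens")
```

With these placeholder numbers the estimate lands near 0.8 GiB per 1K tokens, which is in the same ballpark as the "about 1 GB per 1K of context" observation above; the real figure depends on the actual model architecture and cache precision.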
