More storage

#151
by noNyve - opened

I can't convert models above 14B parameters. I think this is due to limited storage on the "GGUF My Repo" Space's servers. When I do try, I get "Error quantizing: " with no further detail. Smaller models work fine, though.
