Sharded Q5_K_M GGUF version of internlm/internlm2_5-1_8b-chat-gguf.
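Sharded GGUF repos split one model file into numbered pieces using llama.cpp's `gguf-split` naming scheme; loaders such as `llama-cli` pick up the remaining shards automatically when pointed at the first one. A minimal sketch of that naming convention (the prefix and shard count below are illustrative, not the repo's actual values):

```python
def shard_filenames(prefix: str, n_shards: int) -> list[str]:
    """Build shard file names per llama.cpp's gguf-split convention:
    <prefix>-00001-of-0000N.gguf (1-based index, zero-padded to 5 digits)."""
    return [
        f"{prefix}-{i:05d}-of-{n_shards:05d}.gguf"
        for i in range(1, n_shards + 1)
    ]

# Hypothetical prefix for illustration; check the repo's file list for the real names.
for name in shard_filenames("internlm2_5-1_8b-chat-q5_k_m", 3):
    print(name)
```

To run inference, download all shards into one directory and pass the first shard's path (the `-00001-of-…` file) to the loader.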
Downloads last month: 1
Model tree for mitulagr2/gguf-sharded-q5_k_m-internlm2_5-1_8b-chat
- Base model: internlm/internlm2_5-1_8b-chat-gguf