# DeepSeek-V2.5-GGUF
## Original Model

[deepseek-ai/DeepSeek-V2.5](https://huggingface.co/deepseek-ai/DeepSeek-V2.5)
## Run with LlamaEdge
- LlamaEdge version: coming soon
- Prompt template (a worked example of the filled template is given after this list)

  - Prompt type: `deepseek-chat-25`

  - Prompt string

    ```text
    <|begin_of_sentence|>{system_message}<|User|>{user_message_1}<|Assistant|>{assistant_message_1}<|end_of_sentence|><|User|>{user_message_2}<|Assistant|>
    ```
- Context size: `128000`
- Run as LlamaEdge service (a sample API request is given after this list)

  ```bash
  wasmedge --dir .:. \
    --nn-preload default:GGML:AUTO:DeepSeek-V2.5-Q5_K_M.gguf \
    llama-api-server.wasm \
    --prompt-template deepseek-chat-25 \
    --ctx-size 128000 \
    --model-name DeepSeek-V2.5
  ```
- Run as LlamaEdge command app

  ```bash
  wasmedge --dir .:. \
    --nn-preload default:GGML:AUTO:DeepSeek-V2.5-Q5_K_M.gguf \
    llama-chat.wasm \
    --prompt-template deepseek-chat-25 \
    --ctx-size 128000
  ```
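A minimal sketch of how the prompt template above might be filled for a two-turn exchange. Only the special tokens and their order come from the template; the variable names and message contents are made-up placeholders. When running through LlamaEdge with `--prompt-template deepseek-chat-25`, this assembly is handled for you; the sketch is only meant to make the token layout concrete.

```bash
# Hypothetical conversation; only the token layout is taken from the template above.
SYSTEM="You are a helpful assistant."
USER_1="What is the capital of France?"
ASSISTANT_1="The capital of France is Paris."
USER_2="And the capital of Germany?"

# Print the fully expanded prompt string.
printf '<|begin_of_sentence|>%s<|User|>%s<|Assistant|>%s<|end_of_sentence|><|User|>%s<|Assistant|>' \
  "$SYSTEM" "$USER_1" "$ASSISTANT_1" "$USER_2"
```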
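Once `llama-api-server.wasm` is running, it serves an OpenAI-compatible API. A minimal request sketch, assuming the server listens on its default port 8080 and that `DeepSeek-V2.5` is the name passed via `--model-name`:

```bash
# Send a chat completion request to the locally running LlamaEdge API server.
curl -X POST http://localhost:8080/v1/chat/completions \
  -H 'Content-Type: application/json' \
  -d '{
        "model": "DeepSeek-V2.5",
        "messages": [
          {"role": "system", "content": "You are a helpful assistant."},
          {"role": "user", "content": "What is the capital of France?"}
        ]
      }'
```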
Quantized with llama.cpp b3664