Deploy with vLLM or Ollama

#5
by chinsoyun - opened

Is there any way to deploy this model on a GPU with vLLM or Ollama?
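For reference, a typical deployment of a Hugging Face model on a GPU looks like the sketch below. This is a generic sketch, not instructions specific to this model: the model ID is a placeholder (the actual repo ID comes from this model's page), and Ollama additionally requires GGUF-format weights, which may or may not be published for this model.

```shell
# vLLM: serves an OpenAI-compatible HTTP API (port 8000 by default).
# Replace <org>/<model-name> with this repo's actual model ID.
pip install vllm
vllm serve <org>/<model-name> --dtype auto --max-model-len 4096

# Ollama: needs GGUF weights. Write a Modelfile pointing at the local
# .gguf file, then create and run the model.
#   Modelfile contents:  FROM ./model.gguf
ollama create my-model -f Modelfile
ollama run my-model
```

Both servers pick up available CUDA GPUs automatically; whether the model fits depends on its size and your VRAM.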
