loaded successfully but no response from vLLM
#2
by Fernanda24 - opened
I loaded Air-FP8 in vLLM. The server starts and accepts the request:

(APIServer pid=945236) INFO 08-03 01:16:48 [api_server.py:1846] Starting vLLM API server 0 on http://0.0.0.0:8001
...
[chat_utils.py:468] Detected the chat template content format to be 'openai'. You can set `--chat-template-content-format` to override this.
(APIServer pid=945236) INFO: 127.0.0.1:38000 - "POST /v1/chat/completions HTTP/1.1" 200 OK
But it doesn't send a response back to Open WebUI. It seems to receive the request, but nothing comes back. In Open WebUI all I see is: SyntaxError: JSON.parse: unexpected character at line 1 column 1 of the JSON data
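To narrow down whether the problem is on the vLLM side or the Open WebUI side, it may help to hit the chat completions endpoint directly and inspect the raw response body. This is only a minimal sketch: it assumes the server from the log above (port 8001) and uses a hypothetical model name "Air-FP8" (substitute whatever GET /v1/models actually reports):

```python
import requests

# Query the vLLM OpenAI-compatible endpoint directly, bypassing Open WebUI,
# to check whether the body that comes back is valid JSON at all.
url = "http://127.0.0.1:8001/v1/chat/completions"
payload = {
    "model": "Air-FP8",  # hypothetical name; use the id returned by GET /v1/models
    "messages": [{"role": "user", "content": "Say hello"}],
    "stream": False,     # also worth retrying with True, since Open WebUI streams by default
}

resp = requests.post(url, json=payload, timeout=120)
print("status:", resp.status_code)
print("raw body:", repr(resp.text[:500]))  # an empty or non-JSON body would explain the JSON.parse error

try:
    print("parsed reply:", resp.json()["choices"][0]["message"]["content"])
except ValueError:
    print("response is not valid JSON")
```

If this prints a valid reply, the issue is likely in how Open WebUI is configured to talk to the endpoint (base URL, streaming, or API key); if the body is empty or not JSON, the problem is on the vLLM side despite the 200 OK in the log.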