NOTICE : vLLM is now available.

#31
by DongHyunKim - opened

After a major code update, we finally support vLLM.
https://github.com/NAVER-Cloud-HyperCLOVA-X/vllm/tree/v0.9.2rc2_hyperclovax_vision_seed

Thank you for your contribution, @bigshanedogg.

Make sure to switch to the v0.9.2rc2_hyperclovax_vision_seed branch.

For more details, check out the README in our vLLM repository.

Launch API server:
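A minimal sketch of building the fork from source and starting the OpenAI-compatible server. The model id below is a placeholder -- substitute the actual HyperCLOVA X vision checkpoint you are serving; the README in the repository is authoritative for exact flags.

```shell
# Clone the fork on the required branch and install from source.
git clone -b v0.9.2rc2_hyperclovax_vision_seed \
    https://github.com/NAVER-Cloud-HyperCLOVA-X/vllm.git
cd vllm
pip install -e .

# Launch the OpenAI-compatible API server (default port 8000).
# The model id is a placeholder; --trust-remote-code is commonly needed
# for checkpoints that ship custom processing code.
vllm serve naver-hyperclovax/HyperCLOVAX-SEED-Vision-Instruct-3B \
    --trust-remote-code
```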

Request Example:
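A sketch of one way to call the running server through vLLM's OpenAI-compatible chat endpoint, using only the Python standard library. The base URL, model name, and image URL are placeholder assumptions, not values from the repository.

```python
import json
import urllib.request


def build_payload(model: str, text: str, image_url: str) -> dict:
    """Build an OpenAI-style multimodal chat-completion payload."""
    return {
        "model": model,
        "messages": [
            {
                "role": "user",
                "content": [
                    {"type": "text", "text": text},
                    {"type": "image_url", "image_url": {"url": image_url}},
                ],
            }
        ],
        "max_tokens": 128,
    }


def send(payload: dict, base_url: str = "http://localhost:8000/v1") -> dict:
    """POST the payload to /chat/completions (requires a live server)."""
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # Placeholder model id and image URL -- adjust to your deployment.
    payload = build_payload(
        "naver-hyperclovax/HyperCLOVAX-SEED-Vision-Instruct-3B",
        "Describe this image.",
        "https://example.com/sample.jpg",
    )
    print(json.dumps(payload, indent=2))
```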

Offline Inference Examples:
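A sketch of batch (offline) generation with vLLM's Python API, assuming the fork above is installed. The model id and the simple prompt template are placeholder assumptions; consult the repository README for the exact offline-inference recipe.

```python
def format_prompts(questions):
    """Wrap raw questions in a simple instruction template (placeholder format)."""
    return [f"Question: {q}\nAnswer:" for q in questions]


def run_offline(prompts, model="naver-hyperclovax/HyperCLOVAX-SEED-Vision-Instruct-3B"):
    """Generate completions for a batch of prompts without an API server."""
    # Imported lazily so this module loads even where vLLM is not installed.
    from vllm import LLM, SamplingParams

    llm = LLM(model=model, trust_remote_code=True)  # model id is a placeholder
    params = SamplingParams(temperature=0.7, max_tokens=128)
    outputs = llm.generate(prompts, params)
    return [out.outputs[0].text for out in outputs]


if __name__ == "__main__":
    try:
        for text in run_offline(format_prompts(["What is HyperCLOVA X?"])):
            print(text)
    except ImportError:
        print("vLLM is not installed; see the repository README for setup.")
```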

Thank you for your continued interest and support for our model.

HyperCLOVA Team.

