Update README.md
README.md (changed)
@@ -56,7 +56,7 @@ MERaLiON stands for **M**ultimodal **E**mpathetic **R**easoning **a**nd **L**ear
 - **License:** [MERaLiON Public License](https://huggingface.co/MERaLiON/MERaLiON-AudioLLM-Whisper-SEA-LION/blob/main/MERaLiON-Public-Licence-v1.pdf)
 - **Demo:** [MERaLiON-AudioLLM Web Demo](https://huggingface.co/spaces/MERaLiON/MERaLiON-AudioLLM)
 
-We support model inference using the [Huggingface](#inference) and [vLLM](vllm_plugin_meralion/README.md) frameworks. For more technical details, please refer to our [technical report](https://arxiv.org/abs/2412.09818).
+We support model inference using the [Huggingface](#inference) and [vLLM](vllm_plugin_meralion/README.md) frameworks, allowing lightning-fast [inference speed](https://huggingface.co/MERaLiON/MERaLiON-AudioLLM-Whisper-SEA-LION/blob/main/vllm_plugin_meralion/README.md#inference-performance-benchmark). For more technical details, please refer to our [technical report](https://arxiv.org/abs/2412.09818).
 
 ## Acknowledgement
 This research is supported by the National Research Foundation, Singapore and Infocomm Media Development Authority, Singapore under its National Large Language Models Funding Initiative.
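The changed line points readers to both the Hugging Face and vLLM inference paths. As a rough, non-authoritative sketch of the Hugging Face path: the snippet below assumes the checkpoint loads through transformers' `AutoProcessor` / `AutoModelForSpeechSeq2Seq` classes with `trust_remote_code=True`, that the custom processor accepts text plus 16 kHz audio via an `audios` argument, and that a plain-text instruction is an acceptable prompt; the repo's README and model card remain the authoritative reference for the exact API and prompt template.

```python
# Hedged sketch of Hugging Face inference for MERaLiON-AudioLLM.
# Assumptions (not confirmed by this diff): the repo registers custom code
# usable via trust_remote_code, the processor takes `text` and `audios`
# keyword arguments, and batch_decode is delegated to the tokenizer.
import librosa
import torch
from transformers import AutoModelForSpeechSeq2Seq, AutoProcessor

repo_id = "MERaLiON/MERaLiON-AudioLLM-Whisper-SEA-LION"

processor = AutoProcessor.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForSpeechSeq2Seq.from_pretrained(
    repo_id,
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# 16 kHz mono audio, as expected by Whisper-style speech encoders.
audio, _ = librosa.load("sample.wav", sr=16000)

# The exact prompt template (e.g. any audio placeholder token) is defined
# on the model card; a bare instruction is used here for illustration.
prompt = "Please transcribe this speech."
inputs = processor(text=prompt, audios=audio, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=256)

print(processor.batch_decode(output_ids, skip_special_tokens=True)[0])
```

For the vLLM path and the benchmark numbers behind the "inference speed" link, see vllm_plugin_meralion/README.md in the repository.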