Complete prompt template

#4 by Nightcall - opened

Hi, thank you very much for your quantizations. In the prompt template example you forgot to include the assistant part of the template:
<|im_start|>assistant
Assistant response<|im_end|>

The dataset uses the ChatML format: https://www.mosaicml.com/blog/mpt-30b#:~:text=The%20dataset%20uses%20the%20ChatML%20format%2C
ChatML: https://github.com/openai/openai-python/blob/main/chatml.md

Example:
<|im_start|>system
You are ChatGPT, a large language model trained by OpenAI. Answer as concisely as possible.
Knowledge cutoff: 2021-09-01
Current date: 2023-03-01<|im_end|>
<|im_start|>user
How are you<|im_end|>
<|im_start|>assistant
I am doing well!<|im_end|>
<|im_start|>user
How are you now?<|im_end|>
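
For anyone assembling this format programmatically, here is a minimal sketch in plain Python. The format_chatml helper and the message-dict shape are illustrative (not from any library); only the <|im_start|>/<|im_end|> framing comes from the ChatML spec linked above:

```python
# Sketch: build a ChatML prompt string from a list of messages.
# Helper name and message structure are illustrative, not a library API.

def format_chatml(messages):
    """Each message is a dict like {"role": "user", "content": "..."}."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # Open an assistant turn at the end so the model writes the reply
    # and closes it with its own <|im_end|> (clarified later in the thread).
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

print(format_chatml([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "How are you?"},
]))
```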

Exactly, and I think it looks like this in the ooba template:

user: "user"
bot:  "assistant"
turn_template: "<|im_start|><|user|>\n<|user-message|><|im_end|>\n<|im_start|><|bot|>\n<|bot-message|><|im_end|>\n"
context: "<|im_start|>system\nA conversation between a user and an LLM-based AI assistant. The assistant gives helpful and honest answers.<|im_end|>"

Thanks guys. So specifically what I missed was just

<|im_start|>assistant

is that right? Because then the LLM would generate its own <|im_end|>?

So is this correct now?

<|im_start|>system
A conversation between a user and an LLM-based AI assistant. The assistant gives helpful and honest answers.<|im_end|>
<|im_start|>user
prompt goes here<|im_end|>
<|im_start|>assistant
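
In other words, at the string level the flow would be something like this sketch; the generation backend is left abstract, and extract_reply is just an illustrative helper for trimming at the model's own <|im_end|>:

```python
# Sketch: the prompt ends with an opened assistant turn; the model fills
# it in and is expected to emit its own <|im_end|> to close the turn.

prompt = (
    "<|im_start|>system\n"
    "A conversation between a user and an LLM-based AI assistant. "
    "The assistant gives helpful and honest answers.<|im_end|>\n"
    "<|im_start|>user\n"
    "prompt goes here<|im_end|>\n"
    "<|im_start|>assistant\n"
)

def extract_reply(raw_completion: str) -> str:
    # Keep only the text before the model's own <|im_end|>
    # (the whole string if it never appears).
    return raw_completion.split("<|im_end|>", 1)[0].strip()

# e.g. a raw completion that ran on past the end of its turn:
print(extract_reply("I am doing well!<|im_end|>\n<|im_start|>user"))
# -> I am doing well!
```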

I think this is correct. Refer to the official demo prompt:

https://huggingface.co/spaces/mosaicml/mpt-30b-chat/blob/main/app.py#L49

It's correct. I tried it in Koboldcpp Story Mode several times, and it generated the response and its own <|im_end|>:

<|im_start|>system
A conversation between a user and an LLM-based AI assistant. The assistant gives helpful and honest answers.<|im_end|>
<|im_start|>user
The radius of the earth.<|im_end|>
<|im_start|>assistant
That is approximately 3,959 miles or 6,371 kilometers.<|im_end|>
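
For completeness, when generating through the transformers API directly, the same stop behavior can be requested explicitly with a stopping criterion. This is only a sketch: it assumes <|im_end|> tokenizes to a fixed id sequence in the chat model's vocabulary, and model/tokenizer loading is elided:

```python
# Sketch: stop generate() once the sequence ends with the <|im_end|> tokens.
import torch
from transformers import StoppingCriteria, StoppingCriteriaList

class StopOnImEnd(StoppingCriteria):
    def __init__(self, tokenizer):
        # Assumption: <|im_end|> maps to a fixed token id sequence.
        self.stop_ids = tokenizer("<|im_end|>",
                                  add_special_tokens=False).input_ids

    def __call__(self, input_ids: torch.LongTensor,
                 scores: torch.FloatTensor, **kwargs) -> bool:
        n = len(self.stop_ids)
        return input_ids[0, -n:].tolist() == self.stop_ids

# Usage (model and tokenizer loading elided):
# output = model.generate(
#     **inputs,
#     stopping_criteria=StoppingCriteriaList([StopOnImEnd(tokenizer)]),
# )
```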

Thanks, the README is updated.

Glad to help.
