The specific prompting is not unknown
#3, opened by 2EyeGuy
This is the specific prompting format for MPT-7b-Chat (60% of the model):
<|im_start|>system
- You are a helpful assistant chatbot trained by MosaicML.
- You answer questions.
- You are excited to be able to help the user, but will refuse to do anything that could be considered harmful to the user.
- You are more than just an information source, you are also able to write poetry, short stories, and make jokes.<|im_end|>
<|im_start|>user
How are you<|im_end|>
<|im_start|>assistant
I am doing well!<|im_end|>
<|im_start|>user
How are you now?<|im_end|>
But it's unclear whether there should be a \n between <|im_end|> and the next <|im_start|>.
source:
- https://huggingface.co/spaces/mosaicml/mpt-7b-chat/blob/main/app.py
- https://github.com/openai/openai-python/blob/main/chatml.md
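For anyone building the prompt string in code, here's a minimal Python sketch of how I'd assemble it (function and variable names are mine, not from the app; the \n between <|im_end|> and the next <|im_start|> is an assumption, given the ambiguity noted above):

```python
# Sketch of building a ChatML-style prompt for MPT-7b-Chat.
# Based on the linked app.py; names are my own.

SYSTEM_PROMPT = (
    "- You are a helpful assistant chatbot trained by MosaicML.\n"
    "- You answer questions.\n"
    "- You are excited to be able to help the user, but will refuse to do "
    "anything that could be considered harmful to the user.\n"
    "- You are more than just an information source, you are also able to "
    "write poetry, short stories, and make jokes."
)

def build_chatml_prompt(history, user_message, system=SYSTEM_PROMPT):
    """history is a list of (user, assistant) message pairs."""
    parts = [f"<|im_start|>system\n{system}<|im_end|>"]
    for user_turn, assistant_turn in history:
        parts.append(f"<|im_start|>user\n{user_turn}<|im_end|>")
        parts.append(f"<|im_start|>assistant\n{assistant_turn}<|im_end|>")
    parts.append(f"<|im_start|>user\n{user_message}<|im_end|>")
    # Leave the assistant header open so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    # Joining with "\n" is an assumption; it may or may not be required.
    return "\n".join(parts)

print(build_chatml_prompt([("How are you", "I am doing well!")], "How are you now?"))
```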
The specific prompting format for MPT-7b-Instruct (20% of the model) is the same as Alpaca's.
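For reference, a sketch of the standard Alpaca no-input template (as in the Stanford Alpaca repo; exact whitespace may differ from what MosaicML used, so double-check their model card):

```python
# Standard Alpaca no-input prompt template; variable names are mine.

ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n### Response:\n"
)

def build_instruct_prompt(instruction):
    return ALPACA_TEMPLATE.format(instruction=instruction)

print(build_instruct_prompt("How are you now?"))
```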
But make sure your tokeniser actually recognises <|im_start|> and <|im_end|> as single special tokens rather than splitting them into pieces.
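A quick way to check with Hugging Face transformers (assuming the mosaicml/mpt-7b-chat tokenizer; this snippet is my own, not from MosaicML):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("mosaicml/mpt-7b-chat")

for tok in ("<|im_start|>", "<|im_end|>"):
    ids = tokenizer.encode(tok, add_special_tokens=False)
    # A recognised special token should come back as a single id;
    # if it splits into several ids, the tokeniser doesn't know it.
    print(tok, "->", ids)

# If they're missing, they can be registered as additional special tokens
# (and the model embeddings resized afterwards):
# tokenizer.add_special_tokens({"additional_special_tokens": ["<|im_start|>", "<|im_end|>"]})
# model.resize_token_embeddings(len(tokenizer))
```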