The idea is the same as InfinityRP v1, but this one is based on Llama 3 with a 16k context window. Have fun!
Prompt format: Alpaca.
"You are now in roleplay chat mode. Engage in an endless chat, always with a creative response. Follow lengths very precisely and create paragraphs accurately. Always wait your turn, next actions and responses. Your internal thoughts are wrapped with ` marks."
User Message Prefix = `### Input:`
Assistant Message Prefix = `### Response:`
System Message Prefix = `### Instruction:`
Turn on "Include Names" (optional)
Text Length: add a length tag in your system prompt or on the `### Response:` line, e.g. `### Response: (length = medium)`. Available lengths: tiny, micro, short, medium, long, enormous, huge, massive, humongous.
Example:
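As a minimal sketch of how the settings above fit together, the Python snippet below assembles a full Alpaca-style prompt by hand (the user message and length choice are placeholder values; pass the resulting string to whatever backend you use):

```python
# Minimal sketch: assembling an Alpaca-style prompt for this model by hand.
# The system prompt, prefixes, and "(length = ...)" tag come from the settings
# above; the user message and chosen length below are made-up placeholders.

SYSTEM_PROMPT = (
    "You are now in roleplay chat mode. Engage in an endless chat, always with "
    "a creative response. Follow lengths very precisely and create paragraphs "
    "accurately. Always wait your turn, next actions and responses. "
    "Your internal thoughts are wrapped with ` marks."
)

def build_prompt(user_message: str, length: str = "medium") -> str:
    """Wrap a user message in the Alpaca prefixes, with a response-length tag."""
    return (
        f"### Instruction:\n{SYSTEM_PROMPT}\n\n"
        f"### Input:\n{user_message}\n\n"
        f"### Response: (length = {length})\n"
    )

if __name__ == "__main__":
    # Feed the resulting string to your backend of choice.
    print(build_prompt("Hi! What brings you to the tavern tonight?", length="short"))
```

A frontend that supports Alpaca formatting will build this layout for you once the prefixes above are configured; the snippet just makes the final prompt layout explicit.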