Error Message: {"error":"Input validation error: `inputs` must have less than 1000 tokens. Given: 1254","error_type":"validation"}
#1 by makeColabFree - opened
Hi everyone,
I was wondering why I receive this error message, since I thought this model supports context lengths of up to 4000 tokens?
Btw, thanks for open sourcing this model! :)
Edit: This actually refers to the meta version.
Please report issues with meta's version in their own repo: https://github.com/facebookresearch/llama/issues/
Otherwise it confuses people using the weights from here.
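For anyone hitting the same validation error: the cap is enforced by the hosted endpoint, not by the model's context window, so one workaround is to count tokens client-side and truncate before sending. A minimal sketch below; in practice you would count with the actual LLaMA tokenizer (e.g. via `transformers.AutoTokenizer`), but a plain token list stands in here so the sketch runs without downloading model files.

```python
# Hedged sketch: keep the prompt under the endpoint's input-validation
# limit. The "must have less than 1000 tokens" check is strict, so we
# keep at most 999 tokens. Token counting with a real tokenizer is
# assumed to happen upstream of this helper.

MAX_INPUT_TOKENS = 1000  # cap reported by the validation error


def truncate_to_limit(tokens, limit=MAX_INPUT_TOKENS):
    """Return at most `limit - 1` tokens so the prompt passes the check."""
    return tokens[: limit - 1]


prompt_tokens = ["tok"] * 1254        # same length as in the error message
safe_tokens = truncate_to_limit(prompt_tokens)
print(len(safe_tokens))               # 999, strictly under the cap
```

Truncating the head instead of the tail (i.e. `tokens[-(limit - 1):]`) may be preferable for chat-style prompts where the most recent context matters most.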
daryl149 changed discussion status to closed
I opened an issue as suggested: https://github.com/facebookresearch/llama/issues/450
Do you have any trouble with input sequences longer than 1000 tokens, though?