context length confusion

#273
by elainexu - opened

I think the context length for this model is 128k, as stated in the documentation, but when I call client.chat.completions.create() from the OpenAI Python package, I still get errors like:

Error code: 400 - {'message': 'Please reduce the length of the messages or completion. Current length is 14966 while limit is 8192', 'type': 'invalid_request_error', 'param': 'messages', 'code': 'context_length_exceeded'}
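
For reference, here is a minimal sketch of the kind of call that triggers the error. The base_url, api_key, model name, and prompt below are placeholders, not my exact setup:

```python
# Minimal sketch of a chat completion request that exceeds the 8192-token limit.
# base_url, api_key, and the model name are placeholders (assumptions).
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # placeholder endpoint
    api_key="placeholder-key",            # placeholder key
)

# Any prompt long enough to exceed ~8192 tokens reproduces the 400 error.
long_prompt = "Summarize the following document:\n" + ("lorem ipsum " * 4000)

response = client.chat.completions.create(
    model="my-model",  # placeholder model name
    messages=[{"role": "user", "content": long_prompt}],
)
print(response.choices[0].message.content)
```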

What can I do to extend the context window? Thanks!
