Problem
The error below occurs in the chat window.
If you use the /clear command, the chat window is cleared, but the conversation context still appears to be retained.
We need a function that can reset the context.
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens, however you requested 6779 tokens (6523 in your prompt; 256 for the completion). Please reduce your prompt; or completion length.
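The numbers in the error show why the retained context is the problem: the accumulated prompt alone already exceeds the model's window. A minimal sketch of the arithmetic, using the values quoted above:

```python
# Token budget from the error message above. Because /clear does not reset
# the stored history, prompt_tokens keeps growing across turns.
MAX_CONTEXT = 4097       # model's maximum context length
prompt_tokens = 6523     # accumulated chat history sent as the prompt
completion_tokens = 256  # tokens reserved for the reply

requested = prompt_tokens + completion_tokens
print(requested)                # 6779, matching the error message
print(requested > MAX_CONTEXT)  # True: the request is rejected
```

Even with the completion length reduced to zero, the prompt alone (6523 tokens) would still exceed the 4097-token limit, so trimming the completion cannot fix this; the context itself has to be cleared.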
Proposed Solution
Add a /clear-context command in the chat panel.
Additional context
Thank you for opening your first issue in this project! Engagement like this is essential for open source projects! 🤗
If you haven't done so already, check out Jupyter's Code of Conduct. Also, please try to follow the issue template, as it helps other community members contribute more effectively.
You can meet the other Jovyans by joining our Discourse forum. There is also an intro thread there where you can stop by and say Hi! 👋
@messiah1030k Thank you for reporting this issue! 🤗 Yes, this is certainly a bug in /clear, not an intentional feature. I noticed it as well while developing #141 yesterday. Don't worry, I'm already working on a fix.