This model's maximum context length is 4096 tokens. However, your messages resulted in 4119 tokens. Please reduce the length of the messages. Sorry, I am not feeling well. Please try again.
After exceeding the token limit, the only way to restore functionality is to run /clear and wipe the conversation context.
Since that seems to be the only option, could sending another message after the limit is reached automatically execute the /clear command?
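The auto-clear idea above could be sketched roughly like this. Note that the client call, the error type, and the token counting below are simplified stand-ins for illustration, not this project's actual API:

```python
# Minimal sketch of automatically clearing conversation context when the
# model's token limit is exceeded, instead of requiring a manual /clear.
MAX_CONTEXT_TOKENS = 4096

def count_tokens(messages):
    # Crude stand-in: roughly 1 token per 4 characters. A real client
    # would use the model's tokenizer.
    return sum(len(m["content"]) for m in messages) // 4

class ContextLengthExceeded(Exception):
    pass

def send(messages):
    # Hypothetical backend call that rejects over-long conversations,
    # mirroring the "maximum context length is 4096 tokens" error.
    if count_tokens(messages) > MAX_CONTEXT_TOKENS:
        raise ContextLengthExceeded()
    return "ok"

def chat(history, user_message):
    history.append({"role": "user", "content": user_message})
    try:
        return send(history)
    except ContextLengthExceeded:
        # Automatic "/clear": drop everything except the newest message
        # and retry, so the user does not have to clear by hand.
        history[:] = history[-1:]
        return send(history)
```

The trade-off, of course, is that an automatic /clear silently discards the earlier conversation, which is what motivates the embeddings suggestion below in the thread.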
I think it could also be done with embeddings. Another advantage is that you can store days' worth of conversations for long-term memory, and it can cut conversation costs by roughly 5x depending on usage. I've been reading up on ways to implement it from these sources
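The embeddings approach could look something like the sketch below: store each past message with an embedding vector, and on every new prompt retrieve only the top-k most similar messages rather than resending the whole history. The `embed()` function here is a toy bag-of-words stand-in for a real embedding model:

```python
# Sketch of embeddings-based long-term memory. Only the few most relevant
# stored messages are recalled for a new prompt, keeping the context small.
import math
from collections import Counter

def embed(text):
    # Toy embedding: a word-count vector. A real system would call an
    # actual embedding model here instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a if w in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class MemoryStore:
    def __init__(self):
        self.entries = []  # list of (embedding, message) pairs

    def add(self, message):
        self.entries.append((embed(message), message))

    def recall(self, query, k=3):
        # Return the k stored messages most similar to the query.
        q = embed(query)
        ranked = sorted(self.entries, key=lambda e: cosine(q, e[0]),
                        reverse=True)
        return [msg for _, msg in ranked[:k]]
```

Since only the recalled messages are sent back to the model, the token usage per request stays roughly constant no matter how many days of history are stored, which is where the cost savings would come from.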
This seems to be a much better and more elegant solution.