Is your feature request related to a problem? Please describe.
We are thinking of using AutoGen for our multi-turn chat application, but we need message limiting based on token count to avoid going over the context window. I currently use a package that I maintain for that: https://github.com/pamelafox/openai-messages-token-helper
My package accounts for messages from any GPT model, function calling, and image requests with vision models. From what I saw of the code, I don't believe MessageHistoryLimiter handles function calls or image requests.
Describe the solution you'd like
You could bring the logic from my package into MessageHistoryLimiter. The counting isn't perfect, since it's based on developers reverse-engineering how OpenAI turns messages into tokens, but it comes within 4 tokens of the actual count in all my tests. My repo also includes unit tests and live tests.
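For illustration, here is a minimal sketch of the kind of token-based history limiting I mean, assuming the commonly cited reverse-engineered per-message overhead (~3 tokens per message plus ~3 to prime the reply). The function names are hypothetical, not AutoGen or openai-messages-token-helper API, and it only handles plain string content; function-call payloads and image parts (the cases mentioned above) would need extra handling.

```python
import tiktoken


def count_message_tokens(message: dict, encoding) -> int:
    """Approximate the tokens one chat message consumes (string content only)."""
    tokens = 3  # rough per-message overhead for role/formatting
    for key, value in message.items():
        if isinstance(value, str):
            tokens += len(encoding.encode(value))
        if key == "name":
            tokens += 1  # a name field adds roughly one extra token
    return tokens


def limit_messages_by_tokens(messages: list[dict], model: str, max_tokens: int) -> list[dict]:
    """Keep system messages plus the newest history that fits the token budget."""
    encoding = tiktoken.encoding_for_model(model)
    budget = max_tokens - 3  # reserve the tokens that prime the assistant reply

    system_messages = [m for m in messages if m.get("role") == "system"]
    history = [m for m in messages if m.get("role") != "system"]

    for message in system_messages:
        budget -= count_message_tokens(message, encoding)

    # Walk history from newest to oldest, stopping once the budget is spent.
    kept: list[dict] = []
    for message in reversed(history):
        cost = count_message_tokens(message, encoding)
        if cost > budget:
            break
        kept.append(message)
        budget -= cost

    return system_messages + list(reversed(kept))
```

A limiter like this could be called right before each model request, e.g. `limit_messages_by_tokens(messages, "gpt-4o", max_tokens=8000)`, so the trimmed history always fits the model's context window.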
Additional context
No response