
feat: add streaming support for OpenAI-compatible endpoints #1262

Merged: 9 commits merged into main on Apr 18, 2024

Conversation

cpacker (Collaborator) commented Apr 16, 2024

Please describe the purpose of this pull request

  • Adds streaming support for OpenAI and OpenAI-compatible backends
    • "OpenAI-compatible" == OpenAI proxy with full tool-calling + streaming
  • Note: Adds package httpx-sse for the SSE components.
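For context, OpenAI-compatible streaming endpoints emit server-sent events whose "data:" payloads carry partial completion deltas, terminated by "data: [DONE]". The sketch below (illustrative only, not the code in this PR) shows how such deltas are typically accumulated into the full assistant message:

```python
import json

# Illustrative sketch: assemble OpenAI-style streaming SSE chunks into
# a complete message. Each event arrives as a "data: {...}" line and
# the stream is closed by "data: [DONE]".
def accumulate_deltas(sse_lines):
    """Join the content deltas from a sequence of raw SSE data lines."""
    parts = []
    for line in sse_lines:
        payload = line.removeprefix("data: ").strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        # The first chunk usually carries only the role; skip non-content deltas.
        if "content" in delta:
            parts.append(delta["content"])
    return "".join(parts)

events = [
    'data: {"choices": [{"delta": {"role": "assistant"}}]}',
    'data: {"choices": [{"delta": {"content": "Hel"}}]}',
    'data: {"choices": [{"delta": {"content": "lo"}}]}',
    'data: [DONE]',
]
print(accumulate_deltas(events))  # -> Hello
```

In the real implementation the raw lines would come from the network via httpx-sse's event iterator rather than a list, but the per-chunk JSON handling is the same.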

TODOs for future PRs

  • Refactor StreamingRefreshInterface to be stateful (real init / no globals), which will require refactoring code that imports CLIInterface (main.py and cli.py?)
  • Put --stream inside the configuration file
  • Migrate stream == True to be the default once it's been tested in a tagged release
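The first TODO above (a stateful StreamingRefreshInterface) can be illustrated with a rough sketch of what the refactor might look like once module-level globals are removed. Class and method names here are hypothetical, not MemGPT's actual API:

```python
# Hypothetical sketch of a stateful streaming interface: the buffer lives
# on the instance rather than in module-level globals, so multiple
# interfaces can coexist and tests can construct them independently.
class StatefulStreamingInterface:
    def __init__(self):
        self._buffer = []

    def process_refresh(self, delta: str) -> None:
        # Each streamed delta is appended to instance state.
        self._buffer.append(delta)

    def current_text(self) -> str:
        return "".join(self._buffer)

    def reset(self) -> None:
        # Clear state between agent turns instead of resetting globals.
        self._buffer.clear()

iface = StatefulStreamingInterface()
for delta in ["Boot", "ing", " up..."]:
    iface.process_refresh(delta)
print(iface.current_text())  # -> Booting up...
```

A real init like this would also let callers that currently import CLIInterface (main.py, cli.py) receive an interface instance explicitly instead of sharing hidden global state.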

How to test

memgpt run --stream

Have you tested this PR?

See video: [screencast]

Related issues or PRs

Closes #1215
Closes #345

cpacker mentioned this pull request on Apr 16, 2024.
cpacker merged commit aeb4a94 into main on Apr 18, 2024 (4 checks passed) and deleted the streaming branch.