
📦 Release: v0.0.3-rc1 #192

Merged
merged 13 commits into main
Apr 7, 2024

Conversation

@roma-glushko (Member) commented Apr 1, 2024

A new release candidate that includes the new streaming chat functionality.

Added

Fixed

Security

Miscellaneous

roma-glushko and others added 12 commits March 3, 2024 00:12
- Initialized a new type of workflow, a streaming (async) routing workflow, using the Streaming Chat API as an example
- Updated the Bruno collection
- Updated the LanguageModel API to include `ChatStream()` and `SupportChatStream()` methods
- Got the streaming router working
- Implemented SSE event parsing to be able to work with the OpenAI streaming chat API
- Integrated OpenAI chat streaming into Glide's streaming chat API
- Covered the happy path with tests
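The SSE event parsing mentioned above can be sketched roughly as follows. This is a minimal illustration of parsing an OpenAI-style SSE stream body, not Glide's actual parser; `parseSSE` and its input are hypothetical:

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// parseSSE reads an OpenAI-style SSE body and collects the JSON payloads
// carried by "data:" lines, stopping at the "[DONE]" sentinel.
func parseSSE(body string) []string {
	var payloads []string

	scanner := bufio.NewScanner(strings.NewReader(body))
	for scanner.Scan() {
		line := scanner.Text()

		if !strings.HasPrefix(line, "data:") {
			continue // skip comments, event names, and blank keep-alive lines
		}

		payload := strings.TrimSpace(strings.TrimPrefix(line, "data:"))

		if payload == "[DONE]" {
			break // OpenAI signals end of stream with a literal [DONE]
		}

		payloads = append(payloads, payload)
	}

	return payloads
}

func main() {
	body := "data: {\"id\":1}\n\ndata: {\"id\":2}\n\ndata: [DONE]\n"

	for _, p := range parseSSE(body) {
		fmt.Println(p)
	}
}
```

In a real client the scanner would read from the HTTP response body rather than a string, and each payload would be unmarshalled into a chat chunk struct.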
* 🔒 Upgraded the crypto lib
* ⬆️ Upgrade Go to 1.22.1
* 🔒 Fiber to v2.52.2
…tStream (#166)

- Separated sync and streaming chat schemas
- Extracted assumptions about where to find latency from routing strategies into separate `LatencyGetters` that can differ per model/workflow
- Elaborated the client provider `chatStream()` interface. Clients now expose a response channel instead of being handed one by the caller
- Connected the stream chat workflow to latency & health tracking
- Refined the `chatStream()` method of clients to return a stream struct
- Separated latency tracking of the streaming workflow from the sync chat workflow
- Defined a new `HealthTracker` to incorporate all health tracking logic
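The "client owns the channel" design described above can be sketched like this. All names (`ChatStream`, `ChatStreamChunk`, `chatStream`) are hypothetical stand-ins, not Glide's real types:

```go
package main

import "fmt"

// ChatStreamChunk is a hypothetical slice of a streamed chat response.
type ChatStreamChunk struct {
	Content string
}

// ChatStream is a hypothetical stream struct: the client owns the channel,
// and the caller only receives from it.
type ChatStream struct {
	chunks chan ChatStreamChunk
}

// Recv exposes a receive-only view of the chunk channel.
func (s *ChatStream) Recv() <-chan ChatStreamChunk { return s.chunks }

// chatStream returns a stream struct and feeds it asynchronously,
// instead of writing into a caller-supplied channel.
func chatStream(parts []string) *ChatStream {
	s := &ChatStream{chunks: make(chan ChatStreamChunk)}

	go func() {
		defer close(s.chunks) // closing signals end-of-stream to the caller
		for _, p := range parts {
			s.chunks <- ChatStreamChunk{Content: p}
		}
	}()

	return s
}

func main() {
	stream := chatStream([]string{"Hel", "lo"})

	var out string
	for chunk := range stream.Recv() {
		out += chunk.Content
	}

	fmt.Println(out) // Hello
}
```

Returning a stream struct keeps channel ownership (creation and closing) inside the client, so callers cannot accidentally close or misuse the channel.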
Improved general test coverage of the codebase:
- covered several configs with tests
- covered file content expansion in configurations
- Separated chat & chat stream request schemas
- Introduced a new finish reason field
- Added metadata to the stream chat response
- Allowed attaching metadata to a chat stream request and then attaching it to each chat stream chunk
- Adjusted the error message schema to include the request ID and metadata
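The metadata propagation described above could look roughly like this: the request carries optional metadata, and the router copies it onto every chunk it emits. The schemas and the `annotate` helper are hypothetical, shown only to illustrate the idea:

```go
package main

import "fmt"

// ChatStreamRequest is a hypothetical request schema with optional metadata.
type ChatStreamRequest struct {
	Message  string
	Metadata map[string]string
}

// ChatStreamChunk carries the same metadata as the request that produced it.
type ChatStreamChunk struct {
	Content  string
	Metadata map[string]string
}

// annotate attaches the request's metadata to every emitted chunk,
// so a caller can correlate chunks with the request that produced them.
func annotate(req ChatStreamRequest, contents []string) []ChatStreamChunk {
	chunks := make([]ChatStreamChunk, 0, len(contents))

	for _, c := range contents {
		chunks = append(chunks, ChatStreamChunk{Content: c, Metadata: req.Metadata})
	}

	return chunks
}

func main() {
	req := ChatStreamRequest{
		Message:  "hi",
		Metadata: map[string]string{"trace_id": "abc"},
	}

	for _, ch := range annotate(req, []string{"he", "llo"}) {
		fmt.Println(ch.Content, ch.Metadata["trace_id"])
	}
}
```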
Handled the wrong API key case by marking the model as permanently unavailable
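The idea behind this fix can be sketched as follows: a 401 response means the API key is wrong, which will never heal on its own, so the model is taken out of rotation for good rather than retried. The `ModelHealth` type is a hypothetical stand-in, not Glide's actual health tracker:

```go
package main

import (
	"fmt"
	"net/http"
)

// ModelHealth is a hypothetical health record for a single model.
type ModelHealth struct {
	PermanentlyUnavailable bool
}

// TrackError marks the model permanently unavailable on a 401 (bad API key),
// since that error cannot recover without operator intervention.
// Other errors are left to normal, recoverable health tracking.
func (h *ModelHealth) TrackError(statusCode int) {
	if statusCode == http.StatusUnauthorized {
		h.PermanentlyUnavailable = true
	}
}

func main() {
	h := &ModelHealth{}

	h.TrackError(http.StatusUnauthorized)

	fmt.Println(h.PermanentlyUnavailable) // true
}
```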
…age (#184)

- Fixed the header in which the Anthropic API key is passed
- Started propagating token usage of Anthropic requests
- Corrected the `TokenUsage` interface by changing count fields from floats to integers
* #173: add streaming

* #173: update header and test data

* #173: Update test and schema

* #173: lint

---------

Co-authored-by: Max <[email protected]>
* #171: support streaming

* #171: add tests & lint

* #171: update chat.go

* #171: lint

* #171: update test

---------

Co-authored-by: Max <[email protected]>
@roma-glushko self-assigned this Apr 1, 2024
@roma-glushko marked this pull request as draft April 1, 2024 20:39
@roma-glushko marked this pull request as ready for review April 7, 2024 09:21
@roma-glushko merged commit 5e0dd0f into main Apr 7, 2024
16 checks passed
3 participants