Feature: StarSearch thread history #3527
Labels: `core team work` (work that the OpenSauced core team takes on), `💡 feature`, `🦨 Skunkworks` (experimental issues/PRs that may not actually be accepted), `star-search`
## StarSearch threads
Soon, we'll be bringing "threads" to StarSearch.
A thread is a mechanism for StarSearch to understand the chat history, have a bit of "memory", and keep many message threads that a user can return to. In this way, users can have many persistent chats rather than simple "one-shots": they can go back, continue a conversation, and save their history.
This is still very early, in-progress work in the API, but we want to get a head start on implementing it now so we can quickly integrate it and get it in front of real users. In short, the mental model is roughly how ChatGPT works: there are individual message threads, a single user has many threads, and you can always go back to an individual thread to review its history or continue the conversation.
Here's a very rough idea of what this might look like implemented in the app (mockup screenshot not included here), but it's dealer's choice at this point since we'll want @open-sauced/design to come in and give this a final, pristine design.
## What will the routes look like?
### `GET /v2/star-search`

Gets paginated rows of all threads belonging to the authenticated user.
### `GET /v2/star-search/<uuid>`

Gets an individual StarSearch thread by its unique ID, along with all of its message history, for the authenticated user.
### `POST /v2/star-search/new`

This authenticated route would return a UUID for the new chat that the client can then display.
### `POST /v2/star-search/<uuid>/stream`

This authenticated route would return the SSE JSON payloads as they exist today for StarSearch.
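Since that route delivers its payloads as SSE frames, the client side needs to split the stream on blank lines and JSON-parse each `data:` field. Here's a minimal sketch in TypeScript; the `parseSseChunk` name and the payload shape in the comment are illustrative, not the actual StarSearch schema:

```typescript
// Parse a raw SSE text chunk into the JSON payloads carried in `data:` lines.
// NOTE: payload shapes like `{ content: string }` below are placeholders,
// not the real StarSearch schema.
function parseSseChunk(chunk: string): unknown[] {
  return chunk
    .split(/\n\n/) // SSE events are separated by blank lines
    .flatMap((event) =>
      event
        .split("\n")
        .filter((line) => line.startsWith("data:"))
        .map((line) => JSON.parse(line.slice("data:".length).trim()))
    );
}
```

In practice the real client would also have to buffer partial frames across network chunks; this sketch assumes each chunk ends on an event boundary.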
### `DELETE /v2/star-search/<uuid>`

Deletes a StarSearch thread for the authenticated user by the unique ID of the thread.
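Taken together, a thin client wrapper over these routes might look like the sketch below. Everything here is an assumption made for illustration: the helper names, the base-URL handling, and the injected `fetchFn` (passed in so it can be stubbed in tests) are not part of the proposal itself.

```typescript
// Hypothetical route builders for the proposed endpoints (illustrative names).
const threadRoutes = {
  list: (base: string) => `${base}/v2/star-search`,
  thread: (base: string, id: string) => `${base}/v2/star-search/${id}`,
  create: (base: string) => `${base}/v2/star-search/new`,
  stream: (base: string, id: string) => `${base}/v2/star-search/${id}/stream`,
};

// Minimal fetch-like signature so the client can be exercised with a stub.
type FetchLike = (url: string, init?: { method?: string }) => Promise<{ json(): Promise<unknown> }>;

// Hypothetical client over the routes above; response shapes are assumed.
function makeThreadsClient(base: string, fetchFn: FetchLike) {
  return {
    list: async () => (await fetchFn(threadRoutes.list(base))).json(),
    get: async (id: string) => (await fetchFn(threadRoutes.thread(base, id))).json(),
    create: async () => (await fetchFn(threadRoutes.create(base), { method: "POST" })).json(),
    remove: (id: string) => fetchFn(threadRoutes.thread(base, id), { method: "DELETE" }),
  };
}
```

The `/stream` route is deliberately left out of the client, since it returns an SSE stream rather than a single JSON body and needs the streaming handling described above.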
(For the OpenSauced core team: please refer to my full internal proposal for further details on how this will actually work in the backend.)