useChat chunk #3252

Conversation

BrianHung (Contributor)

#3225

Lets clients access the chunks parsed from the data stream via useChat's onChunk callback. This is useful for statefully caching responses on the client and then replaying them.

// parrot.ts
// Note: import paths are illustrative and may vary by SDK version;
// convertArrayToReadableStream is an AI SDK test utility.
import { formatStreamPart } from 'ai';
import { convertArrayToReadableStream } from '@ai-sdk/provider-utils/test';

export async function POST(req: Request) {
  const { chunks } = await req.json();
  if (chunks) {
    // Re-encode the recorded chunks as data stream parts.
    const parts = chunks.map(({ type, value }) => formatStreamPart(type, value));
    const stream = convertArrayToReadableStream(parts)
      .pipeThrough(new TextEncoderStream());
    return new Response(stream, {
      headers: {
        'Content-Type': 'text/plain; charset=utf-8',
        'X-Vercel-AI-Data-Stream': 'v1',
      },
    });
  }
  // If no chunks, call the normal API.
}
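
A client-side counterpart to the parrot endpoint might look like the following sketch, assuming the proposed onChunk callback lands; `recordedChunks` and `buildReplayRequest` are illustrative names for this sketch, not SDK exports.

```typescript
// Record chunks as they arrive, then POST them back to the parrot
// endpoint above for replay.
type Chunk = { type: string; value: unknown };

const recordedChunks: Chunk[] = [];

// Proposed wiring (sketch):
// useChat({ onChunk: (chunk: Chunk) => recordedChunks.push(chunk) });

function buildReplayRequest(endpoint: string, chunks: Chunk[]): Request {
  // The parrot endpoint reads `chunks` from the JSON body.
  return new Request(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ chunks }),
  });
}
```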

@lgrammel (Collaborator)

My thoughts:
a) I'm not sure we want to expose all chunk types like this.
b) It would need to be implemented in Svelte, Vue, and Solid (but good to hold off until (a) is clear).

@BrianHung (Contributor, author)

> not sure if we want to expose all chunk types like this

What's the alternative if we want to store the chunks themselves? Is there a higher-level API that would be more immune to changes if the stream parts format changed?

@lgrammel (Collaborator) commented Oct 21, 2024

@BrianHung the stream parts are standardized in the stream data protocol (though additions might happen). You could also record using a fetch interceptor for now (just write the stream parts as strings to disk, then read them back and send them as strings).
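
The interceptor idea can be sketched as a wrapper around fetch that clones the response and records the raw stream-part text; `recordingFetch` is an illustrative name for this sketch, not an SDK export.

```typescript
// Wrap fetch: pass the original response through untouched, and read a
// clone of the body to record the raw stream-part text for later replay.
type FetchLike = typeof fetch;

function recordingFetch(
  inner: FetchLike,
  record: (raw: string) => void,
): FetchLike {
  return async (input, init) => {
    const response = await inner(input, init);
    // Fire-and-forget: don't block streaming on the recording.
    response.clone().text().then(record).catch(() => {});
    return response;
  };
}
```

The cloned body buffers independently of the caller's stream, so useChat can still consume the original response while the recording completes in the background.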

@BrianHung (Contributor, author) commented Oct 21, 2024

Realized onResponse is a way to implement this too:

import { readDataStream } from 'ai';

// Passed into `useChat`
onResponse: async (resp) => {
  const onChunk = async () => {
    // Clone so useChat can still consume the original body.
    const response = resp.clone();
    if (!response.body) return;
    const reader = response.body.getReader();
    for await (const chunk of readDataStream(reader)) {
      console.log('onChunk', { chunk });
    }
  };
  onChunk();
},
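
For reference, the stream data protocol encodes each part as a type code, a colon, and a JSON value, one part per line (e.g. `0:"Hello"` for a text part). A minimal parser sketch of that line shape, for illustration only; real parsing should stay with the SDK's readDataStream, since the format may gain new part types.

```typescript
// Parse `code:json` lines of the data stream protocol. Illustrative
// only; the SDK's readDataStream handles the real (and evolving) format.
function parseStreamParts(raw: string): Array<{ code: string; value: unknown }> {
  return raw
    .split('\n')
    .filter((line) => line.length > 0)
    .map((line) => {
      const i = line.indexOf(':');
      if (i === -1) throw new Error(`Invalid stream part: ${line}`);
      return { code: line.slice(0, i), value: JSON.parse(line.slice(i + 1)) };
    });
}
```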
