Add assistant examples to next pages (#1873)
jeremyphilemon authored Jun 6, 2024
1 parent 78f04a7 commit 9fb3b4d
Showing 4 changed files with 255 additions and 3 deletions.
6 changes: 3 additions & 3 deletions content/examples/01-next-app/06-assistants/index.mdx
@@ -1,8 +1,8 @@
 ---
-title: Assistants
-description: Learn to use the assistant API using the Vercel AI SDK in your Next.js application
+title: OpenAI Assistants
+description: Learn to use the OpenAI Assistant API using the Vercel AI SDK in your Next.js application
 ---

 # Assistants

-In this section, you will learn to use OpenAI's assistant API along with `ai/rsc` functions.
+In this section, you will learn to use OpenAI's Assistant API along with `ai/rsc` functions.
@@ -0,0 +1,85 @@
---
title: Stream Assistant Responses
description: Learn to stream assistant responses using the Vercel AI SDK in your Next.js Pages Router application
---

# Stream Assistant Responses

## Client

Let's create a simple chat interface that allows users to send messages to the assistant and receive responses. You will integrate the `useAssistant` hook from `ai/react` to stream the messages and status.

```tsx filename='pages/index.tsx'
import { Message, useAssistant } from 'ai/react';

export default function Page() {
  const { status, messages, input, submitMessage, handleInputChange } =
    useAssistant({ api: '/api/assistant' });

  return (
    <div className="flex flex-col gap-2">
      <div className="p-2">status: {status}</div>

      <div className="flex flex-col p-2 gap-2">
        {messages.map((message: Message) => (
          <div key={message.id} className="flex flex-row gap-2">
            <div className="w-24 text-zinc-500">{`${message.role}: `}</div>
            <div className="w-full">{message.content}</div>
          </div>
        ))}
      </div>

      <form onSubmit={submitMessage} className="fixed bottom-0 p-2 w-full">
        <input
          disabled={status !== 'awaiting_message'}
          value={input}
          onChange={handleInputChange}
          className="bg-zinc-100 w-full p-2"
        />
      </form>
    </div>
  );
}
```
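
The `status` returned by `useAssistant` is `'awaiting_message'` while the assistant is idle and `'in_progress'` while a run is streaming, which is why the input above is disabled mid-run. If you also want a visible loading indicator, a minimal sketch of a small helper component (a hypothetical `ThinkingIndicator`, not part of the SDK) could look like this:

```tsx filename='components/thinking-indicator.tsx'
// Hypothetical helper component: mirrors the two status values
// exposed by the `useAssistant` hook.
type AssistantStatus = 'in_progress' | 'awaiting_message';

export function ThinkingIndicator({ status }: { status: AssistantStatus }) {
  // Only render while a run is streaming.
  if (status !== 'in_progress') return null;
  return <div className="p-2 text-zinc-500">The assistant is thinking…</div>;
}
```

You could render it as `<ThinkingIndicator status={status} />` next to the status line in the page above.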

## Server

Next, you will create an API route at `/api/assistant` to handle the assistant's messages and responses. You will use the `AssistantResponse` function from `ai` to stream the assistant's responses back to the `useAssistant` hook on the client.

```tsx filename='app/api/assistant/route.ts'
import OpenAI from 'openai';
import { AssistantResponse } from 'ai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY || '',
});

export async function POST(req: Request) {
  const input: {
    threadId: string | null;
    message: string;
  } = await req.json();

  // Create a new thread if the client did not provide one
  const threadId = input.threadId ?? (await openai.beta.threads.create({})).id;

  // Add the user's message to the thread
  const createdMessage = await openai.beta.threads.messages.create(threadId, {
    role: 'user',
    content: input.message,
  });

  return AssistantResponse(
    { threadId, messageId: createdMessage.id },
    async ({ forwardStream }) => {
      // Run the assistant on the thread and forward the streamed response
      const runStream = openai.beta.threads.runs.stream(threadId, {
        assistant_id:
          process.env.ASSISTANT_ID ??
          (() => {
            throw new Error('ASSISTANT_ID environment variable is not set');
          })(),
      });

      await forwardStream(runStream);
    },
  );
}
```
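
The route reads the assistant's id from the `ASSISTANT_ID` environment variable. If you have not created an assistant yet, you can create one on the OpenAI [Assistant Dashboard](https://platform.openai.com/assistants) or with a one-off script. A minimal sketch, assuming the `openai` Node SDK and a placeholder model and instructions:

```ts filename='scripts/create-assistant.ts'
// One-off script: create an assistant and print its id so it can be
// stored in the ASSISTANT_ID environment variable.
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY || '',
});

async function main() {
  const assistant = await openai.beta.assistants.create({
    // Placeholder model and instructions; adjust to your use case.
    model: 'gpt-4o',
    name: 'Example Assistant',
    instructions: 'You are a helpful assistant.',
  });

  console.log(`ASSISTANT_ID=${assistant.id}`);
}

main().catch(console.error);
```
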
@@ -0,0 +1,142 @@
---
title: Stream Assistant Responses with Tools
description: Learn to stream assistant responses with tools using the Vercel AI SDK in your Next.js Pages Router application
---

# Stream Assistant Responses with Tools

Let's create a simple chat interface that allows users to send messages to the assistant and receive responses, and gives the assistant the ability to use tools. You will integrate the `useAssistant` hook from `ai/react` to stream the messages and status.

You will need to provide the list of tools on the OpenAI [Assistant Dashboard](https://platform.openai.com/assistants). You can use the following schema to create a tool that converts Celsius to Fahrenheit.

```json
{
  "name": "celsiusToFahrenheit",
  "description": "convert celsius to fahrenheit.",
  "parameters": {
    "type": "object",
    "properties": {
      "value": {
        "type": "number",
        "description": "the value in celsius."
      }
    },
    "required": ["value"]
  }
}
```
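
If you prefer to configure the assistant in code rather than on the dashboard, the same function tool can be attached when the assistant is created. A minimal sketch, assuming the `openai` Node SDK and a placeholder model and instructions:

```ts filename='scripts/create-assistant-with-tools.ts'
// One-off script: create an assistant with the celsiusToFahrenheit function
// tool attached, then print its id for the ASSISTANT_ID environment variable.
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY || '',
});

async function main() {
  const assistant = await openai.beta.assistants.create({
    // Placeholder model and instructions; adjust to your use case.
    model: 'gpt-4o',
    name: 'Unit Conversion Assistant',
    instructions: 'Use the celsiusToFahrenheit tool to convert temperatures.',
    tools: [
      {
        type: 'function',
        function: {
          name: 'celsiusToFahrenheit',
          description: 'convert celsius to fahrenheit.',
          parameters: {
            type: 'object',
            properties: {
              value: { type: 'number', description: 'the value in celsius.' },
            },
            required: ['value'],
          },
        },
      },
    ],
  });

  console.log(`ASSISTANT_ID=${assistant.id}`);
}

main().catch(console.error);
```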

## Client

The chat interface is the same as before: you will integrate the `useAssistant` hook from `ai/react` to stream the messages and status.

```tsx filename='pages/index.tsx'
import { Message, useAssistant } from 'ai/react';

export default function Page() {
  const { status, messages, input, submitMessage, handleInputChange } =
    useAssistant({ api: '/api/assistant' });

  return (
    <div className="flex flex-col gap-2">
      <div className="p-2">status: {status}</div>

      <div className="flex flex-col p-2 gap-2">
        {messages.map((message: Message) => (
          <div key={message.id} className="flex flex-row gap-2">
            <div className="w-24 text-zinc-500">{`${message.role}: `}</div>
            <div className="w-full">{message.content}</div>
          </div>
        ))}
      </div>

      <form onSubmit={submitMessage} className="fixed bottom-0 p-2 w-full">
        <input
          disabled={status !== 'awaiting_message'}
          value={input}
          onChange={handleInputChange}
          className="bg-zinc-100 w-full p-2"
        />
      </form>
    </div>
  );
}
```

## Server

Next, you will create an API route at `/api/assistant` to handle the assistant's messages and responses. You will use the `AssistantResponse` function from `ai` to stream the assistant's responses back to the `useAssistant` hook on the client.

```tsx filename='app/api/assistant/route.ts'
import { AssistantResponse } from 'ai';
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY || '',
});

export async function POST(req: Request) {
  const input: {
    threadId: string | null;
    message: string;
  } = await req.json();

  // Create a new thread if the client did not provide one
  const threadId = input.threadId ?? (await openai.beta.threads.create({})).id;

  // Add the user's message to the thread
  const createdMessage = await openai.beta.threads.messages.create(threadId, {
    role: 'user',
    content: input.message,
  });

  return AssistantResponse(
    { threadId, messageId: createdMessage.id },
    async ({ forwardStream }) => {
      const runStream = openai.beta.threads.runs.stream(threadId, {
        assistant_id:
          process.env.ASSISTANT_ID ??
          (() => {
            throw new Error('ASSISTANT_ID environment variable is not set');
          })(),
      });

      let runResult = await forwardStream(runStream);

      // While the run is waiting on tool outputs, execute the requested
      // tool calls and submit their results back to the run
      while (
        runResult?.status === 'requires_action' &&
        runResult.required_action?.type === 'submit_tool_outputs'
      ) {
        const tool_outputs =
          runResult.required_action.submit_tool_outputs.tool_calls.map(
            (toolCall: any) => {
              const parameters = JSON.parse(toolCall.function.arguments);

              switch (toolCall.function.name) {
                case 'celsiusToFahrenheit':
                  const celsius = parseFloat(parameters.value);
                  const fahrenheit = celsius * (9 / 5) + 32;

                  return {
                    tool_call_id: toolCall.id,
                    output: `${celsius}°C is ${fahrenheit.toFixed(2)}°F`,
                  };

                default:
                  throw new Error(
                    `Unknown tool call function: ${toolCall.function.name}`,
                  );
              }
            },
          );

        // Submit the tool outputs and continue streaming the run
        runResult = await forwardStream(
          openai.beta.threads.runs.submitToolOutputsStream(
            threadId,
            runResult.id,
            { tool_outputs },
          ),
        );
      }
    },
  );
}
```
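
As a quick check of the tool flow: if the user asks about 25 degrees Celsius, the assistant requests the `celsiusToFahrenheit` tool with `{ "value": 25 }`, the handler computes 25 × 9/5 + 32 = 77 and submits `25°C is 77.00°F` as the tool output, and the run then resumes streaming the final answer back to the client.
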
25 changes: 25 additions & 0 deletions content/examples/02-next-pages/15-assistants/index.mdx
@@ -0,0 +1,25 @@
---
title: OpenAI Assistants
description: Learn to use the OpenAI Assistant API using the Vercel AI SDK in your Next.js Pages Router application
---

# Assistants

In this section, you will learn to use OpenAI's Assistant API along with the `useAssistant` hook from `ai/react`.

<IndexCards
  cards={[
    {
      title: 'Stream Assistant Responses',
      description:
        'Learn to stream assistant responses from the OpenAI Assistant API.',
      href: '/examples/next-pages/assistants/stream-assistant-responses',
    },
    {
      title: 'Stream Assistant Responses with Tools',
      description:
        'Learn to stream assistant responses with tool calls from the OpenAI Assistant API.',
      href: '/examples/next-pages/assistants/stream-assistant-responses-with-tools',
    },
  ]}
/>
