Adding AI conversation section #8094

Open · wants to merge 4 commits into base: main
Changes from 3 commits
3 changes: 2 additions & 1 deletion cspell.json
@@ -1613,7 +1613,8 @@
"voteField",
"ampx",
"autodetection",
"jamba"
"jamba",
"knowledgebases"

Review comment: nit: shouldn't this be two words?

Review comment: nevermind, this is in a path

Author: Yup

],
"flagWords": ["hte", "full-stack", "Full-stack", "Full-Stack", "sudo"],
"patterns": [
182 changes: 182 additions & 0 deletions src/pages/[platform]/ai/conversation/context/index.mdx
@@ -0,0 +1,182 @@
import { getCustomStaticPath } from "@/utils/getCustomStaticPath";

export const meta = {
  title: "Context",
  description:
    "How to pass client-side context to the LLM to help it respond.",
  platforms: [
    "javascript",
    "react-native",
    "angular",
    "nextjs",
    "react",
    "vue",
  ],
};

export const getStaticPaths = async () => {
  return getCustomStaticPath(meta.platforms);
};

export function getStaticProps(context) {
  return {
    props: {
      platform: context.params.platform,
      meta,
      showBreadcrumbs: false,
    },
  };
}



For LLMs to provide high-quality answers to users' questions, they need to have the right information. Sometimes this information is contextual, based on the user or the state of the application. To allow for this, you can send `aiContext` with any user message to the LLM, which can be any unstructured or structured data that might be useful.

<InlineFilter filters={["javascript","vue","angular"]}>

```ts
import { generateClient } from "aws-amplify/data";
import type { Schema } from "../amplify/data/resource";

const client = generateClient<Schema>({ authMode: 'userPool' });

const { data: conversation } = await client.conversations.chat.create();

conversation.sendMessage({
  content: [{ text: "hello" }],
  // aiContext can be any shape
  aiContext: {
    username: "danny"
  }
});
```

</InlineFilter>


<InlineFilter filters={["react-native"]}>

```tsx
export default function Chat() {
  const [
    {
      data: { messages },
      isLoading,
    },
    sendMessage,
  ] = useAIConversation('chat');

  function handleSendMessage(message) {
    sendMessage({
      ...message,
      // this can be any object that can be stringified
      aiContext: {
        currentTime: new Date().toLocaleTimeString()
      }
    });
  }

  return (
    //...
  );
}
```

</InlineFilter>


<InlineFilter filters={["react", "nextjs"]}>

```tsx
function Chat() {
  const [
    {
      data: { messages },
      isLoading,
    },
    sendMessage,
  ] = useAIConversation('chat');

  return (
    <AIConversation
      messages={messages}
      isLoading={isLoading}
      handleSendMessage={sendMessage}
      // This will let the LLM know about the current state of this application
      // so it can better respond to questions
      aiContext={() => {
        return {
          currentTime: new Date().toLocaleTimeString(),
        };
      }}
    />
  );
}
```


The function passed to the `aiContext` prop runs immediately before the request is sent, so the LLM receives the most up-to-date information.

You can use React context or another state management system to update the data passed to `aiContext`. Using React context, we can provide more information about the current state of the application:

```tsx
// Create a context to share state across components
const DataContext = React.createContext<{
  data: any;
  setData: (value: React.SetStateAction<any>) => void;
}>({ data: {}, setData: () => {} });

// Create a component that updates the shared state
function Counter() {
  const { data, setData } = React.useContext(DataContext);
  const count = data.count ?? 0;
  return (
    <Button onClick={() => setData({ ...data, count: count + 1 })}>
      {count}
    </Button>
  );
}

// reference shared data in aiContext
function Chat() {
  const { data } = React.useContext(DataContext);
  const [
    {
      data: { messages },
      isLoading,
    },
    sendMessage,
  ] = useAIConversation('pirateChat');

  return (
    <AIConversation
      messages={messages}
      isLoading={isLoading}
      handleSendMessage={sendMessage}
      // This will let the LLM know about the current state of this application
      // so it can better respond to questions
      aiContext={() => {
        return {
          ...data,
          currentTime: new Date().toLocaleTimeString(),
        };
      }}
    />
  );
}

export default function Example() {
  const [data, setData] = React.useState({});
  return (
    <Authenticator>
      <DataContext.Provider value={{ data, setData }}>
        <Counter />
        <Chat />
      </DataContext.Provider>
    </Authenticator>
  );
}
```


</InlineFilter>
106 changes: 106 additions & 0 deletions src/pages/[platform]/ai/conversation/history/index.mdx
@@ -0,0 +1,106 @@
import { getCustomStaticPath } from "@/utils/getCustomStaticPath";

export const meta = {
  title: "Conversation History",
  description:
    "Learn how Amplify AI kit takes care of conversation history",
  platforms: [
    "javascript",
    "react-native",
    "angular",
    "nextjs",
    "react",
    "vue",
  ],
};

export const getStaticPaths = async () => {
  return getCustomStaticPath(meta.platforms);
};

export function getStaticProps(context) {
  return {
    props: {
      platform: context.params.platform,
      meta,
      showBreadcrumbs: false,
    },
  };
}

The Amplify AI kit automatically and securely stores conversation history per user so you can easily resume past conversations.

<Callout>

If you are looking for a quick way to get started with conversation history, [this example project](https://github.com/aws-samples/amplify-ai-examples/tree/main/claude-ai) has an interface similar to ChatGPT or Claude, where users see and manage past conversations in a sidebar.

</Callout>

When you define a conversation route in your Amplify data schema, the Amplify AI kit turns it into two data models: `Conversation` and `Message`. The `Conversation` model works much like any other data model in your schema: you can list and filter conversations (because they use owner-based authorization, users will only see their own), and you can get a specific conversation by ID. Once you have a conversation instance, you can load its messages if there are any, send messages to it, and subscribe to the stream events being sent back.
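Putting those pieces together on the client looks roughly like this (a sketch; `chat` is the conversation route name from your schema, `conversationId` is a placeholder for an ID you retrieved earlier, and the exact `onStreamEvent` callback shape should be verified against your Amplify version):

```ts
import { generateClient } from "aws-amplify/data";
import type { Schema } from "../amplify/data/resource";

const client = generateClient<Schema>({ authMode: 'userPool' });

// resume an existing conversation (or use .create() for a new one)
const { data: conversation } = await client.conversations.chat.get({ id: conversationId });

// load existing messages, if any
const { data: messages } = await conversation.listMessages();

// subscribe to the assistant's streamed responses
const subscription = conversation.onStreamEvent({
  next: (event) => console.log(event),
  error: (error) => console.error(error),
});

// send a new message into the conversation
conversation.sendMessage({ content: [{ text: "hello" }] });
```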


## Listing conversations

To list all the conversations a user has, use the `.list()` method. It works the same way as for any other Amplify data model, and you can optionally pass a `limit` or `nextToken`.

```ts
const { data: conversations } = await client.conversations.chat.list()
```

The `updatedAt` field gets updated when new messages are sent, so you can use that to see which conversation had the most recent message. Conversations retrieved via `.list()` are sorted in descending order by `updatedAt`.

### Pagination
The result of `.list()` contains a `nextToken` property. This can be used to retrieve subsequent pages of conversations.

```ts
const { data: conversations, nextToken } = await client.conversations.chat.list();

// retrieve next page
if (nextToken) {
  const { data: nextPageConversations } = await client.conversations.chat.list({
    nextToken
  });
}
```

Conversations also have `name` and `metadata` fields you can use to more easily find and resume past conversations. `name` is a string and `metadata` is a JSON object, so you can store any extra information you need.
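The `nextToken` pattern can also be factored into a small generic helper. This is a sketch that is independent of Amplify: it assumes only that each page call resolves to `data` plus an optional `nextToken`, as `.list()` does.

```typescript
type Page<T> = { data: T[]; nextToken?: string | null };

// Follow nextToken until every page has been retrieved.
async function listAll<T>(
  listPage: (args?: { nextToken?: string }) => Promise<Page<T>>
): Promise<T[]> {
  const items: T[] = [];
  let nextToken: string | null | undefined;
  do {
    const page = await listPage(nextToken ? { nextToken } : undefined);
    items.push(...page.data);
    nextToken = page.nextToken;
  } while (nextToken);
  return items;
}
```

You could then call it as `const all = await listAll((args) => client.conversations.chat.list(args));`, keeping in mind that fetching every page of a large history at once may be slower than paging on demand.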

## Resuming conversations

You can resume a conversation by calling the `.get()` method with a conversation ID. Both `.create()` and `.get()` return a conversation instance.

<InlineFilter filters={['javascript','vue','angular']}>

```ts
// list all conversations a user has
// make sure the user has been authenticated with Amplify Auth
const { data: conversations } = await client.conversations.chat.list();

// Retrieve a specific conversation
const { data: conversation } = await client.conversations.chat.get({ id: conversations[0].id });

// list the existing messages in the conversation
const { data: messages } = await conversation.listMessages();

// You can now send a message to the conversation
conversation.sendMessage({
  content: [
    { text: "hello" }
  ]
});
```

</InlineFilter>

<InlineFilter filters={['react','nextjs','react-native']}>

```tsx
export function Chat({ id }) {
  const [
    {
      data: { messages },
      isLoading,
    },
    sendMessage,
  ] = useAIConversation('chat', { id });

  // existing messages are loaded into `messages`;
  // sendMessage continues this same conversation
}
```

</InlineFilter>

79 changes: 79 additions & 0 deletions src/pages/[platform]/ai/conversation/index.mdx
@@ -0,0 +1,79 @@
import { getChildPageNodes } from '@/utils/getChildPageNodes';
import { getCustomStaticPath } from "@/utils/getCustomStaticPath";

export const meta = {
  title: "Conversation",
  description:
    "Learn about conversational AI patterns and how to implement them in Amplify.",
  route: '/[platform]/ai/conversation',
  platforms: [
    "javascript",
    "react-native",
    "angular",
    "nextjs",
    "react",
    "vue",
  ],
};

export const getStaticPaths = async () => {
  return getCustomStaticPath(meta.platforms);
};

export function getStaticProps(context) {
  const childPageNodes = getChildPageNodes(meta.route);
  return {
    props: {
      meta,
      childPageNodes,
      showBreadcrumbs: false,
    }
  };
}


The conversation route simplifies the creation of AI-powered conversation interfaces in your application. It automatically sets up the necessary AppSync API components and Lambda functions to handle streaming multi-turn interactions with Amazon Bedrock foundation models.
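For reference, a conversation route is defined in your data schema, and the components described below are provisioned from it. This is a minimal sketch; the route name, model name, and system prompt are placeholder values:

```ts
// amplify/data/resource.ts
import { type ClientSchema, a, defineData } from '@aws-amplify/backend';

const schema = a.schema({
  chat: a.conversation({
    aiModel: a.ai.model('Claude 3.5 Haiku'),
    systemPrompt: 'You are a helpful assistant',
  })
    .authorization((allow) => allow.owner()),
});

export type Schema = ClientSchema<typeof schema>;
export const data = defineData({ schema });
```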

## Key Components

1. **AppSync API**: Gateway to the conversation route.
   - Creates new conversation route instances.
   - Sends messages to a conversation route instance.
   - Subscribes to real-time updates for assistant responses.

2. **Lambda Function**: Bridge between AppSync and Amazon Bedrock.
   - Retrieves conversation instance history.
   - Invokes Bedrock's `/converse` endpoint.
   - Handles tool use responses by invoking AppSync queries.

3. **DynamoDB**: Stores conversation and message data.
   - Conversations are scoped to a specific application user.

## Authentication Flow

1. The user's OIDC access token is passed from the client to AppSync.
2. AppSync forwards this token to the Lambda function.
3. The Lambda function uses the token to authenticate requests back to AppSync.

## Usage Scenarios

Each of the following scenarios has safeguards in place to mitigate risks associated with invoking tools on behalf of the user, including:

- OIDC access tokens are redacted from the Lambda function's Amazon CloudWatch logs.
- IAM policies limit the Lambda function's ability to access other resources.


## Data Flow

1. The user sends a message via an AppSync mutation.
2. AppSync triggers the Lambda function (default or custom).
3. The Lambda function processes the message and invokes Bedrock's `/converse` endpoint.
   a. If the response is a tool use, the Lambda function invokes the applicable AppSync query.
4. The Lambda function sends the assistant response back to AppSync.
5. AppSync delivers the response to subscribed clients.

This design allows for real-time, scalable conversations while ensuring that the Lambda function's data access matches that of the application user.

## Next Steps

<Overview childPageNodes={props.childPageNodes} />