Adding AI conversation section #8094
Open
dbanksdesign wants to merge 4 commits into main from ai-conversation-section
+952 −1
Changes from 3 commits
@@ -0,0 +1,182 @@
import { getCustomStaticPath } from "@/utils/getCustomStaticPath";

export const meta = {
  title: "Context",
  description:
    "How to pass client-side context to the LLM to help it respond.",
  platforms: [
    "javascript",
    "react-native",
    "angular",
    "nextjs",
    "react",
    "vue",
  ],
};

export const getStaticPaths = async () => {
  return getCustomStaticPath(meta.platforms);
};

export function getStaticProps(context) {
  return {
    props: {
      platform: context.params.platform,
      meta,
      showBreadcrumbs: false,
    },
  };
}
For LLMs to provide high-quality answers to users' questions, they need to have the right information. Sometimes this information is contextual, based on the user or the state of the application. To support this, you can send `aiContext` with any user message to the LLM, which can be any unstructured or structured data that might be useful.
<InlineFilter filters={["javascript","vue","angular"]}>

```ts
import { generateClient } from "aws-amplify/data";
import type { Schema } from "../amplify/data/resource";

const client = generateClient<Schema>({ authMode: 'userPool' });

const { data: conversation } = await client.conversations.chat.create();

conversation.sendMessage({
  content: [{ text: "hello" }],
  // aiContext can be any shape
  aiContext: {
    username: "danny"
  }
});
```

</InlineFilter>
<InlineFilter filters={["react-native"]}>

```tsx
export default function Chat() {
  const [
    {
      data: { messages },
      isLoading,
    },
    sendMessage,
  ] = useAIConversation('chat');

  function handleSendMessage(message) {
    sendMessage({
      ...message,
      // this can be any object that can be stringified
      aiContext: {
        currentTime: new Date().toLocaleTimeString()
      }
    });
  }

  return (
    //...
  );
}
```

</InlineFilter>
<InlineFilter filters={["react", "nextjs"]}>

```tsx
function Chat() {
  const [
    {
      data: { messages },
      isLoading,
    },
    sendMessage,
  ] = useAIConversation('chat');

  return (
    <AIConversation
      messages={messages}
      isLoading={isLoading}
      handleSendMessage={sendMessage}
      // This will let the LLM know about the current state of this application
      // so it can better respond to questions
      aiContext={() => {
        return {
          currentTime: new Date().toLocaleTimeString(),
        };
      }}
    />
  );
}
```
The function passed to the `aiContext` prop runs immediately before the request is sent, so the LLM always receives the most up-to-date information.
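The difference between a static object and the function form is worth a concrete sketch: a static object captures values once, while the function is re-evaluated on every send, so each request sees fresh values. A minimal, framework-free sketch of that distinction:

```typescript
let count = 0;

// Captured once at creation time: stays 0 no matter how count changes later.
const staticContext = { count };

// Evaluated on every call: sees the current value of count.
const contextFn = () => ({ count });

count = 3;

console.log(staticContext.count); // 0
console.log(contextFn().count);   // 3
```

This is why `aiContext` accepts a function: evaluating it at send time avoids shipping stale state with the message.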
You can use React context or another state management system to update the data passed to `aiContext`. With React context, you can share more information about the current state of the application across components:
```tsx
// Create a context to share state across components
const DataContext = React.createContext<{
  data: any;
  setData: (value: React.SetStateAction<any>) => void;
}>({ data: {}, setData: () => {} });

// Create a component that updates the shared state
function Counter() {
  const { data, setData } = React.useContext(DataContext);
  const count = data.count ?? 0;
  return (
    <Button onClick={() => setData({ ...data, count: count + 1 })}>
      {count}
    </Button>
  );
}

// Reference the shared data in aiContext
function Chat() {
  const { data } = React.useContext(DataContext);
  const [
    {
      data: { messages },
      isLoading,
    },
    sendMessage,
  ] = useAIConversation('pirateChat');

  return (
    <AIConversation
      messages={messages}
      isLoading={isLoading}
      handleSendMessage={sendMessage}
      // This will let the LLM know about the current state of this application
      // so it can better respond to questions
      aiContext={() => {
        return {
          ...data,
          currentTime: new Date().toLocaleTimeString(),
        };
      }}
    />
  );
}

export default function Example() {
  const [data, setData] = React.useState({});
  return (
    <Authenticator>
      <DataContext.Provider value={{ data, setData }}>
        <Counter />
        <Chat />
      </DataContext.Provider>
    </Authenticator>
  );
}
```

</InlineFilter>
@@ -0,0 +1,106 @@
import { getCustomStaticPath } from "@/utils/getCustomStaticPath";

export const meta = {
  title: "Conversation History",
  description:
    "Learn how Amplify AI kit takes care of conversation history",
  platforms: [
    "javascript",
    "react-native",
    "angular",
    "nextjs",
    "react",
    "vue",
  ],
};

export const getStaticPaths = async () => {
  return getCustomStaticPath(meta.platforms);
};

export function getStaticProps(context) {
  return {
    props: {
      platform: context.params.platform,
      meta,
      showBreadcrumbs: false,
    },
  };
}
The Amplify AI kit automatically and securely stores conversation history per user so you can easily resume past conversations.
<Callout>

If you are looking for a quick way to get started with conversation history, [this example project](https://github.com/aws-samples/amplify-ai-examples/tree/main/claude-ai) has an interface similar to ChatGPT or Claude, where users see past conversations in a sidebar they can manage.

</Callout>
When you define a conversation route in your Amplify data schema, the Amplify AI kit turns it into two data models: `Conversation` and `Message`. The `Conversation` model works much like any other data model defined in your schema: you can list and filter conversations (because they use owner-based authorization, users will only see their own conversations), and you can get a specific conversation by ID. Once you have a conversation instance, you can load any existing messages, send new messages to it, and subscribe to the stream events sent back.
## Listing conversations

To list all the conversations a user has, use the `.list()` method. It works the same way as any other Amplify data model, and optionally accepts a `limit` or `nextToken`.

```ts
const { data: conversations } = await client.conversations.chat.list();
```
The `updatedAt` field is refreshed whenever a new message is sent, so you can use it to see which conversation was most recently active. Conversations retrieved via `.list()` are sorted in descending order by `updatedAt`.
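That ordering means the most recently active conversation is simply the first element of the list. A minimal sketch of the sort order, using made-up conversation summaries in place of real `.list()` results:

```typescript
// Hypothetical conversation summaries; real objects come from .list().
const conversations = [
  { id: "b", updatedAt: "2024-06-01T10:00:00Z" },
  { id: "a", updatedAt: "2024-06-02T10:00:00Z" },
];

// Sort descending by updatedAt (ISO-8601 strings compare lexicographically),
// matching the order .list() returns.
const sorted = [...conversations].sort((x, y) =>
  y.updatedAt.localeCompare(x.updatedAt)
);

const mostRecent = sorted[0]; // the conversation with id "a"
```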
### Pagination

The result of `.list()` contains a `nextToken` property, which you can use to retrieve subsequent pages of conversations.
```ts
const { data: conversations, nextToken } = await client.conversations.chat.list();

// retrieve the next page
if (nextToken) {
  const { data: nextPageConversations } = await client.conversations.chat.list({
    nextToken
  });
}
```
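The page-following pattern above can be sketched end to end. This is a minimal sketch with a mock `list` function standing in for `client.conversations.chat.list` (same `{ data, nextToken }` result shape); the loop simply follows `nextToken` until it is absent:

```typescript
type Conversation = { id: string };
type ListResult = { data: Conversation[]; nextToken?: string };

// Mock pages standing in for real client.conversations.chat.list calls.
const pages: Record<string, ListResult> = {
  first: { data: [{ id: "c1" }], nextToken: "page2" },
  page2: { data: [{ id: "c2" }] }, // no nextToken: last page
};

async function list(args?: { nextToken?: string }): Promise<ListResult> {
  return pages[args?.nextToken ?? "first"];
}

// Accumulate every page by following nextToken until it is absent.
async function listAllConversations(): Promise<Conversation[]> {
  const all: Conversation[] = [];
  let nextToken: string | undefined;
  do {
    const { data, nextToken: token } = await list(
      nextToken ? { nextToken } : undefined
    );
    all.push(...data);
    nextToken = token;
  } while (nextToken);
  return all;
}
```

With the real client, fetching every page up front is rarely necessary; paging on demand (e.g. infinite scroll) keeps the first render fast.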
Conversations also have `name` and `metadata` fields you can use to more easily find and resume past conversations. `name` is a string and `metadata` is a JSON object, so you can store any extra information you need.
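Since `metadata` is an arbitrary JSON object, one way to pick the conversation to resume is to inspect it client-side after listing. A small sketch, where the `topic` key and the sample conversations are made up for illustration:

```typescript
type ConversationSummary = {
  id: string;
  name?: string;
  metadata?: Record<string, unknown>;
};

// Hypothetical listed conversations; "topic" is a made-up metadata key.
const conversations: ConversationSummary[] = [
  { id: "c1", name: "Trip planning", metadata: { topic: "travel" } },
  { id: "c2", name: "Weeknight dinners", metadata: { topic: "cooking" } },
];

// Pick the conversation to resume by inspecting metadata client-side.
const travel = conversations.find((c) => c.metadata?.topic === "travel");

console.log(travel?.id); // "c1"
```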
## Resuming conversations

You can resume a conversation by calling the `.get()` method with a conversation ID. Both `.create()` and `.get()` return a conversation instance.
<InlineFilter filters={['javascript','vue','angular']}>

```ts
// List all conversations the user has.
// Make sure the user has been authenticated with Amplify Auth.
const { data: conversations } = await client.conversations.chat.list();

// Retrieve a specific conversation
const { data: conversation } = await client.conversations.chat.get({ id: conversations[0].id });

// List the existing messages in the conversation
const { data: messages } = await conversation.listMessages();

// You can now send a message to the conversation
conversation.sendMessage({
  content: [
    { text: "hello" }
  ]
});
```

</InlineFilter>
<InlineFilter filters={['react','nextjs','react-native']}>

```tsx
export function Chat({ id }) {
  const [
    {
      data: { messages },
      isLoading,
    },
    sendMessage,
  ] = useAIConversation('chat', { id });

  // ...
}
```

</InlineFilter>
@@ -0,0 +1,79 @@
import { getChildPageNodes } from '@/utils/getChildPageNodes';
import { getCustomStaticPath } from "@/utils/getCustomStaticPath";

export const meta = {
  title: "Conversation",
  description:
    "Learn about conversational AI patterns and how to implement them in Amplify.",
  route: '/[platform]/ai/conversation',
  platforms: [
    "javascript",
    "react-native",
    "angular",
    "nextjs",
    "react",
    "vue",
  ],
};

export const getStaticPaths = async () => {
  return getCustomStaticPath(meta.platforms);
};

export function getStaticProps(context) {
  const childPageNodes = getChildPageNodes(meta.route);
  return {
    props: {
      meta,
      childPageNodes,
      showBreadcrumbs: false,
    }
  };
}
The conversation route simplifies the creation of AI-powered conversation interfaces in your application. It automatically sets up the necessary AppSync API components and Lambda functions to handle streaming multi-turn interactions with Amazon Bedrock foundation models.
## Key Components

1. **AppSync API**: Gateway to the conversation route.
   - Creates new conversation route instances.
   - Sends messages to a conversation route instance.
   - Subscribes clients to real-time updates for assistant responses.

2. **Lambda Function**: Bridge between AppSync and Amazon Bedrock.
   - Retrieves conversation instance history.
   - Invokes Bedrock's /converse endpoint.
   - Handles tool use responses by invoking AppSync queries.

3. **DynamoDB**: Stores conversation and message data.
   - Conversations are scoped to a specific application user.
## Authentication Flow

1. The user's OIDC access token is passed from the client to AppSync.
2. AppSync forwards this token to the Lambda function.
3. The Lambda function uses the token to authenticate requests back to AppSync.
## Usage Scenarios

Each of the following scenarios has safeguards in place to mitigate the risks of invoking tools on behalf of the user, including:

- An Amazon CloudWatch log group that redacts OIDC access tokens from the Lambda function's logs.
- IAM policies that limit the Lambda function's access to other resources.
## Data Flow

1. The user sends a message via an AppSync mutation.
2. AppSync triggers the Lambda function (default or custom).
3. The Lambda function processes the message and invokes Bedrock's /converse endpoint.
   a. If the response is a tool use, the Lambda function invokes the applicable AppSync query.
4. The Lambda function sends the assistant response back to AppSync.
5. AppSync sends the response to subscribed clients.

This design allows for real-time, scalable conversations while ensuring that the Lambda function's data access matches that of the application user.
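The tool-use branch in step 3a can be sketched generically. All names here are hypothetical (the actual Lambda handler is part of the Amplify AI kit); the sketch only illustrates the routing decision: a plain text response is passed through, while a tool-use response is dispatched to the matching query handler:

```typescript
// Simplified stand-ins for Bedrock /converse response shapes.
type BedrockResponse =
  | { kind: "text"; text: string }
  | { kind: "toolUse"; toolName: string; input: unknown };

// Hypothetical handlers standing in for the AppSync queries a tool maps to.
const toolHandlers: Record<string, (input: unknown) => string> = {
  getWeather: () => "sunny",
};

function handleResponse(res: BedrockResponse): string {
  if (res.kind === "toolUse") {
    // Step 3a: a tool-use response is routed to the applicable query.
    const handler = toolHandlers[res.toolName];
    if (!handler) throw new Error(`no handler for tool ${res.toolName}`);
    return handler(res.input);
  }
  // Step 4: a plain text response goes straight back to AppSync.
  return res.text;
}

console.log(handleResponse({ kind: "text", text: "hi" })); // "hi"
```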
## Next Steps

<Overview childPageNodes={props.childPageNodes} />
Review comments:

> nit: shouldn't this be two words?

> nevermind, this is in a path

> Yup