
aws-samples/stream-ai-assistant-using-bedrock-converse-with-tools

Create an AI Assistant with AWS Amplify, Amazon Bedrock w/ Tools, AI SDK and LangChain

Overview

This project demonstrates how to build an AI Assistant using AWS Amplify, Amazon Bedrock, Vercel AI SDK, and LangChain.js. The AI Assistant is designed to call tools and can interact with images.

Demo video: ai-assistant-demo.mp4

Key Features

  • AWS Amplify Integration: Seamlessly integrates with Amplify hosting and backend services, facilitating streamlined application deployment.
  • Amazon Bedrock: Enhances Large Language Model (LLM) capabilities via the Amazon Bedrock Converse API, supporting features like tool calling and image description.
  • Serverless Streaming: Uses AWS Lambda for response streaming, ensuring optimal performance and scalability.
  • AI SDK: Leverages Vercel AI SDK to connect the application with the LLM, delivering a refined user experience.
  • LangChain Support: Incorporates LangChain.js to leverage its comprehensive ecosystem and capabilities.

Architecture

Below is an overview of the application architecture:

Architecture Diagram

Tech Stack

Frontend

This application is based on the AWS Amplify React starter template. The design is crafted with Tailwind CSS and shadcn components, using a dashboard template for a sleek and efficient UI.

For creating a conversational user interface, the useChat() hook from Vercel AI SDK is employed.
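On the server side, messages produced by useChat() arrive as simple role/content pairs, while the Bedrock Converse API expects content as an array of content blocks. A minimal sketch of that mapping step, assuming plain text messages (the helper name and types are illustrative, not part of the project's code):

```typescript
// Shape of a chat message as produced by the useChat() hook (text-only case).
type UiMessage = { role: "user" | "assistant"; content: string };

// Shape expected by the Bedrock Converse API: content is a list of blocks.
type ConverseMessage = { role: "user" | "assistant"; content: { text: string }[] };

// Hypothetical helper: wrap each message's text in a single content block.
function toConverseMessages(messages: UiMessage[]): ConverseMessage[] {
  return messages.map((m) => ({
    role: m.role,
    content: [{ text: m.content }],
  }));
}
```

In the real application the AI SDK and LangChain.js adapters handle this conversion (including image and tool-use blocks); the sketch only shows the basic text case.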

Backend

The backend is built with AWS services:

  • Lambda Functions: These functions call the Bedrock Converse API to send and receive messages from Amazon Bedrock models. Node.js 20 serves as the runtime environment.
  • Model Selection: Choose from three models supporting tool calling:
    • Anthropic Claude 3 Haiku
    • Anthropic Claude 3 Sonnet
    • Anthropic Claude 3.5 Sonnet
  • Authentication: Managed via Amplify Auth and Amazon Cognito.
  • Response Streaming: Conversations are streamed through Lambda function URLs with response streaming enabled.

Two Lambda functions are available to interact with the Bedrock Converse API: one implemented with the Vercel AI SDK (exposed at the /ai endpoint) and one implemented with LangChain.js (exposed at the /langchain endpoint).

The interface allows you to select between these two frameworks and test their respective approaches.
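When a response is streamed, the Bedrock ConverseStream output arrives as a sequence of events whose text deltas must be accumulated into the final assistant message. A hedged sketch of that accumulation step, using a simplified event type modeled on the ConverseStream output shape (the function name is illustrative):

```typescript
// Simplified subset of Bedrock ConverseStream events: text deltas and the
// final stop event (tool-use and metadata events are omitted here).
type StreamEvent =
  | { contentBlockDelta: { delta: { text?: string } } }
  | { messageStop: { stopReason: string } };

// Hypothetical helper: fold the stream's text deltas into one string.
function collectText(events: StreamEvent[]): string {
  let text = "";
  for (const ev of events) {
    if ("contentBlockDelta" in ev) {
      text += ev.contentBlockDelta.delta.text ?? "";
    }
  }
  return text;
}
```

In the deployed functions, the AI SDK and LangChain.js stream these deltas incrementally to the browser instead of buffering them; the sketch only illustrates the event shape being consumed.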

Deployment Guide

Requirements

Before deploying the assistant, ensure you have access to the following foundation models on Amazon Bedrock:

  • Anthropic Claude 3 Haiku
  • Anthropic Claude 3 Sonnet
  • Anthropic Claude 3.5 Sonnet

Refer to this guide for details. The project must be deployed in the same AWS region where these models are available.
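The three supported models are addressed by Bedrock model IDs. A small sketch of pinning them in code; the IDs below were current at the time of writing, so confirm them against the Bedrock console for your region (the map and helper name are illustrative):

```typescript
// Bedrock model IDs for the three tool-calling models used by this project.
// Verify these against the Bedrock model catalog in your target region.
const MODEL_IDS: Record<string, string> = {
  "claude-3-haiku": "anthropic.claude-3-haiku-20240307-v1:0",
  "claude-3-sonnet": "anthropic.claude-3-sonnet-20240229-v1:0",
  "claude-3.5-sonnet": "anthropic.claude-3-5-sonnet-20240620-v1:0",
};

// Hypothetical helper: resolve a friendly name to a model ID, failing fast
// on unknown names rather than sending a bad ID to Bedrock.
function resolveModelId(name: string): string {
  const id = MODEL_IDS[name];
  if (!id) throw new Error(`Unknown model: ${name}`);
  return id;
}
```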

Deploy

To deploy the project to your AWS account, first create a repository in your GitHub account using this project as a starter:

Create repository from template 🪄

Use the form in GitHub to finalize your repo's creation. Now that the repository has been created, deploy it with Amplify:

Deploy to AWS 🚀

Select GitHub. After you give Amplify access to your GitHub account via the popup window, pick the repository and main branch to deploy. Make no other changes and click through the flow to "Save and deploy".

When the build completes, visit the newly deployed branch by selecting "Visit deployed URL".

Local Development

For local development, you'll use the Amplify cloud sandbox, which offers an isolated environment with real-time updates to your cloud resources.

You'll need Node.js 20 for this:

  1. Install dependencies:

    npm install
  2. Launch the sandbox environment:

    npx ampx sandbox
  3. Open a new terminal tab and start the development server:

    npm run dev
  4. You can now access http://localhost:5173 and make changes.

Security

This project leverages Lambda functions to stream responses using Lambda URLs. These functions are protected behind a CloudFront distribution, with a Lambda@Edge function in place to verify user access tokens.
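Inside the Lambda@Edge viewer-request handler, the first step is pulling the user's access token out of the CloudFront request headers before verifying it. A minimal sketch of that extraction step, assuming the token is sent as a bearer token in the Authorization header (the actual verification against the Cognito user pool, e.g. with a JWT verifier, is omitted):

```typescript
// CloudFront event headers: lowercase header names mapped to value lists.
type CfHeaders = Record<string, { key?: string; value: string }[]>;

// Hypothetical helper: return the bearer token from the Authorization
// header, or null if it is missing or malformed. A null result should
// cause the handler to return a 401/403 response instead of forwarding
// the request to the Lambda URL origin.
function extractBearerToken(headers: CfHeaders): string | null {
  const auth = headers["authorization"]?.[0]?.value;
  if (!auth || !auth.toLowerCase().startsWith("bearer ")) return null;
  return auth.slice("bearer ".length).trim();
}
```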

To further secure these endpoints, it's advisable to implement AWS WAF. Refer to the documentation for guidance on adding WAF to your CloudFront distribution.

CORS settings are permissive by default to facilitate deployment. You can improve security by only allowing your application's domain to access the /ai and /langchain endpoints.

Currently, all CloudFront distributions within your AWS account have permission to invoke the Lambda URLs. For increased security, you can limit this permission to only the specific CloudFront distribution used by this project. To avoid a circular reference error, an additional deployment step is required. You can use the aws lambda add-permission CLI command for this purpose, as detailed in this documentation.
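As a sketch, the extra deployment step could look like the following; the function name, account ID, and distribution ID are placeholders you would replace with the values from your own deployment:

```shell
# Restrict invocation of the streaming function's Lambda URL to a single
# CloudFront distribution (placeholder resource names and IDs below).
aws lambda add-permission \
  --function-name my-streaming-function \
  --statement-id AllowCloudFrontServicePrincipal \
  --action lambda:InvokeFunctionUrl \
  --principal cloudfront.amazonaws.com \
  --source-arn "arn:aws:cloudfront::123456789012:distribution/EDFDVBD6EXAMPLE" \
  --function-url-auth-type NONE
```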

Contributing

See CONTRIBUTING for more information.

License

This library is licensed under the MIT-0 License. See the LICENSE file.
