matzar/ai-chatbot

Roadmap

Interacting with foundation models through APIs provides enhanced security, lets developers opt out of data training by default, offers greater control, and is typically more cost-effective while supporting larger token limits.

Currently, the Claude.ai client delivers a better UI/UX than, for example, the ChatGPT client, although the quality of responses can still vary.

This open-source project aims to make all leading foundation models accessible via a simple dropdown menu, enabling you to customize the UI/UX to suit your preferences, deploy it under your own domain, and use it on a day-to-day basis.

Future Development

  • Compare the cost, ease of implementation, and maintenance of Vercel AI SDK and Amazon Bedrock, and choose the best option.
  • Add the ability to select foundation models (FMs) from all leading AI companies, such as AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon, via a simple dropdown menu (see the sketch after this list).
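
A rough sketch of what that dropdown could map to, assuming a simple in-app registry of provider and model identifiers. The labels and model IDs below are illustrative, not the project's actual configuration, and would differ depending on whether the Vercel AI SDK or Amazon Bedrock is chosen:

    // Hypothetical registry mapping dropdown entries to provider/model identifiers.
    type ModelOption = {
      label: string     // text shown in the dropdown
      provider: 'ai21' | 'anthropic' | 'cohere' | 'meta' | 'mistral' | 'stability' | 'amazon'
      modelId: string   // provider- or Bedrock-specific model identifier
    }

    const MODEL_OPTIONS: ModelOption[] = [
      { label: 'Claude', provider: 'anthropic', modelId: 'anthropic.claude-v2' },
      { label: 'Command', provider: 'cohere', modelId: 'cohere.command-text-v14' },
      { label: 'Llama 2', provider: 'meta', modelId: 'meta.llama2-70b-chat-v1' },
    ]

    // The chat route would look up the selected option and call the matching provider.
    export function resolveModel(label: string): ModelOption | undefined {
      return MODEL_OPTIONS.find((option) => option.label === label)
    }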

Changelog

20 Aug 24

  • Deployed the app to a private domain using Vercel: matzar.dev

Next.js 14 and App Router-ready AI chatbot.

An open-source AI chatbot app template built with Next.js, the Vercel AI SDK, OpenAI, and Vercel KV.

Features · Model Providers · Deploy Your Own · Running locally · Authors


Features

  • Next.js App Router
  • React Server Components (RSCs), Suspense, and Server Actions
  • Vercel AI SDK for streaming chat UI (see the client-side sketch after this list)
  • Support for OpenAI (default), Anthropic, Cohere, Hugging Face, or custom AI chat models and/or LangChain
  • shadcn/ui
  • Chat History, rate limiting, and session storage with Vercel KV
  • NextAuth.js for authentication
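
A minimal client-side sketch of that streaming chat UI, assuming the AI SDK's useChat hook and a /api/chat route handler like the one sketched under Model Providers below. The component name and markup are illustrative, and import paths can differ between AI SDK versions:

    'use client'

    // Minimal streaming chat UI sketch using the AI SDK's useChat hook.
    // Assumes a POST /api/chat route handler; import paths vary by AI SDK version.
    import { useChat } from 'ai/react'

    export function Chat() {
      const { messages, input, handleInputChange, handleSubmit } = useChat()

      return (
        <form onSubmit={handleSubmit}>
          {messages.map((message) => (
            <p key={message.id}>
              {message.role}: {message.content}
            </p>
          ))}
          <input value={input} onChange={handleInputChange} placeholder="Send a message..." />
        </form>
      )
    }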

Model Providers

This template ships with OpenAI gpt-3.5-turbo as the default. However, thanks to the Vercel AI SDK, you can switch the LLM provider to Anthropic, Cohere, or Hugging Face, or use LangChain, with just a few lines of code.
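
A minimal sketch of what that switch looks like in an App Router route handler using the AI SDK's streamText API. Function and package names differ slightly between AI SDK versions, and the Anthropic model ID below is only an example:

    // app/api/chat/route.ts -- a sketch, not the template's exact handler.
    import { openai } from '@ai-sdk/openai'
    // import { anthropic } from '@ai-sdk/anthropic'
    import { streamText } from 'ai'

    export async function POST(req: Request) {
      const { messages } = await req.json()

      const result = await streamText({
        model: openai('gpt-3.5-turbo'),
        // Switching providers is the "few lines of code" change, e.g.:
        // model: anthropic('claude-3-haiku-20240307'),
        messages,
      })

      return result.toDataStreamResponse()
    }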

Deploy Your Own

You can deploy your own version of the Next.js AI Chatbot to Vercel with one click:

Deploy with Vercel

Creating a KV Database Instance

Follow the steps outlined in the quick start guide provided by Vercel. This guide will assist you in creating and configuring your KV database instance on Vercel, enabling your application to interact with it.

Remember to update your environment variables (KV_URL, KV_REST_API_URL, KV_REST_API_TOKEN, KV_REST_API_READ_ONLY_TOKEN) in the .env file with the appropriate credentials provided during the KV database setup.
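
Once those variables are set, the @vercel/kv client reads KV_REST_API_URL and KV_REST_API_TOKEN from the environment automatically. A minimal sketch of storing a chat follows; the key names and payload shape are illustrative, not the template's actual schema:

    // Minimal Vercel KV sketch; key names and payload shape are illustrative.
    import { kv } from '@vercel/kv'

    export async function saveChat(userId: string, chatId: string, messages: unknown[]) {
      // Store the chat itself as a hash...
      await kv.hset(`chat:${chatId}`, { userId, messages: JSON.stringify(messages) })
      // ...and index it in a per-user sorted set for the chat history view.
      await kv.zadd(`user:chat:${userId}`, { score: Date.now(), member: `chat:${chatId}` })
    }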

Running locally

You will need to use the environment variables defined in .env.example to run Next.js AI Chatbot. It's recommended you use Vercel Environment Variables for this, but a .env file is all that is necessary.

Note: You should not commit your .env file or it will expose secrets that will allow others to control access to your various OpenAI and authentication provider accounts.
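
If you want to sanity-check your setup before starting the app, a small script along these lines can verify the variables are present. Only the KV_* names come from this guide, so OPENAI_API_KEY and AUTH_SECRET below are assumptions about what .env.example contains:

    // check-env.ts -- optional sketch; non-KV variable names are assumptions.
    const required = [
      'OPENAI_API_KEY', // assumed name for the OpenAI key
      'AUTH_SECRET',    // assumed name for the NextAuth.js secret
      'KV_URL',
      'KV_REST_API_URL',
      'KV_REST_API_TOKEN',
      'KV_REST_API_READ_ONLY_TOKEN',
    ]

    const missing = required.filter((name) => !process.env[name])
    if (missing.length > 0) {
      console.error(`Missing environment variables: ${missing.join(', ')}`)
      process.exit(1)
    }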

  1. Install Vercel CLI: npm i -g vercel
  2. Link local instance with Vercel and GitHub accounts (creates .vercel directory): vercel link
  3. Download your environment variables: vercel env pull
  4. Install dependencies and start the development server:

    pnpm install
    pnpm dev

Your app template should now be running on localhost:3000.

Authors

This library is created by Vercel and Next.js team members, with contributions from:
