# LangGraph Cloud Example

This is an example agent to deploy with LangGraph Cloud.

> [!TIP]
> If you would rather use `requirements.txt` for managing dependencies in your LangGraph Cloud project, please check out this repository.

LangGraph is a library for building stateful, multi-actor applications with LLMs. The main use cases for LangGraph are conversational agents, long-running multi-step LLM applications, and any LLM application that would benefit from built-in support for persistent checkpoints, cycles, and human-in-the-loop interactions (i.e., LLM and human collaboration).
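For context, here is a minimal sketch of what a LangGraph graph looks like. The state fields and node name are hypothetical placeholders, not this repo's actual agent:

```python
from typing import TypedDict

from langgraph.graph import END, StateGraph


class State(TypedDict):
    question: str
    answer: str


def answer(state: State) -> dict:
    # A real agent would call an LLM here; this node just echoes the question.
    return {"answer": f"You asked: {state['question']}"}


builder = StateGraph(State)
builder.add_node("answer", answer)
builder.set_entry_point("answer")
builder.add_edge("answer", END)
graph = builder.compile()

print(graph.invoke({"question": "What is LangGraph?"}))
```

The compiled `graph` object is what a LangGraph Cloud deployment serves behind its HTTP API.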

LangGraph Cloud shortens the time-to-market for developers using LangGraph, with a one-liner command to start a production-ready HTTP microservice for your LangGraph applications, with built-in persistence. This lets you focus on the logic of your LangGraph graph, and leave the scaling and API design to us. The API is inspired by the OpenAI Assistants API, and is designed to fit in alongside your existing services.
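The deployment is driven by a `langgraph.json` configuration file at the repository root that tells LangGraph Cloud where the compiled graph lives. A minimal sketch is below; the module path and graph name are placeholders, not necessarily this repo's actual layout:

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./my_agent/agent.py:graph"
  },
  "env": ".env"
}
```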

To deploy this agent to LangGraph Cloud, first fork this repo. Then follow the instructions here to deploy to LangGraph Cloud.