Langserve example from Quickstart tutorial not working #14757
Comments
🤖 Hello @AvijeetPrasad! I'm Dosu, a friendly bot here to lend a digital hand with your LangChain issues. While the human maintainers are away, I'm here to help you debug, answer questions, and guide you if you're looking to contribute. Looking forward to assisting you!

The "404 Not Found" error you're encountering is likely due to the placeholder left in the route definition. Here's the problematic line in your code:

add_routes(app, NotImplemented)

To resolve this issue, you should replace the NotImplemented placeholder with the chain you actually want to serve, for example:

add_routes(app, my_chain)

Please ensure that the chain is defined before it is passed to add_routes.

I hope this helps! If you have any other questions or if something is unclear, please let me know.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
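For reference, this is roughly what the suggested change looks like. This is a minimal sketch only; it assumes your chain is the category_chain defined in the quickstart code below:

from fastapi import FastAPI
from langserve import add_routes

app = FastAPI()

# Placeholder from the docs template -- requests to its routes fail:
# add_routes(app, NotImplemented)

# Pass the actual runnable instead (here, the quickstart's category_chain):
add_routes(app, category_chain, path="/category_chain")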
Here's the code from the quickstart page for reference:

#!/usr/bin/env python
from typing import List

from fastapi import FastAPI
from langchain.prompts import ChatPromptTemplate
from langchain.chat_models import ChatOpenAI
from langchain.schema import BaseOutputParser
from langserve import add_routes


# 1. Chain definition
class CommaSeparatedListOutputParser(BaseOutputParser[List[str]]):
    """Parse the output of an LLM call to a comma-separated list."""

    def parse(self, text: str) -> List[str]:
        """Parse the output of an LLM call."""
        return text.strip().split(", ")


template = """You are a helpful assistant who generates comma separated lists.
A user will pass in a category, and you should generate 5 objects in that category in a comma separated list.
ONLY return a comma separated list, and nothing more."""
human_template = "{text}"

chat_prompt = ChatPromptTemplate.from_messages([
    ("system", template),
    ("human", human_template),
])
category_chain = chat_prompt | ChatOpenAI() | CommaSeparatedListOutputParser()

# 2. App definition
app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="A simple API server using LangChain's Runnable interfaces",
)

# 3. Adding chain route
add_routes(
    app,
    category_chain,
    path="/category_chain",
)

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="localhost", port=8000)
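One thing the quickstart does not spell out: add_routes mounts the chain under the path you give it, so the server has no handler at the bare root URL. A minimal client-side sketch for checking the server (assuming it is running on localhost:8000 as above) could look like this:

from langserve import RemoteRunnable

# The chain lives under the path passed to add_routes, not under "/".
remote_chain = RemoteRunnable("http://localhost:8000/category_chain/")

# The prompt declares a {text} variable, so the input is a dict with that key.
result = remote_chain.invoke({"text": "colors"})
print(result)  # e.g. ['red', 'blue', 'green', 'yellow', 'purple']

The interactive playground is likewise served under the chain's path, at http://localhost:8000/category_chain/playground/.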
I had exactly the same problem.
any solutions found for the above error ??
any solutions please
Try following the suggestion in the error message, to downgrade pydantic to v1. Per above, this would be pip install pydantic==1.10.13
I am still having issues with this. I followed the tutorial from Langchain and am still getting this issue. Tutorial link: https://www.youtube.com/watch?v=OV2ZIWFBe0s&ab_channel=LangChain
change the URL to http://localhost:8000/category_chain/playground/
@sishuoyang
When you run your server.py, you need to point the URL in your browser to the path of your endpoint (e.g. http://localhost:8000/chain/playground/), not to the root. The server needs three things: 1. the definition of the chain we just built above, 2. our FastAPI app, 3. the definition of a route from which to serve the chain, which is done with langserve.add_routes.

from typing import List
import os

from fastapi import FastAPI
# from langchain_openai import ChatOpenAI  # (for OpenAI)
from langchain_groq import ChatGroq
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langserve import add_routes

os.environ["GROQ_API_KEY"] = groq_api  # groq_api: your Groq API key, defined elsewhere

# 1. Create prompt template
system_template = "Translate the following into {language}:"
prompt_template = ChatPromptTemplate.from_messages([
    ("system", system_template),
    ("user", "{text}"),
])

# 2. Create model
model = ChatGroq(model="llama3-70b-8192")

# 3. Create parser
parser = StrOutputParser()

# 4. Create chain
chain = prompt_template | model | parser

# 5. App definition
app = FastAPI(
    title="LangChain Server",
    version="1.0",
    description="A simple API server using LangChain's Runnable interfaces",
)

# 6. Adding chain route
add_routes(
    app,
    chain,
    path="/chain",
)

if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="localhost", port=8000)
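To check that the example above is actually serving, a small smoke test could look like the sketch below. It assumes the server from the previous comment is running locally on port 8000 with the route mounted at /chain:

import requests

# LangServe exposes an /invoke endpoint under the chain's path; the request
# body wraps the chain's input under an "input" key.
response = requests.post(
    "http://localhost:8000/chain/invoke",
    json={"input": {"language": "French", "text": "hello"}},
)
print(response.json()["output"])  # e.g. "bonjour"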
System Info
Name Version Build Channel
langchain 0.0.350 pypi_0 pypi
langchain-cli 0.0.19 pypi_0 pypi
langchain-community 0.0.3 pypi_0 pypi
langchain-core 0.1.1 pypi_0 pypi
langchain-experimental 0.0.47 pypi_0 pypi
python 3.12.0 h47c9636_0_cpython conda-forge
System: macOS 14.2 (Apple M1 chip)
Who can help?
No response
Information
Related Components
Reproduction
Save the code above as serve.py
Run python serve.py
Open http://localhost:8000 in the browser

Expected behavior
Expected to see Langserve UI. Got the following error instead.
LANGSERVE: Playground for chain "/category_chain/" is live at:
LANGSERVE: │
LANGSERVE: └──> /category_chain/playground/
LANGSERVE:
LANGSERVE: See all available routes at /docs/
LANGSERVE: ⚠️ Using pydantic 2.5.2. OpenAPI docs for invoke, batch, stream, stream_log endpoints will not be generated. API endpoints and playground should work as expected. If you need to see the docs, you can downgrade to pydantic 1. For example, pip install pydantic==1.10.13. See fastapi/fastapi#10360 for details.
INFO:     Application startup complete.
INFO: Uvicorn running on http://localhost:8000 (Press CTRL+C to quit)
INFO: ::1:58516 - "GET / HTTP/1.1" 404 Not Found
INFO: ::1:58516 - "GET /favicon.ico HTTP/1.1" 404 Not Found
^CINFO: Shutting down
INFO: Waiting for application shutdown.
INFO: Application shutdown complete.
INFO: Finished server process [91610]