
TypeError when using add_routes() with a RunnableWithMessageHistory wrapper #742

Open
rfoucart opened this issue Sep 5, 2024 · 1 comment


rfoucart commented Sep 5, 2024

Hi all,

I don't know if I should post here or in the langchain-core issues (and I might do both).

I've been getting this error since I updated all the langchain-related packages: the add_routes() method no longer seems to work with a RunnableWithMessageHistory.
Here is the sample code:

import os

from fastapi import APIRouter
from langchain_anthropic.chat_models import ChatAnthropic
from langchain_core.chat_history import BaseChatMessageHistory, InMemoryChatMessageHistory
from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain_core.runnables import RunnableWithMessageHistory
from langserve import add_routes

router = APIRouter(prefix="/llm", tags=["llm"])

store = {}


def get_session_history(session_id: str) -> BaseChatMessageHistory:
    if session_id not in store:
        store[session_id] = InMemoryChatMessageHistory()
    return store[session_id]


qa_system_prompt = """You are an assistant for question-answering on various topics.
If you don't know the answer, just say that you don't know and ask for more information.
Answer in French.
"""
qa_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", qa_system_prompt),
        MessagesPlaceholder("chat_history"),
        ("human", "{input}"),
    ]
)
llm = ChatAnthropic(model="claude-3-5-sonnet-20240620", temperature=0, api_key=os.getenv("ANTHROPIC_API_KEY"))
chain = qa_prompt | llm
conversational_chain = RunnableWithMessageHistory(
    chain,
    get_session_history,
    input_messages_key="input",
    history_messages_key="chat_history",
    output_messages_key="output",
)

if __name__ == "__main__":
    question = "What is the size of the Atlantic Ocean ?"
    response = conversational_chain.invoke(
        {"input": question},
        config={
            "configurable": {"session_id": "14"}
        },
    )
    print(f"Question: {question}\nAnswer: {response.content}")
    follow_up_question = "What is its deepest point ?"
    response = conversational_chain.invoke(
        {"input": follow_up_question},
        config={
            "configurable": {"session_id": "14"}
        },
    )
    print(f"Question: {follow_up_question}\nAnswer: {response.content}")
else:
    add_routes(app=router, runnable=conversational_chain, path="/chat")

If I run this script as main, it works perfectly as intended. If not, it throws this error:

Traceback (most recent call last):
  File "~/backend/app/llm_router.py", line 42, in <module>
    add_routes(app=router, runnable=conversational_chain, path="/chat")
  File "~/backend/.venv/lib/python3.12/site-packages/langserve/server.py", line 443, in add_routes
    api_handler = APIHandler(
                  ^^^^^^^^^^^
  File "~/backend/.venv/lib/python3.12/site-packages/langserve/api_handler.py", line 671, in __init__
    output_type_ = _resolve_model(
                   ^^^^^^^^^^^^^^^
  File "~/backend/.venv/lib/python3.12/site-packages/langserve/api_handler.py", line 338, in _resolve_model
    hash_ = model.schema_json()
            ^^^^^^^^^^^^^^^^^^^
  File "~/backend/.venv/lib/python3.12/site-packages/pydantic/v1/main.py", line 675, in schema_json
    cls.schema(by_alias=by_alias, ref_template=ref_template), default=pydantic_encoder, **dumps_kwargs
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/backend/.venv/lib/python3.12/site-packages/pydantic/v1/main.py", line 664, in schema
    s = model_schema(cls, by_alias=by_alias, ref_template=ref_template)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "~/backend/.venv/lib/python3.12/site-packages/pydantic/v1/schema.py", line 188, in model_schema
    m_schema, m_definitions, nested_models = model_process_schema(
                                             ^^^^^^^^^^^^^^^^^^^^^
  File "~/backend/.venv/lib/python3.12/site-packages/pydantic/v1/schema.py", line 581, in model_process_schema
    m_schema, m_definitions, nested_models = model_type_schema(
                                             ^^^^^^^^^^^^^^^^^^
  File "~/backend/.venv/lib/python3.12/site-packages/pydantic/v1/schema.py", line 622, in model_type_schema
    f_schema, f_definitions, f_nested_models = field_schema(
                                               ^^^^^^^^^^^^^
  File "~/backend/.venv/lib/python3.12/site-packages/pydantic/v1/schema.py", line 255, in field_schema
    f_schema, f_definitions, f_nested_models = field_type_schema(
                                               ^^^^^^^^^^^^^^^^^^
  File "~/backend/.venv/lib/python3.12/site-packages/pydantic/v1/schema.py", line 527, in field_type_schema
    f_schema, f_definitions, f_nested_models = field_singleton_schema(
                                               ^^^^^^^^^^^^^^^^^^^^^^^
  File "~/backend/.venv/lib/python3.12/site-packages/pydantic/v1/schema.py", line 926, in field_singleton_schema
    if issubclass(field_type, BaseModel):
       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen abc>", line 123, in __subclasscheck__
TypeError: issubclass() arg 1 must be a class

add_routes() fails with the following package versions:

fastapi==0.112.3
langchain==0.2.16
langchain-anthropic==0.1.23
langchain-community==0.2.16
langchain-core==0.2.38
langserve==0.2.3
langsmith==0.1.114
pydantic==2.8.2
pydantic-core==2.20.1

but works fine with these versions:

fastapi==0.112.1
langchain==0.2.8
langchain-anthropic==0.1.20
langchain-community==0.2.7
langchain-core==0.2.19
langserve==0.2.2
langsmith==0.1.99
pydantic==2.8.2
pydantic_core==2.20.1

I set a breakpoint to inspect field_type at the point where the code breaks.
Here are the successive field_type values I saw before the error:

  • <class 'langchain_core.messages.base.BaseMessage'>
  • <class 'langchain_core.messages.base.BaseMessage'>
  • ForwardRef('Runnable')
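That last value looks like the culprit: issubclass() requires an actual class as its first argument, and a typing.ForwardRef is just a placeholder object, so pydantic v1's schema walker raises exactly this TypeError when it reaches the unresolved 'Runnable' reference. A minimal stdlib-only sketch of the failing check:

```python
from typing import ForwardRef

# pydantic v1's field_singleton_schema effectively does:
#   issubclass(field_type, BaseModel)
# With field_type = ForwardRef('Runnable'), that call fails because a
# ForwardRef instance is not a class (any class as the second argument
# triggers the same error).
field_type = ForwardRef("Runnable")
try:
    issubclass(field_type, object)
except TypeError as exc:
    print(exc)  # -> issubclass() arg 1 must be a class
```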

I can't figure out where this bug comes from; any help would be appreciated.
Thanks !

eyurtsev (Collaborator) commented

Should be fixed in the upcoming 0.3 release
