Checked other resources
I searched the LangChain documentation with the integrated search.
I used the GitHub search to find a similar question and didn't find it.
I am sure that this is a bug in LangChain rather than my code.
The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
Example Code
I used the swagger.yaml file from https://superset.demo.datahubproject.io/api/v1/_openapi. The spec is served as JSON, so I converted it to YAML first.
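The conversion script itself is not included in the issue; a minimal sketch of such a conversion, assuming the spec is fetched with requests and written with PyYAML (the output file name is illustrative), could look like:

import requests
import yaml

# Hypothetical helper: download the JSON OpenAPI spec and save it as YAML.
resp = requests.get("https://superset.demo.datahubproject.io/api/v1/_openapi")
resp.raise_for_status()
with open("swagger.yaml", "w") as f:
    yaml.safe_dump(resp.json(), f, sort_keys=False)

The example code from the issue then loads this swagger.yaml: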
import os
import requests
import yaml

os.environ["OPENAI_API_KEY"] = "sk-REDACTED"

from langchain_community.agent_toolkits.openapi import planner
from langchain_openai.chat_models import ChatOpenAI
from langchain_community.agent_toolkits.openapi.spec import reduce_openapi_spec
from langchain.requests import RequestsWrapper
from requests.packages.urllib3.exceptions import InsecureRequestWarning

# Ignore SSL warnings
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)

with open("/home/ehkim/git/testprj/code_snippet/swagger.yaml") as f:
    data = yaml.load(f, Loader=yaml.FullLoader)

swagger_api_spec = reduce_openapi_spec(data)

def construct_superset_aut_headers(url=None):
    import requests
    url = "https://superset.mydomain.com/api/v1/security/login"
    payload = {
        "username": "myusername",
        "password": "mypassword",
        "provider": "db",
        "refresh": True,
    }
    headers = {"Content-Type": "application/json"}
    response = requests.post(url, json=payload, headers=headers, verify=False)
    data = response.json()
    return {"Authorization": f"Bearer {data['access_token']}"}

from langchain.globals import set_debug
set_debug(True)

llm = ChatOpenAI(model="gpt-4o")
swagger_requests_wrapper = RequestsWrapper(headers=construct_superset_aut_headers(), verify=False)
superset_agent = planner.create_openapi_agent(
    swagger_api_spec,
    swagger_requests_wrapper,
    llm,
    allow_dangerous_requests=True,
    handle_parsing_errors=True,
)
superset_agent.run(
    """
    1. Get the dataset using the following information. (tool: requests_post, API: /api/v1/dataset/get_or_create/, database_id: 1, table_name: raw_esg_volume, response: {{'result': {{'table_id': (dataset_id)}}}})
    2. Retrieve the dataset information obtained in step 1. (tool: requests_get, API: /api/v1/dataset/dataset/{{dataset_id}}/, params: None)
    3. Create a chart referencing the dataset obtained in step 2. The chart should plot the trend of total, online_news, and (total - online_news) values as a line chart. (tool: requests_post, API: /api/v1/chart/, database_id: 1)
    4. Return the URL of the created chart. https://superset.mydomain.com/explore/?slice_id={{chart_id}}
    When specifying the action, only write the tool name without any additional explanation.
    """
)
Error Message and Stack Trace (if applicable)
(myenv) ehkim@ehkim-400TEA-400SEA:~/git/testprj/code_snippet$ python openapi-agent.py
/home/ehkim/anaconda3/envs/myenv/lib/python3.12/site-packages/langchain/_api/module_import.py:92: LangChainDeprecationWarning: Importing RequestsWrapper from langchain is deprecated. Please replace deprecated imports:
from langchain import RequestsWrapper
with new imports of:
from langchain_community.utilities import RequestsWrapper
You can use the langchain cli to automatically upgrade many imports. Please see documentation here https://python.langchain.com/v0.2/docs/versions/v0_2/
warn_deprecated(
Traceback (most recent call last):
File "/home/ehkim/git/testprj/code_snippet/openapi-agent.py", line 23, in
swagger_api_spec = reduce_openapi_spec(data)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ehkim/anaconda3/envs/myenv/lib/python3.12/site-packages/langchain_community/agent_toolkits/openapi/spec.py", line 53, in reduce_openapi_spec
(name, description, dereference_refs(docs, full_schema=spec))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ehkim/anaconda3/envs/myenv/lib/python3.12/site-packages/langchain_core/utils/json_schema.py", line 108, in dereference_refs
else _infer_skip_keys(schema_obj, full_schema)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ehkim/anaconda3/envs/myenv/lib/python3.12/site-packages/langchain_core/utils/json_schema.py", line 80, in _infer_skip_keys
keys += _infer_skip_keys(v, full_schema, processed_refs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ehkim/anaconda3/envs/myenv/lib/python3.12/site-packages/langchain_core/utils/json_schema.py", line 80, in _infer_skip_keys
keys += _infer_skip_keys(v, full_schema, processed_refs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ehkim/anaconda3/envs/myenv/lib/python3.12/site-packages/langchain_core/utils/json_schema.py", line 76, in _infer_skip_keys
ref = _retrieve_ref(v, full_schema)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/ehkim/anaconda3/envs/myenv/lib/python3.12/site-packages/langchain_core/utils/json_schema.py", line 17, in _retrieve_ref
out = out[int(component)]
~~~^^^^^^^^^^^^^^^^
KeyError: 400
Description
I'm trying to use the langchain library to run the OpenAPI Agent; my script interprets the OpenAPI specification with the reduce_openapi_spec function.
I expect the agent to run without any errors.
Instead, reduce_openapi_spec raises KeyError: 400.
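Based on the traceback, the failure happens while $ref paths in the spec are being dereferenced: a reference whose final component is the numeric string "400" (Superset reuses error responses keyed by status code) gets converted with int() and looked up as the integer key 400, which does not exist. A minimal reproduction sketch, using a hypothetical spec fragment rather than the real Superset spec, would be:

from langchain_core.utils.json_schema import dereference_refs

# Hypothetical minimal spec fragment: a $ref whose final path component is the
# numeric string "400", similar to Superset's reusable error responses.
spec = {
    "components": {
        "responses": {"400": {"description": "Bad request"}}
    },
    "paths": {
        "/api/v1/chart/": {
            "post": {
                "responses": {"400": {"$ref": "#/components/responses/400"}}
            }
        }
    },
}

# On the affected versions this raises KeyError: 400, because the "400" path
# component is converted with int() and looked up as an integer dict key.
dereference_refs(spec)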
dosubot bot added the Ɑ: agent (Related to agents module) and 🤖:bug (Related to a bug, vulnerability, unexpected error with an existing feature) labels on Jul 17, 2024.
A commit referencing this issue: … 400 in JSON schema processing) (#24337)
Description:
This PR fixes a KeyError: 400 that occurs in the JSON schema processing within the reduce_openapi_spec function. The _retrieve_ref function in json_schema.py was modified to handle missing components gracefully by continuing to the next component if the current one is not found. This ensures that the OpenAPI specification is fully interpreted and the agent executes without errors.
Issue:
Fixes issue #24335
Dependencies:
No additional dependencies are required for this change.
Twitter handle:
@lunara_x
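For illustration only, and not the actual patch from #24337: one way to make the ref lookup tolerant of numeric-string keys like "400" is to prefer the literal dict key and fall back to an integer index only for lists, roughly:

from copy import deepcopy

def _retrieve_ref_sketch(path: str, schema: dict) -> dict:
    # Resolve a local $ref path such as "#/components/responses/400".
    # Sketch only: try the literal string key first (so numeric-looking dict
    # keys resolve correctly) and use int() only to index into lists.
    components = path.split("/")
    if components[0] != "#":
        raise ValueError("ref paths are expected to be URI fragments starting with '#'")
    out = schema
    for component in components[1:]:
        if isinstance(out, dict) and component in out:
            out = out[component]
        elif isinstance(out, list) and component.isdigit():
            out = out[int(component)]
        else:
            raise KeyError(f"Reference '{path}' not found.")
    return deepcopy(out)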
olgamurraft pushed a commit to olgamurraft/langchain that referenced this issue on Aug 16, 2024: … KeyError: 400 in JSON schema processing) (langchain-ai#24337)
dosubot bot added the stale label (Issue has not had recent activity or appears to be solved. Stale issues will be automatically closed) on Oct 16, 2024.
System Info
(myenv) ehkim@ehkim-400TEA-400SEA:~/git/testprj/code_snippet$ pip freeze | grep langchain
langchain==0.2.8
langchain-cli==0.0.21
langchain-community==0.2.7
langchain-core==0.2.20
langchain-experimental==0.0.37
langchain-google-vertexai==0.0.3
langchain-openai==0.1.16
langchain-robocorp==0.0.3
langchain-text-splitters==0.2.2
langchainhub==0.1.15