Deployment succeeds, but calling the ReasoningEngine raises an InternalServerError with the message: "create_app() takes 0 positional arguments but 1 was given." The error occurs when invoking the agent after it has been created via remote_agent.
I am setting up a multi-agent system using create_react_agent as follows:
from langgraph.prebuilt import create_react_agent
tools = [get_weather]
agent1 = create_react_agent(model=llm, state_modifier=format_for_model, tools=tools, state_schema=AgentState)
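For context, the snippet above assumes a few helpers that are not shown (get_weather, format_for_model, AgentState). A rough sketch of what they might look like — the names come from the issue, but the bodies below are invented for illustration:

```python
from typing import TypedDict

class AgentState(TypedDict, total=False):
    # Graph state carried through the agent; the fields here are guesses.
    messages: list
    chat_id: str

def get_weather(city: str) -> str:
    """Toy tool body, invented for illustration."""
    return f"It is sunny in {city}."

def format_for_model(state: AgentState) -> list:
    # state_modifier hook: build the message list the LLM actually sees
    # by prepending a system prompt to the conversation so far.
    return [("system", "You are a weather assistant.")] + list(state.get("messages", []))
```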
For deployment, the following code is used:
remote_agent = reasoning_engines.ReasoningEngine.create(
    SampleAgent(project=PROJECT_ID, location=LOCATION),
    requirements=[
        "google-cloud-aiplatform[langchain,reasoningengine]",
        "cloudpickle==3.1.0",
        "pydantic",
        "langgraph",
        "langchain-google-community",
        "google-cloud-discoveryengine",
    ],
    display_name=INSTANCE_NAME,
    description="test langgraph toolnode",
    extra_packages=["prompt.yaml"],
)
remote_agent.query(message=message, chat_id="demo_1")
Relevant log output
Log ERROR:
DEFAULT 2024-12-16T16:39:27.223290Z INFO: Application startup complete.
DEFAULT 2024-12-16T16:39:27.223330Z INFO: Application startup complete.
DEFAULT 2024-12-16T16:39:27.223422Z INFO: ASGI 'lifespan' protocol appears unsupported.
DEFAULT 2024-12-16T16:39:27.223535Z INFO: Application startup complete.
DEFAULT 2024-12-16T16:39:27.224799Z INFO: Started server process [7]
DEFAULT 2024-12-16T16:39:27.224859Z INFO: Waiting for application startup.
DEFAULT 2024-12-16T16:39:27.224984Z INFO: ASGI 'lifespan' protocol appears unsupported.
DEFAULT 2024-12-16T16:39:27.225038Z INFO: Application startup complete.
"POST /api/reasoning_engine HTTP/1.1" 500 Internal Server Error
"create_app() takes 0 positional arguments but 1 was given."

STACK TRACE:

---------------------------------------------------------------------------
_InactiveRpcError                         Traceback (most recent call last)
File \myenv\Lib\site-packages\google\api_core\grpc_helpers.py:76, in _wrap_unary_errors.<locals>.error_remapped_callable(*args, **kwargs)
     75 try:
---> 76     return callable_(*args, **kwargs)
     77 except grpc.RpcError as exc:

File \myenv\Lib\site-packages\grpc\_channel.py:1181, in _UnaryUnaryMultiCallable.__call__(self, request, timeout, metadata, credentials, wait_for_ready, compression)
   1175 (
   1176     state,
   1177     call,
   1178 ) = self._blocking(
   1179     request, timeout, metadata, credentials, wait_for_ready, compression
   1180 )
-> 1181 return _end_unary_response_blocking(state, call, False, None)

File \myenv\Lib\site-packages\grpc\_channel.py:1006, in _end_unary_response_blocking(state, call, with_call, deadline)
   1005 else:
-> 1006     raise _InactiveRpcError(state)

_InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
    status = StatusCode.FAILED_PRECONDITION
    details = "Reasoning Engine Execution failed. Please navigate to the Cloud Console Log Explorer page (https://console.cloud.google.com/logs/query;query=resource.type="aiplatform.googleapis.com%2FReasoningEngine"%0Aresource.labels.reasoning_engine_id=~""?project=) to view the specific errors. Additionally, please check our troubleshooting guide (https://cloud.google.com/vertex-ai/generative-ai/docs/reasoning-engine/troubleshooting/use) for more information, or visit our Cloud community forum (https://www.googlecloudcommunity.com/gc/AI-ML/bd-p/cloud-ai-ml) or GitHub repository (https://github.com/googleapis/python-aiplatform/issues) to find answers and ask for help. Error Details: Internal Server Error"
    debug_error_string = "UNKNOWN:Error received from peer ipv6:%5%5D:43 {grpc_message:"Reasoning Engine Execution failed.\nPlease navigate to the Cloud Console Log Explorer page (https://console.cloud.google.com/logs/query;query=resource.type=\"aiplatform.googleapis.com%2FReasoningEngine\"%0Aresource.labels.reasoning_engine_id=~\"\"?project=) to view the specific errors. Additionally, please check our troubleshooting guide (https://cloud.google.com/vertex-ai/generative-ai/docs/reasoning-engine/troubleshooting/use) for more information, or visit our Cloud community forum (https://www.googlecloudcommunity.com/gc/AI-ML/bd-p/cloud-ai-ml) or GitHub repository (https://github.com/googleapis/python-aiplatform/issues) to find answers and ask for help.\nError Details: Internal Server Error", grpc_status:9, created_time:"2024-12-16T16:05:39.6172289+00:00"}"
>

The above exception was the direct cause of the following exception:

FailedPrecondition                        Traceback (most recent call last)
Cell In[119], line 2
      1 message = "What is weather today?"
----> 2 response = remote_agent.query(message=message, chat_id="demo_1")
      3 display(Markdown(response))

File \myenv\Lib\site-packages\vertexai\reasoning_engines\_reasoning_engines.py:763, in _wrap_query_operation.<locals>._method(self, **kwargs)
    762 def _method(self, **kwargs) -> _utils.JsonDict:
--> 763     response = self.execution_api_client.query_reasoning_engine(
    764         request=aip_types.QueryReasoningEngineRequest(
    765             name=self.resource_name,
    766             input=kwargs,
    767             class_method=method_name,
    768         ),
    769     )
    770     output = _utils.to_dict(response)
    771     return output.get("output", output)

File \myenv\Lib\site-packages\google\cloud\aiplatform_v1beta1\services\reasoning_engine_execution_service\client.py:795, in ReasoningEngineExecutionServiceClient.query_reasoning_engine(self, request, retry, timeout, metadata)
    792 self._validate_universe_domain()
    794 # Send the request.
--> 795 response = rpc(
    796     request,
    797     retry=retry,
    798     timeout=timeout,
    799     metadata=metadata,
    800 )
    802 # Done; return the response.
    803 return response

File \myenv\Lib\site-packages\google\api_core\gapic_v1\method.py:131, in _GapicCallable.__call__(self, timeout, retry, compression, *args, **kwargs)
    128 if self._compression is not None:
    129     kwargs["compression"] = compression
--> 131 return wrapped_func(*args, **kwargs)

File \myenv\Lib\site-packages\google\api_core\grpc_helpers.py:78, in _wrap_unary_errors.<locals>.error_remapped_callable(*args, **kwargs)
     76     return callable_(*args, **kwargs)
     77 except grpc.RpcError as exc:
---> 78     raise exceptions.from_grpc_error(exc) from exc
Code of Conduct
I agree to follow this project's Code of Conduct
Thanks for opening this issue. It appears that you're passing remote_agent.query(message=message), whereas the kwarg to the query method should be remote_agent.query(input=message). Also, there is no chat_id kwarg; instead, you can pass a unique session_id via the config kwarg. Your resulting code will then look something like:
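A minimal sketch of the corrected call, based on the note above: the message goes in the input kwarg, and the session id goes under config. The _StubAgent class below is a hypothetical stand-in so the call shape is runnable here; in real code, remote_agent is the object returned by ReasoningEngine.create(...), and the exact config shape should be checked against your agent's set_up.

```python
class _StubAgent:
    """Stand-in for the deployed ReasoningEngine, for illustration only."""

    def query(self, *, input, config=None):
        # A deployed engine forwards `input` (and any `config`) to the app;
        # here we just echo them back so the kwarg shapes are visible.
        session_id = (config or {}).get("configurable", {}).get("session_id")
        return {"echo": input, "session_id": session_id}

remote_agent = _StubAgent()
message = "What is weather today?"

# Corrected call: `input` instead of `message`, session id via `config`.
response = remote_agent.query(
    input=message,
    config={"configurable": {"session_id": "demo_1"}},
)
print(response["session_id"])  # demo_1
```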