How to use the output of one tool as input for another tool #733
-
I want to know how I can use the result of one tool as input for another tool. Below is my current source code:

```ruby
llm = Langchain::LLM::OpenAI.new(
  api_key: "na",
  llm_options: {uri_base: "http://llm.ollama.demo:11434/", log_errors: true},
  default_options: {chat_completion_model_name: "llama3.1"}
)

thread = Langchain::Thread.new

assistant = Langchain::Assistant.new(
  llm: llm,
  thread: thread,
  instructions: "You are a smart general assistant able to use tools if needed.",
  tools: [Langchain::Tool::FileSystem.new, Langchain::Tool::Wikipedia.new]
)

user_message = "Write a summary about Steve Jobs and write it to summary.md"
assistant.add_message_and_run(content: user_message, auto_tool_execution: true)
```

The LLM returns two tool calls. However, how can I ensure that the output from the Wikipedia tool is used as input for the FileSystem tool?
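For reference, this is roughly how I look at what the run returns. It is only a debugging sketch, and it assumes the returned message objects respond to `role` and `content` (as `messages.last.content` is used later in this thread):

```ruby
# Debugging sketch (not part of the original snippet): print every message
# that comes back from the run, assuming each message object responds to
# `role` and `content`.
messages = assistant.add_message_and_run(content: user_message, auto_tool_execution: true)

messages.each do |message|
  puts "#{message.role}: #{message.content}"
end
```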
-
@eliasfroehner Is using a local LLM a deal breaker for you? I just tried it out with OpenAI and it worked.
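Roughly the setup I tested, as a minimal sketch: it assumes an `OPENAI_API_KEY` environment variable, and the assistant configuration mirrors your snippet.

```ruby
# Minimal sketch of the OpenAI-backed setup, assuming OPENAI_API_KEY is set.
llm = Langchain::LLM::OpenAI.new(api_key: ENV["OPENAI_API_KEY"])

assistant = Langchain::Assistant.new(
  llm: llm,
  instructions: "You are a smart general assistant able to use tools if needed.",
  tools: [Langchain::Tool::FileSystem.new, Langchain::Tool::Wikipedia.new]
)

assistant.add_message_and_run(
  content: "Write a summary about Steve Jobs and write it to summary.md",
  auto_tool_execution: true
)
```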
-
I strung the 2 assistants (research assistant + file system assistant) together and it worked.

```ruby
# This worked better than the OpenAI class using the Ollama URL
llm = Langchain::LLM::Ollama.new # default is chat_completion_model_name: "llama3.1"

# Instantiate researcher_assistant
researcher_assistant = Langchain::Assistant.new(
  llm: llm,
  # No need to pass a Langchain::Thread.new, it gets auto-instantiated.
  instructions: "You are a smart general researcher that uses Wikipedia to write summaries on various topics.",
  tools: [Langchain::Tool::Wikipedia.new]
)

# Instantiate fs_assistant
fs_assistant = Langchain::Assistant.new(
  llm: llm,
  instructions: "You are a file system assistant.",
  tools: [Langchain::Tool::FileSystem.new]
)

# Invoke the researcher_assistant
messages = researcher_assistant.add_message_and_run(
  content: "Write a summary about Steve Jobs",
  auto_tool_execution: true
)

# If successful, the last message should be the summary
summary = messages.last.content if researcher_assistant.state == :completed

# Feed this into the fs_assistant
messages = fs_assistant.add_message_and_run(
  content: "Write this to summary.md: #{summary}",
  auto_tool_execution: true
)

# Inspect the last message
messages.last
```

Would you please give this a spin? I'm still trying to figure out the best approach to build these workflows together.
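If you end up chaining them regularly, you could wrap the hand-off in a small helper. This is just a sketch reusing the calls above; `research_and_save` is a hypothetical name, not part of the library:

```ruby
# Hypothetical helper (not part of Langchain.rb) that chains the two
# assistants from the snippet above: run the researcher, then hand its
# last message to the file system assistant.
def research_and_save(researcher_assistant, fs_assistant, topic:, path:)
  messages = researcher_assistant.add_message_and_run(
    content: "Write a summary about #{topic}",
    auto_tool_execution: true
  )
  return unless researcher_assistant.state == :completed

  summary = messages.last.content
  fs_assistant.add_message_and_run(
    content: "Write this to #{path}: #{summary}",
    auto_tool_execution: true
  )
end

research_and_save(researcher_assistant, fs_assistant, topic: "Steve Jobs", path: "summary.md")
```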