How to use the output of one tool as input for another tool #733

Closed · Answered by andreibondarev
eliasfroehner asked this question in Q&A

I strung the two assistants (the research assistant and the file system assistant) together and it worked.

# This worked better than the OpenAI class using the Ollama URL
llm = Langchain::LLM::Ollama.new # default is chat_completion_model_name: "llama3.1"

# Instantiate researcher_assistant
researcher_assistant = Langchain::Assistant.new(
  llm: llm,
  # No need to pass a Langchain::Thread.new, it gets auto-instantiated.
  instructions: "You are a smart general researcher that uses Wikipedia to write summaries on various topics.",
  tools: [Langchain::Tool::Wikipedia.new]
)

# Instantiate fs_assistant
fs_assistant = Langchain::Assistant.new(
  llm: llm,
  instructions: "You are a file system assistant."
)

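The chaining itself comes down to running the first assistant and feeding its last message into the second. Below is a minimal sketch of that hand-off, assuming the `add_message_and_run` / `messages` interface of `Langchain::Assistant`; the assistants are stubbed with a hypothetical `StubAssistant` class so the control flow is visible without a running Ollama server. With langchainrb, the stubs would be replaced by the `researcher_assistant` and `fs_assistant` instances above.

```ruby
# Hypothetical stand-in mimicking the Langchain::Assistant interface
# (add_message_and_run + messages); not part of langchainrb itself.
class StubAssistant
  Message = Struct.new(:content)

  attr_reader :messages

  def initialize(&handler)
    @handler = handler # simulates the LLM + tool run
    @messages = []
  end

  def add_message_and_run(content:)
    @messages << Message.new(@handler.call(content))
    messages
  end
end

researcher = StubAssistant.new { |topic| "Summary of #{topic}" }
fs = StubAssistant.new { |text| "Wrote: #{text}" }

# Step 1: ask the researcher assistant for a summary.
researcher.add_message_and_run(content: "Ruby (programming language)")
summary = researcher.messages.last.content

# Step 2: pass that output as input to the file system assistant.
fs.add_message_and_run(content: "Save this summary to summary.txt: #{summary}")
```

The key point is that nothing special connects the two assistants: the output of the first run is read back out of its message history and interpolated into the prompt for the second.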
Replies: 2 comments 8 replies

Answer selected by eliasfroehner