ChatOpenAI JSON mode errors in agent chain: llm.predictMessages is not a function #3344
Comments
🤖 Hello @gramliu, I'm an AI bot designed to assist you with bug fixes, answering questions, and guiding you through your contributions. While we're waiting for a human maintainer, I'll take a closer look at the issue you've reported and provide you with a thorough answer soon.
🤖 Hello @gramliu, I'm an AI bot here to help you with your questions, issues, and contributions to the langchainjs repository while we wait for a human maintainer. Let's work together to resolve your issue!

Based on the code snippets and error message you've provided, the issue appears to arise because the object you are passing to the agent does not expose a `predictMessages` method, while the agent code still calls it. To resolve this, you could consider implementing a `predictMessages` method along these lines:

```typescript
async predictMessages(
  messages: BaseMessage[],
  options?: string[] | CallOptions,
  callbacks?: Callbacks
): Promise<BaseMessage> {
  // Convert the messages to a prompt string
  const prompt = messages.map((message) => message.content).join("\n");
  // Call the OpenAI API
  const response = await this._call(prompt, options, callbacks);
  // Wrap the response as a message. Note that BaseMessage is abstract,
  // so a concrete subclass such as AIMessage should be used here.
  return new AIMessage({ content: response });
}
```

Please note that this is a simplified example and you might need to adjust it according to your specific needs and the exact structure of your model class.

If you're not comfortable making these changes yourself, I would recommend reaching out to the maintainers of the langchainjs library and asking them to implement this feature. You could do this by opening a new issue in the langchainjs repository and describing the problem and your proposed solution.

**Sources**

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
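To make the shape of that workaround concrete, here is a dependency-free sketch. `FakeLLM`, its `_call`, and the `Message` interface are all stand-ins invented for illustration, not langchainjs's real classes:

```typescript
// Stand-in message type; in langchainjs this role is played by
// BaseMessage / AIMessage.
interface Message {
  content: string;
}

class FakeLLM {
  // Stand-in for the underlying completion call.
  private async _call(prompt: string): Promise<string> {
    return `echo: ${prompt}`;
  }

  // Same shape as the suggested predictMessages: join the message
  // contents into a prompt, call the model, wrap the reply as a message.
  async predictMessages(messages: Message[]): Promise<Message> {
    const prompt = messages.map((m) => m.content).join("\n");
    const response = await this._call(prompt);
    return { content: response };
  }
}

(async () => {
  const llm = new FakeLLM();
  const reply = await llm.predictMessages([
    { content: "hello" },
    { content: "world" },
  ]);
  console.log(reply.content); // "echo: hello\nworld"
})();
```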
I'm binding a `ChatOpenAI` model with JSON mode as specified in the docs. I'm then using this model in a chain as part of an agent.

After calling the executor, however, I'm seeing this error:

```
TypeError: llm.predictMessages is not a function
```

I found this issue (#3226) and its fix (#3200), which patches a similar error for `llm.callKeys`, but it doesn't look like the `predictMessages` call in the agent class was handled yet.
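The failure mode can be illustrated with a dependency-free analogy: binding a model returns a wrapper object rather than an instance of the original class, so methods defined on the class are no longer present on the bound object. `FakeChatModel` and its `bind` below are hypothetical stand-ins for illustration, not langchainjs's actual implementation:

```typescript
// A chat-model-like class with a predictMessages method, and a bind()
// that returns a plain wrapper exposing only invoke(). Any code that
// expects the wrapper to still have predictMessages hits the same
// "is not a function" TypeError.
class FakeChatModel {
  predictMessages(messages: string[]): string {
    return messages.join("\n");
  }

  bind(_options: Record<string, unknown>) {
    // Returns a binding object, not an instance of FakeChatModel.
    return {
      invoke: (messages: string[]) => this.predictMessages(messages),
    };
  }
}

const model = new FakeChatModel();
const bound = model.bind({ response_format: { type: "json_object" } });

console.log(typeof (bound as any).predictMessages); // "undefined"
console.log(bound.invoke(["hello", "world"]));      // "hello\nworld"
```

Code that only calls `invoke` on the bound object works fine; the agent path fails because it reaches for a method that exists on the class but not on the binding.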