Unable to use headers in chat ollama #6631
Comments
I found some similar issues that might help you:
To set custom headers in ChatOllama, you can pass them through the clientOptions parameter. Here is an example:

```ts
import { ChatOllama } from "@langchain/ollama";

const model = new ChatOllama({
  model: "your-model-name",
  maxTokens: 1024,
  clientOptions: {
    headers: {
      "X-Api-Key": process.env.OLLAMA_API_KEY,
      "Custom-Header": "YourCustomHeaderValue",
    },
  },
});

const res = await model.invoke("Your query here");
console.log(res);
```

In this code snippet, replace "your-model-name" with the model you are using and set the header values as needed.
It appears the Ollama SDK doesn't have support for passing headers through to the requests. I've sent a message to the Ollama team asking about the best way to go about this, and will update you when I hear back.
Thanks @bracesproul! Looking forward to hearing back from you 👍🏼
@bracesproul I have been able to create a PR for ollama-js that adds optional headers: ollama/ollama-js#138
@bracesproul the PR is now merged and version v0.5.9 includes the change.
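For reference, a minimal sketch of how the new option could be used directly with the ollama-js client once v0.5.9 is installed. The exact shape of the option follows ollama/ollama-js#138; treat the `headers` field and the model name below as assumptions, not a confirmed API.

```ts
// Sketch, assuming ollama-js >= 0.5.9 exposes an optional `headers` config
// field (added in ollama/ollama-js#138). Verify against the released typings.
import { Ollama } from "ollama";

const client = new Ollama({
  host: "http://127.0.0.1:11434",
  // Assumed option: headers forwarded on every request made by the client.
  headers: {
    "X-Api-Key": process.env.OLLAMA_API_KEY ?? "",
    "Custom-Header": "YourCustomHeaderValue",
  },
});

const response = await client.chat({
  model: "llama3", // hypothetical model name for illustration
  messages: [{ role: "user", content: "Hello" }],
});
console.log(response.message.content);
```

Once the underlying SDK accepts headers, ChatOllama would presumably be able to forward them via clientOptions, as in the earlier snippet.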
Checked other resources
Example Code
Error Message and Stack Trace (if applicable)
No response
Description
I am trying to set headers in ChatOllama. I see that this had been fixed for the community library, but since ChatOllama was split out into its own package, I no longer see an option to set custom headers.
Relevant links:
System Info