
Unable to use headers in chat ollama #6631

Closed
5 tasks done
ronneldavis opened this issue Aug 26, 2024 · 5 comments · Fixed by #7052
Labels
auto:bug Related to a bug, vulnerability, unexpected error with an existing feature

Comments

@ronneldavis

ronneldavis commented Aug 26, 2024

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain.js documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain.js rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

import { ChatOllama } from "@langchain/ollama";

const model = new ChatOllama({
  baseUrl: "some-url",
  model: "model-name", // placeholder model name
  temperature: 0.1,
  numCtx: 16384,
  keepAlive: "60m",
  headers: {
    // custom headers would go here, but the option no longer exists
  },
});

Error Message and Stack Trace (if applicable)

No response

Description

I am trying to set headers in ChatOllama. I see that this had been fixed in the community library, but since ChatOllama was split out into its own package, I no longer see an option to set custom headers.
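Until the package exposes a headers option, one possible workaround is to wrap the `fetch` implementation so every outgoing request carries the extra headers, and hand that wrapped fetch to the client (ollama-js accepts a custom `fetch` in its config; whether ChatOllama forwards one is an assumption). The sketch below uses a synchronous echo stand-in for `fetch` purely to show the merge logic; `withHeaders` and the header names are illustrative, not part of any API.

```typescript
// A fetch-like signature, simplified (real fetch returns a Promise<Response>).
type FetchLike = (
  input: string,
  init?: { headers?: Record<string, string> }
) => unknown;

// Wrap a fetch implementation so extra headers are merged into every request.
// Extra headers win on conflict because they are spread last.
function withHeaders(
  baseFetch: FetchLike,
  extra: Record<string, string>
): FetchLike {
  return (input, init = {}) =>
    baseFetch(input, {
      ...init,
      headers: { ...(init.headers ?? {}), ...extra },
    });
}

// Stand-in fetch that just echoes back the headers it would have sent.
const echoFetch: FetchLike = (_input, init) => init?.headers;

const wrapped = withHeaders(echoFetch, { "X-Api-Key": "secret-key" });
console.log(
  wrapped("http://localhost:11434/api/chat", {
    headers: { "Content-Type": "application/json" },
  })
);
```

In a real setup the wrapped function would be passed as the client's `fetch` option, so authentication headers reach a proxy in front of the Ollama server without any support in ChatOllama itself.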

Relevant links:

System Info

"langchain": "^0.2.11",
"@langchain/anthropic": "^0.2.13",
"@langchain/core": "0.2.22",
"@langchain/google-genai": "^0.0.25",
"@langchain/google-vertexai": "^0.0.21",
"@langchain/groq": "^0.0.16",
"@langchain/langgraph": "^0.0.33",
"@langchain/ollama": "^0.0.4",
"@langchain/openai": "^0.2.6"
@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label Aug 26, 2024

dosubot bot commented Aug 26, 2024

I found some similar issues that might help you:

  1. Currently, it is not possible to pass headers when creating a RemoteRunnable - This issue is closed, and the solution is now documented here [1].

  2. Custom headers for Ollama / ChatOllama - This issue was closed by a pull request that allows custom headers on Ollama / ChatOllama / OllamaEmbeddings [2].

To set custom headers in ChatOllama using LangChain.js, you can follow the approach shown below. The OllamaCallOptions interface supports a headers property where you can specify your custom headers.

Here is an example:

import { ChatOllama } from "@langchain/ollama";

const model = new ChatOllama({
  model: "your-model-name",
  maxTokens: 1024,
  clientOptions: {
    headers: {
      "X-Api-Key": process.env.OLLAMA_API_KEY,
      "Custom-Header": "YourCustomHeaderValue",
    },
  },
});

const res = await model.invoke("Your query here");

console.log(res);

In this code snippet, replace "your-model-name" with the actual model name you are using and add any custom headers you need in the headers object. This will ensure that your custom headers are included in the requests made by the ChatOllama instance [3][4].


@langchain-ai langchain-ai deleted a comment Aug 26, 2024
@bracesproul
Member

It appears the Ollama SDK doesn't have support for passing headers through to the requests. I've sent a message to the Ollama team asking about the best way to go about this, and will update you when I hear back.

@ronneldavis
Author

Thanks @bracesproul! Looking forward to hearing back from you 👍🏼

@ronneldavis
Author

@bracesproul I have been able to create a PR for ollama-js that adds optional headers: ollama/ollama-js#138

@sdorra

sdorra commented Sep 10, 2024

@bracesproul the PR is now merged, and version v0.5.9 includes the change.
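For reference, the ollama-js change adds an optional `headers` field to the client config, merged on top of the defaults for every request. The real client is `import { Ollama } from "ollama"`; the class below is a mock stand-in with the same config shape, used only to illustrate how the option is meant to behave (the `Authorization` value and host are placeholders).

```typescript
// Mock of the ollama-js v0.5.9 client config shape: host plus optional headers.
interface OllamaConfig {
  host: string;
  headers?: Record<string, string>;
}

class MockOllama {
  constructor(private config: OllamaConfig) {}

  // Returns the headers a request would carry: defaults first,
  // then any user-supplied headers merged on top.
  requestHeaders(): Record<string, string> {
    return {
      "Content-Type": "application/json",
      ...(this.config.headers ?? {}),
    };
  }
}

const client = new MockOllama({
  host: "http://localhost:11434",
  headers: { Authorization: "Bearer some-token" },
});
console.log(client.requestHeaders());
```

With the real client, the same constructor options should let requests pass through an authenticating proxy, which is the use case this issue was opened for.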
