
OpenAI provider not accessible #1082

Closed
haesleinhuepf opened this issue Nov 2, 2024 · 6 comments · Fixed by #1087
Labels
bug Something isn't working

Comments

@haesleinhuepf

Description

I would like to use OpenAI's GPT-4/GPT-4o or Anthropic's Claude models with Jupyter AI and cannot find how to do this.

Reproduce

I just installed Jupyter AI via pip install jupyter-ai

After starting JupyterLab, GPT-4 / Claude is not in the list of models. I also searched the documentation and couldn't find anything.


Expected behavior

It would be great if the pulldown or "%ai list" contained the models which are available to me (I do have the OPENAI_API_KEY and ANTHROPIC_API_KEY environment variables set).
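For reference, a minimal sketch of how the keys might be set on Windows 11 before launching JupyterLab from PowerShell; the key values below are placeholders:

# PowerShell: set the keys in the session that launches JupyterLab (placeholder values)
$Env:OPENAI_API_KEY = "sk-..."
$Env:ANTHROPIC_API_KEY = "sk-ant-..."
jupyter lab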

Context

  • Operating System and version: Windows 11
  • Browser and version:
  • JupyterLab version: Version 4.2.5

Thanks!

@haesleinhuepf added the bug (Something isn't working) label Nov 2, 2024
@srdas
Collaborator

srdas commented Nov 2, 2024

You can install all dependencies using

pip install jupyter-ai[all]

or in addition to your current install also run

pip install langchain-openai
pip install langchain-anthropic

After these installs you should see the OpenAI and Anthropic models in the drop-down list. If you are using Claude models on AWS Bedrock, you must also install

pip install langchain-aws
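As a quick check (a minimal sketch, assuming the jupyter_ai_magics extension is installed alongside jupyter-ai), the providers should then also appear when listing models from a notebook cell:

%load_ext jupyter_ai_magics
%ai list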

@haesleinhuepf
Author

Ok cool, thanks @srdas for the feedback! Feel free to close this issue or consider updating the documentation. Currently neither the installation instructions here nor here explain this clearly. It is not obvious that we need langchain modules. I had openai and anthropic installed, but not langchain-openai and langchain-anthropic... Thanks again!

@mcavdar
Contributor

mcavdar commented Nov 2, 2024

Hi,
actually the latest version of langchain-openai introduces a dependency issue, similar to #1017 (comment).

Installing collected packages: tqdm, regex, jiter, distro, tiktoken, openai, langchain-core, langchain-openai
  Attempting uninstall: langchain-core
    Found existing installation: langchain-core 0.2.43
    Uninstalling langchain-core-0.2.43:
      Successfully uninstalled langchain-core-0.2.43
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
langchain 0.2.16 requires langchain-core<0.3.0,>=0.2.38, but you have langchain-core 0.3.15 which is incompatible.
langchain-community 0.2.17 requires langchain-core<0.3.0,>=0.2.39, but you have langchain-core 0.3.15 which is incompatible.
langchain-text-splitters 0.2.4 requires langchain-core<0.3.0,>=0.2.38, but you have langchain-core 0.3.15 which is incompatible.
Successfully installed distro-1.9.0 jiter-0.7.0 langchain-core-0.3.15 langchain-openai-0.2.5 openai-1.53.0 regex-2024.9.11 tiktoken-0.8.0 tqdm-4.66.6

You can install an older version of it:
pip install langchain-openai==0.1.23
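After pinning, it may be worth verifying that the conflicts are resolved (a minimal sketch, assuming jupyter-ai 2.x still requires langchain-core<0.3):

# pin langchain-openai to the last release built against langchain-core 0.2.x
pip install langchain-openai==0.1.23
# report any remaining broken requirements in the environment
pip check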

@dlqqq
Member

dlqqq commented Nov 2, 2024

@haesleinhuepf

It is not obvious that we need langchain modules.

Thank you for reporting this. Our team has been very busy developing features, building Jupyter AI v3, and fixing bugs, and as a result there are definitely some gaps in our user documentation. I'll see if we can prioritize updating the user docs soon to prevent others from running into the same issue.

@mcavdar

actually the latest version of langchain-openai introduces a dependency issue

This is an unfortunate consequence of how pip resolves dependencies upon installation. We recommend using conda install to avoid breaking your local environment.
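For example, something along these lines (a minimal sketch, assuming the provider packages are published on conda-forge):

# install jupyter-ai together with the provider packages so conda's solver
# picks mutually compatible versions
conda install -c conda-forge jupyter-ai langchain-openai langchain-anthropic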

@mcavdar
Contributor

mcavdar commented Nov 3, 2024

hi @dlqqq
Conda might be useful for the OpenAI case, but it doesn't seem to apply to Ollama. :/
https://anaconda.org/search?q=langchain-openai
https://anaconda.org/search?q=langchain-ollama

Apologies if I'm overlooking anything; I'm not very familiar with Conda.

@krassowski
Member

Just to mention that a more comprehensive solution would be to address:

There have been several documentation updates previously, for example in response to #958, but the docs can only go so far. Otherwise, #1087 is just a déjà vu of #961.
