
Support Pydantic v2 #893

Closed
srhinos opened this issue Jul 27, 2023 · 14 comments · Fixed by #1174

@srhinos

srhinos commented Jul 27, 2023

Pydantic has gone live with their new version

Given that it's production-ready, and as suggested by Pydantic's README:

> Pydantic V2 also ships with the latest version of Pydantic V1 built in so that you can incrementally upgrade your code base and projects: from pydantic import v1 as pydantic_v1.

It should be safe to upgrade, which would allow folks using poetry (like me) to resolve dependencies in a way that unblocks upgrading to Pydantic v2, FastAPI v0.100.0, etc.

@jeffchuber
Contributor

@srhinos thanks! we'd like to do this, but it's not a huge priority for the core team right now. I think it's a pretty heavy lift, but we would welcome a PR if you are up for it.

@srhinos
Author

srhinos commented Jul 27, 2023

Yeah, I can give it a crack and see how it goes. I'd imagine a bunch of downstream dependencies may depend on a hard-pinned pydantic v1, so I'm unsure if I want to quarterback this for another large ML project, but if I have time this weekend I'll dig around and report back on my spike!

Would y'all be open to a PR with a super minor change that unpins pydantic and leans on a try/except import of the new v1 alias in v2 (which is basically pydantic v1.x bundled inside v2)?

This might let me avoid having to do a bunch of dependency upgrading before my proposed PR could merge.
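
For illustration, a minimal sketch of that try/except fallback, assuming the surrounding code only uses the pydantic v1 API (the model below is hypothetical, not from chroma's codebase):

```python
try:
    # Pydantic v2 bundles the legacy v1 API under the pydantic.v1 namespace
    from pydantic.v1 import BaseModel, Field
except ImportError:
    # Pydantic v1 is installed directly, so the top-level imports still work
    from pydantic import BaseModel, Field


class ExampleModel(BaseModel):
    # Behaves the same under either installed major version of pydantic
    name: str = Field(..., min_length=1)
```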

@jeffchuber
Contributor

@srhinos we would be open to that!

@cachho

cachho commented Aug 4, 2023

I support this for compatibility with Supabase-py (I personally use it for logging, feedback, and storing sources for my QnA bot).

@ehartford

this is really blocking because llama-cpp-python uses pydantic 2.0.1 and then chroma won't run in the same environment as llama-cpp-python, it is crippling.

@srhinos
Author

srhinos commented Sep 16, 2023

> this is really blocking because llama-cpp-python uses pydantic 2.0.1 and then chroma won't run in the same environment as llama-cpp-python, it is crippling.

Agreed :/ The PR has been open for a bit, and I'm not knowledgeable enough about how chroma uses FastAPI internally to make the necessary changes to fix the issues I ran into. This will need a maintainer's eyes, either to advise me on how to fix the test errors or to add commits to my open PR.

@HammadB
Collaborator

HammadB commented Sep 21, 2023

Hi folks, thanks for the patience here. I agree this is important to unblock. I had some cycles to spend on this today.

Here are my thoughts.

Pydantic v2 introduced breaking, non-backwards-compatible changes to its API in 2.0. FastAPI adopted these changes in >= 0.100.

Many of our users would like to be able to use fastapi >= 0.100 AND/OR pydantic >= 2.0. We don't really need any of the performance or functionality changes that come with each release, but we'd like to be able to support these users. Projects like llama-cpp-python use pydantic >= 2.0.

Pydantic's way of supporting backwards compatibility is to bundle a "pydantic.v1" namespace which holds the old version. One would expect the naive solution to be to upgrade to pydantic 2, switch our imports to pydantic.v1 (with error fallbacks), and be off to the races. Alas, this is not the case.

FastAPI claimed to be able to support pydantic v1 AND v2 in >= 0.100, but it is not capable of doing so at runtime (fastapi/fastapi#9966). It statically determines whether you are in V1 mode or V2 mode based solely on the package that is installed, so imports from pydantic.v1 will not work out of the box. This leaves a few options, which fundamentally trade complexity on our side for potential inconvenience to various user cohorts.

  1. We make our min fastapi version 0.100.0 and upgrade to pydantic v2; it all works after some migration of our code. User impact: users who want to use fastapi < 0.100.0 will not be able to do so without upgrading, and users who currently use pydantic v1 will have to either migrate their code OR change their imports to pydantic.v1.
  2. We make our min fastapi version 0.100.0, upgrade to pydantic v2, and change our imports to pydantic.v1. Then we patch fastapi to use its pydantic v1 paths. (Unclear if possible.)
  3. We allow fastapi < 0.100.0 OR greater, then upgrade to pydantic v2. We try/catch pydantic.v1 imports and patch fastapi to correctly use pydantic.v1 imports. However, this patching could break users who also use fastapi in their projects in other ways with pydantic v2 imports. (More research is needed.) UPDATE: This won't work as the deps won't resolve.
  4. We restrict fastapi to < 0.100.0 as we do now, but upgrade to pydantic v2 while changing our imports to pydantic.v1. Anyone on old pydantic will have to upgrade their imports to pydantic.v1, and anyone on fastapi >= 0.100.0 will not be able to use chroma.

There are perhaps a couple of other options, but these are the main ones I see. I do not think it is possible to reasonably support the full 2x2 matrix of fastapi below and above 0.100.0 crossed with pydantic below and above v2.
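
For reference, the kind of version detection this all hinges on (a sketch of the general pattern, not FastAPI's or chroma's actual code) boils down to checking the installed package's version string at import time:

```python
import pydantic

# True when the installed package is pydantic 2.x; the pydantic.v1
# namespace only exists in that case.
PYDANTIC_V2 = int(pydantic.VERSION.split(".")[0]) >= 2

if PYDANTIC_V2:
    from pydantic.v1 import BaseModel  # legacy API bundled inside v2
else:
    from pydantic import BaseModel
```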

@HammadB
Collaborator

HammadB commented Sep 21, 2023

@samuelcolvin @srhinos, if you have any thoughts they would be appreciated!

@srhinos
Author

srhinos commented Sep 22, 2023

@HammadB honestly, I think there's a kinda hacky method that'll address this while still allowing v1 and v2 support on both pre- and post-0.100.0 fastapi.

Moving the pydantic models that are causing issues into a separate file, i.e. chromadb/fastapi/main.py:pydantic_model -> chromadb/fastapi/schema.py:pydantic_model_v1 / pydantic_model_v2, would then let us import the correct model as pydantic_model using the same strategy we're already using for importing v1 from pydantic v2 or falling back to actual v1 models.

Maybe this strategy might work? I'm not really sure of the extent of the fastapi bug preventing v1 models in >= 0.100.0, but it'd at least let us maintain backwards compatibility.
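
A rough sketch of what that split could look like (module and model names mirror the hypothetical paths above and are not real files in the repo):

```python
# chromadb/fastapi/schema.py (hypothetical)
try:
    # pydantic 2.x installed: both the v1-compat and native v2 APIs exist
    from pydantic.v1 import BaseModel as V1BaseModel
    from pydantic import BaseModel as V2BaseModel
    _HAS_V2 = True
except ImportError:
    # pydantic 1.x installed: only the v1 flavor is available
    from pydantic import BaseModel as V1BaseModel
    _HAS_V2 = False


class PydanticModelV1(V1BaseModel):
    name: str


if _HAS_V2:
    class PydanticModelV2(V2BaseModel):
        name: str

# main.py would then import whichever flavor matches the installed
# fastapi/pydantic pair, e.g. `from chromadb.fastapi.schema import pydantic_model`
pydantic_model = PydanticModelV2 if _HAS_V2 else PydanticModelV1
```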

@HammadB
Collaborator

HammadB commented Sep 22, 2023

That's worth a shot. I don't love the maintainability but we rarely mutate the API surface so that might be OK. Maximizing the compatibility of chroma is important.

Thanks for the idea. Let me try that.

@HammadB
Collaborator

HammadB commented Sep 22, 2023

OK, I found a much simpler solution that covers 3 of the 4 cases. I'm not super worried about supporting case 1.

  1. Fastapi < 0.100, Pydantic >= 2.0 - Unsupported as the fastapi dependencies will not allow it. They likely should, as pydantic.v1 imports would support this, but this is a downstream issue.
  2. Fastapi >= 0.100, Pydantic >= 2.0 - Supported via normal imports ✅ (Tested with fastapi==0.103.1, pydantic==2.3.0)
  3. Fastapi < 0.100, Pydantic < 2.0 - Supported via normal imports ✅ (Tested with fastapi==0.95.2, pydantic==1.9.2)
  4. Fastapi >= 0.100, Pydantic < 2.0 - Supported via normal imports ✅ (Tested with latest fastapi, pydantic==1.9.2)

The only changes needed were how we import BaseSettings, how the model in Collection.py initializes, and some test fixes.
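
The BaseSettings piece presumably comes down to something like the following conditional import (a hedged sketch, not the exact diff; the settings field is illustrative):

```python
try:
    # pydantic v2: BaseSettings moved out to the separate pydantic-settings package
    from pydantic_settings import BaseSettings
except ImportError:
    # pydantic v1: BaseSettings still ships inside pydantic itself
    from pydantic import BaseSettings


class Settings(BaseSettings):
    # Illustrative field, not chroma's real settings surface
    chroma_server_host: str = "localhost"
```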

@srhinos
Author

srhinos commented Sep 23, 2023

Nice! Wanna add the changes to my open PR and we can get it merged in?

@HammadB
Collaborator

HammadB commented Sep 23, 2023

The changes are in #1174, referenced above. They should get released Monday.

HammadB added a commit that referenced this issue Sep 25, 2023
## Description of changes
Closes #893 

*Summarize the changes made by this PR.*
- Improvements & Bug fixes
  - Adds support for pydantic v2.0 by changing how the Collection model inits; this simple change fixes pydantic v2
  - Fixes the cross-version tests to handle pydantic specifically
  - Conditionally imports pydantic-settings based on what is available; in v2, BaseSettings was moved to a new package
- New functionality
  - N/A

## Test plan
Existing tests were run with the following configs
1. Fastapi < 0.100, Pydantic >= 2.0 - Unsupported as the fastapi dependencies will not allow it. They likely should, as pydantic.v1 imports would support this, but this is a downstream issue.
2. Fastapi >= 0.100, Pydantic >= 2.0 - Supported via normal imports ✅ (Tested with fastapi==0.103.1, pydantic==2.3.0)
3. Fastapi < 0.100, Pydantic < 2.0 - Supported via normal imports ✅ (Tested with fastapi==0.95.2, pydantic==1.9.2)
4. Fastapi >= 0.100, Pydantic < 2.0 - Supported via normal imports ✅ (Tested with latest fastapi, pydantic==1.9.2)

- [x] Tests pass locally with `pytest` for python, `yarn test` for js

## Documentation Changes
None required.
@HammadB
Collaborator

HammadB commented Sep 25, 2023

This should be fixed now in 0.4.13
