Allow using intra-monorepo python dependencies without publishing them to pypi #6455
Comments
After some more tinkering, I think I can extend the above idea into a fully functional solution. Longer write-up coming asap.

@Phlair that sounds very intriguing and definitely could work (I believe we used this for the forked file connector). Do you want to propose a spike/experiment for this in backlog grooming tomorrow?

Yeah, for sure! I'll write up where I'm at with it so far in this doc before grooming.

Closing in favour of https://github.com/airbytehq/airbyte-internal-issues/issues/561
**Tell us about the problem you're trying to solve**
Today it is not easy to use intra-repo python dependencies in published docker images because our build system assumes that the Docker build context of a particular connector is everything under the directory of that connector.
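To make that constraint concrete, here is a minimal sketch using the connector paths from the example below; the check itself is just path arithmetic and does not reflect any actual Airbyte tooling:

```python
from pathlib import Path

# The Docker build context for the connector is its own directory.
context = Path("airbyte-integrations/source-s3").resolve()

# An intra-repo dependency declared as `-e ../source-abstract-blobs`
# resolves to a sibling directory of the build context.
dep = (context / ".." / "source-abstract-blobs").resolve()

# The dependency lies outside the build context, so `docker build`
# cannot see it when it copies files into the image.
print(context in dep.parents)  # False
```

Anything outside the context directory is simply invisible to `COPY` instructions in the Dockerfile, which is the core of the problem described here.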
For example, let's say `airbyte-integrations/source-s3` depends on a helper library `airbyte-integrations/source-abstract-blobs`. We could theoretically go to `source-s3/requirements.txt`, insert a line like `-e ../source-abstract-blobs`, and run `pip install -r requirements.txt` to get that dependency, and all would work well locally.

However, this is more difficult to do when building `source-s3`'s Docker image, because something like `pip install -e` creates a symlink. If you look at the directory structure, the build context of `source-s3/Dockerfile` does not contain the file pointed to by that symlink, i.e. the symlink will point to `airbyte-integrations/source-abstract-blobs`, which is outside the `airbyte-integrations/source-s3`
directory.

**Describe the solution you’d like**
I want to be able to re-use python packages in other python packages inside the monorepo without having to publish them to pypi.
There are a few things we could do to get around this:

1. Make the Docker build context the monorepo root. But then instead of saying `COPY ./file ./file` you have to say `COPY ./airbyte-integrations/source-s3/file ./file`. This is pretty unintuitive and creates a fork between the logic a developer runs to test their connector and the one run in CI.
2. Use `pip install -t`, which would "copy" the package into the venv directory. This could work with some caveats:
   - The developer needs to run `pip install -t` etc. before running any `docker` commands, so the `.venv` directory is populated with all the right packages.
   - We can't exclude the `.venv` directory from the build context. This is workable, as it is the current approach we use for java packages.
   - When `pip install` runs during `docker build`, it will have access to any intra-monorepo dependencies via their presence in the `.venv` directory. However, we need to be super careful that this creates reproducible builds (holding the git commit fixed).
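The second option could be sketched as a small pre-build helper. Everything here — function names, the `.venv` target path, the digest check for the reproducibility caveat — is hypothetical illustration, not Airbyte's actual build tooling:

```python
import hashlib
import subprocess
import sys
from pathlib import Path


def vendor_local_deps(connector_dir, deps):
    """Hypothetical helper: copy intra-monorepo dependencies into the
    connector's .venv directory with `pip install --target`, so the
    Docker build context contains real files rather than symlinks."""
    target = Path(connector_dir) / ".venv"
    target.mkdir(parents=True, exist_ok=True)
    for dep in deps:
        subprocess.run(
            [sys.executable, "-m", "pip", "install",
             "--no-deps", "--target", str(target), dep],
            check=True,
        )
    return target


def tree_digest(root):
    """Deterministic digest over sorted relative paths and file bytes.
    Two vendoring runs at the same git commit should produce the same
    digest if the build is reproducible."""
    h = hashlib.sha256()
    root = Path(root)
    for p in sorted(root.rglob("*")):
        if p.is_file():
            h.update(p.relative_to(root).as_posix().encode())
            h.update(p.read_bytes())
    return h.hexdigest()
```

A wrapper script would call `vendor_local_deps` before `docker build`, and CI could compare `tree_digest` outputs across runs at the same commit to catch non-reproducible vendoring (for example, timestamps or stray `.pyc` files leaking into the vendored tree).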