
rules_python does not respect the namespace_packages= argument of setuptools.setup(). #93

Closed
mishas opened this issue Apr 11, 2018 · 16 comments

mishas commented Apr 11, 2018

(more information here: http://setuptools.readthedocs.io/en/latest/setuptools.html#namespace-packages)

Some packages share a namespace prefix in their names. Trying to use them via rules_python fails with ImportError: No module named '...'.

For example, try installing the package testing.postgresql. It depends on the package testing.common.database. rules_python installs the two separately without taking the namespace_packages=['testing'] argument into account, so import testing.postgresql fails with ImportError: No module named 'testing.postgresql': Python stops searching after the first "testing" directory it finds on the path, which belongs to testing.common.database.

Possible solution: instead of an empty file, write an __init__.py that declares a namespace (with the content __path__ = __import__("pkgutil").extend_path(__path__, __name__)) into the folders that should be namespaces.
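
For illustration, a minimal sketch of the pkgutil mechanism the setuptools docs describe (the directory layout here is hypothetical):

# Two distributions installed into separate directories, both on sys.path:
#
#   /site-a/testing/__init__.py        <- contains the declaration below
#   /site-a/testing/common/database/...
#   /site-b/testing/__init__.py        <- contains the same declaration
#   /site-b/testing/postgresql/...
#
# Contents of each testing/__init__.py:
__path__ = __import__("pkgutil").extend_path(__path__, __name__)

# extend_path() scans every sys.path entry for a "testing" directory and
# appends each one to the package's __path__, so `import testing.postgresql`
# succeeds even though /site-a/testing is found first.

With empty __init__.py files instead, Python treats the first "testing" directory it finds as the whole package and never looks at the second one.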

calder (Contributor) commented Jun 26, 2018

Is anyone working on fixing this? This makes it impossible to use google-auth and protobuf in the same py_binary.

lberki (Collaborator) commented Jul 4, 2018

/cc @brandjon

lberki assigned brandjon and unassigned davidstanke and lberki Jul 4, 2018
daicoden commented Aug 30, 2018

+1, ran into this issue too.

I think the suggested __path__ = __import__("pkgutil").extend_path(__path__, __name__) will work. Here's the workaround I came up with, which might fix your issue @mishas. I can see the proper files on disk, with the __path__ declaration in place.

# defs.bzl somewhere
def _py_namespace_impl(ctx):
    srcs_with_namespace = []
    symlinks = {}
    namespace_directory = ""

    for file in ctx.files.srcs:
        parts = file.path.split(ctx.attr.namespace)
        if len(parts) > 1:
            if namespace_directory == "":
                namespace_directory = ctx.attr.namespace.join(parts[0:-1])

            elif namespace_directory != ctx.attr.namespace.join(parts[0:-1]):
                fail("The srcs packages are too complex, use multiple py_namespace rules to break it up")

            if parts[-1] == "/__init__.py":
                # This is the namespace directory's own __init__.py (after
                # splitting on the namespace, only "/__init__.py" remains).
                # Don't add it to srcs_with_namespace; we insert our own
                # namespace-declaring version below.
                pass
            else:
                srcs_with_namespace += [file]
                # Eh, can't get this to work... symlinks don't get created :(
                # symlinks[file.path] = file

        else:
            # File not related to namespace
            srcs_with_namespace += [file]
            # symlinks[file.path] = file


    if namespace_directory == "":
        fail("Can't continue, namespace %s not found" % ctx.attr.namespace)

    init_file_with_namespace = ctx.actions.declare_file("%s/%s/__init__.py" % (namespace_directory, ctx.attr.namespace))
    ctx.actions.write(init_file_with_namespace, "__path__ = __import__('pkgutil').extend_path(__path__, __name__)\n")
    print('wrote %s' % init_file_with_namespace.path)

    # Hack instead of symlinks
    copied_files = []
    for file in srcs_with_namespace:
        newFile = ctx.actions.declare_file(file.path)
        ctx.actions.run_shell(
            arguments = [file.path, newFile.path],
            inputs = [file],
            outputs = [newFile],
            command = "cp $1 $2",
        )
        copied_files += [newFile]

    return DefaultInfo(
        files = depset(copied_files + [init_file_with_namespace]),
        runfiles = ctx.runfiles(
            files=copied_files + [init_file_with_namespace],
            # root_symlinks=symlinks,
        ),
    )

_py_namespace = rule(
    attrs = {
        "srcs": attr.label_list(providers = ["py"]),
        "namespace": attr.string(),
    },
    implementation = _py_namespace_impl,
)

def py_namespace(name, srcs, namespace, imports = ['.']):
    '''
    Adds `__path__ = __import__('pkgutil').extend_path(__path__, __name__)` to the
    init file in the first directory matching namespace
    '''
    _py_namespace(
        name = name + '_namespaced',
        srcs = srcs,
        namespace = namespace
    )

    native.py_library(
        name = name,
        srcs = [':%s_namespaced' % name],
        imports = imports,
    )

Unfortunately I'm stuck specifically because of protobuf. Using Google protos, i.e. google/api, generates the google namespace, and applying the py_namespace trick there is not as straightforward because I start getting conflicting-file complaints... If someone tries out the workaround, you could put the namespace in a separate BUILD file :(. Alas, I'm out of time to fix it, so I'll be installing protobuf with pip on the system until this issue is addressed. Though rules_protobuf may need a similar change too... 😢.

Update: I was able to verify the workaround for jaraco with cherrypy:

py_namespace(
    name = "namespaced_jaraco_classes",
    namespace = "jaraco",
    srcs = [
        pip_require("jaraco.classes"),
    ],
    imports = ['./external/pypi__jaraco_classes_1_5'],
)

py_namespace(
    name = "namespaced_jaraco_functions",
    namespace = "jaraco",
    srcs = [
        pip_require("jaraco.functools"),
    ],
    imports = ['./external/pypi__jaraco_functools_1_20'],
)


py_library(
    name = "test_lib",
    testonly = True,
    srcs = glob(["test/*/*.py"]),
    imports = ["test"],
    deps = [
        ":lib",
        ":namespaced_jaraco_functions",
        ":namespaced_jaraco_classes",
        "//client_lib/python/cfht_sdk:lib",
        pip_require("pytest"),
        pip_require("flask"),
        pip_require("cherrypy"),
        pip_require("backports.functools-lru-cache"),
        pip_require("tempora"),
        pip_require("more-itertools"),
        pip_require("pytz"),
        pip_require("cheroot"),
        pip_require("portend"),
    ],
)

therc commented Sep 24, 2018

This seems to be a duplicate of, or at least to overlap heavily with, #14 and #55. The legacy_create_init flag with which the latter was closed doesn't really help with the protobuf/google-auth case. You'd assume that using a Google tool with a bunch of Google libraries would just work, but alas...
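
For reference, legacy_create_init is an attribute on py_binary and py_test; a minimal sketch of how it is set (the target is hypothetical):

py_binary(
    name = "app",          # hypothetical target
    srcs = ["app.py"],
    # Tristate attribute: 0 disables the legacy behavior of creating empty
    # __init__.py files throughout the runfiles tree.
    legacy_create_init = 0,
)

Suppressing the empty __init__.py files only helps if every distribution ships its own pkgutil-style namespace declaration; the flag does not add one, which is why it doesn't rescue the protobuf/google-auth case.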

raliste commented Mar 6, 2019

@therc To use the protobuf/Google APIs, I created a fork of googleapiclient that simply removes google-auth as a strict dependency in setup.py. It does work, at the expense of using googleapiclient instead of the google-cloud packages. The only real benefit of the latter, at the moment, is gRPC and better idioms, as far as I can see. I think this is the best approach for using the Python Google APIs with Bazel at this point in time.

ali5h commented Sep 21, 2019

I fixed this issue in https://github.com/ali5h/rules_pip

skyl commented Oct 7, 2019

This issue is really blocking our adoption of Bazel for Python...

skyl commented Oct 8, 2019

@ali5h I tried your rules and I can't get them to work with a -e git requirement. For example:

TypeError: Expected a pinned InstallRequirement, got django-mass-edit from git+https://github.com/frnhr/django-mass-edit.git@c424abb7ce43dcb54e822862651821720d52f50b#egg=django-mass-edit (from -r /private/var/tmp/_bazel_i871067/582c43307c634a2c875ee31a5da68b63/external/py/requirements.txt (line 7))
ERROR: no such package '@py//': pip_import failed:  (Traceback (most recent call last):
  File "/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/private/var/tmp/_bazel_i871067/582c43307c634a2c875ee31a5da68b63/external/com_github_alish_rules_pip_lock/tools/piptool.par/__main__.py", line 184, in <module>
  File "/private/var/tmp/_bazel_i871067/582c43307c634a2c875ee31a5da68b63/external/com_github_alish_rules_pip_lock/tools/piptool.par/__main__.py", line 141, in main
  File "/private/var/tmp/_bazel_i871067/582c43307c634a2c875ee31a5da68b63/external/com_github_alish_rules_pip_lock/tools/piptool.par/__main__.py", line 48, in as_tuple

I wonder if you have any ideas about this. Thanks for the contribution though!

ali5h commented Oct 9, 2019

@skyl My implementation requires that you pin the versions in requirements.txt.

ali5h commented Oct 9, 2019

Generate your requirements.txt file using pip-compile.
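
For example (versions shown are illustrative), pip-compile expands an unpinned requirements.in into a fully pinned requirements.txt, transitive dependencies included:

# requirements.in (hand-written, unpinned)
testing.postgresql

# requirements.txt (output of `pip-compile requirements.in`)
testing.common.database==2.0.3    # via testing.postgresql
testing.postgresql==1.3.0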

thundergolfer commented Dec 13, 2019

@ali5h

I'm trying to use your rules_pip to fix this kind of issue, which I'm experiencing with azure-common and azure-storage-blob (both use the azure namespace), but it doesn't seem to work. I'm looking at the PYTHONPATH and the runfiles of my py_binary, and it's running into the same namespace clash as described by OP.

If you've fixed this problem, can you explain how, so that I can poke through your code and rip out a solution for myself 🙂

ali5h commented Dec 13, 2019

I just tested it and added them to the tests; it works. Did you compile the reqs with pip-compile?

thundergolfer commented Dec 13, 2019

Yes I did, but I may have done something else wrong.

Do you mind pointing me at the relevant source code in your project to help me learn about a solution?


Edit: I've got a branch here, https://github.com/thundergolfer/the-one-true-bazel-monorepo/pull/12/files, where I'm trying to get this working.

I have to hack an internal BUILD file to get past Could not find pip-provided dependency: 'enum34', but then get ModuleNotFoundError: No module named 'azure.storage'.

ali5h commented Dec 13, 2019

Fixed the issues; please use the latest release. Thanks for the feedback!

jasonh-logic2020 commented

I am using the latest git commit of rules_python as described in #273, but get the same result: no error from Bazel, but no code loaded from the azure package.

All other libraries load as expected, all versions pinned in requirements.txt.

alexeagle (Collaborator) commented

rules_python 0.1.0 has been released, which upstreams the rules_python_external repo. Please switch from pip_import to pip_install, which doesn't have this issue.
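
For anyone migrating, a minimal WORKSPACE sketch against rules_python 0.1.0 (the repository name pip_deps is arbitrary):

load("@rules_python//python:pip.bzl", "pip_install")

pip_install(
    name = "pip_deps",                       # arbitrary external repo name
    requirements = "//:requirements.txt",    # your pinned requirements file
)

BUILD files can then depend on individual packages via the requirement() helper loaded from @pip_deps//:requirements.bzl.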
