[BUG] Move MMLSPARK info from tools to pip installable package #1439
Comments
It would be nice to have the mmlspark version set in `setup.py` with the rest of the dependencies, but I think that would mean `setup.py` would have to modify a separate file at installation (`reco_utils/common/spark_utils.py`?) to inject the location of the Spark package and the Maven repo. Alternatively, we could put that information directly into `spark_utils.py` and add a note in `setup.py` saying that if you want to change the default version of mmlspark, you should modify `spark_utils.py`. Were you thinking that there would be an additional flag in `start_or_get_spark()` that would add mmlspark, or just store the info and leave it up to the user to retrieve it and pass it to `start_or_get_spark()` via the `packages` and `repositories` inputs?
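For the second option, the definition in `spark_utils.py` could look something like this. This is a minimal sketch: the dict layout and version coordinates are illustrative assumptions, not the actual values kept under `tools/`:

```python
# Sketch only: a module-level constant in reco_utils/common/spark_utils.py.
# The dict layout and coordinates below are illustrative assumptions, not
# the actual values currently stored under tools/.
MMLSPARK_INFO = {
    "maven": {
        "coordinates": "com.microsoft.ml.spark:mmlspark_2.11:1.0.0-rc3",
        "repo": "https://mmlspark.azureedge.net/maven",
    }
}
```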
I am open to suggestions. I was thinking of defining MMLSPARK_INFO somewhere in […]. I don't know about a flag for mmlspark, but it may be a nice idea. Wouldn't it conflict with other packages the user may want to install? @yueguoguo @miguelgfierro @loomlike any comments?
Makes sense. For the flag, I was thinking we could do something like this at the beginning of `start_or_get_spark` (see the sketch below).
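A minimal sketch of what such a flag could look like, assuming a hypothetical `use_mmlspark` argument and the illustrative MMLSPARK_INFO constant from the sketch above; the actual `start_or_get_spark` signature in the repo may differ:

```python
from pyspark.sql import SparkSession

# Assumes the illustrative MMLSPARK_INFO constant sketched earlier in this thread.


def start_or_get_spark(app_name="Sample", url="local[*]", packages=None,
                       repositories=None, use_mmlspark=False):
    """Start (or get) a Spark session, optionally adding the MMLSpark package.

    use_mmlspark is the hypothetical flag discussed in this thread: when set,
    the MMLSpark Maven coordinates and repo are merged with whatever the
    caller passed via packages/repositories.
    """
    if use_mmlspark:
        packages = (packages or []) + [MMLSPARK_INFO["maven"]["coordinates"]]
        repositories = (repositories or []) + [MMLSPARK_INFO["maven"]["repo"]]

    builder = SparkSession.builder.appName(app_name).master(url)
    if packages:
        builder = builder.config("spark.jars.packages", ",".join(packages))
    if repositories:
        builder = builder.config("spark.jars.repositories", ",".join(repositories))
    return builder.getOrCreate()
```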
Vote for adding mmlspark in the Spark creation step. It makes less sense to put the very Spark-specific info into […].
I will wrap any calls in the notebooks inside a […].
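The wrapper itself is elided in the comment above; one plausible reading is an import guard, sketched here with an assumed mmlspark import path:

```python
# Hypothetical guard for notebook cells that depend on MMLSpark; the import
# path below is an assumption, not taken from the actual notebooks.
try:
    from mmlspark.train import ComputeModelStatistics
    mmlspark_available = True
except ImportError:
    mmlspark_available = False
    print("MMLSpark is not installed; skipping the cells that require it.")
```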
Addressed in #1531
Description
`pytest tests/unit/examples/test_notebooks_pyspark.py` requires info from the `tools/` directory. We could move the required MMLSPARK_INFO from `tools/` to a definition built inside the pip package.

In which platform does it happen?
Any platform
How do we replicate the issue?
1. `pip install ms-recommenders[examples,spark]`
2. Run `tests/unit/examples/test_notebooks_pyspark.py` with `pytest -k criteo`
Expected behavior (i.e. solution)
The tests should pass successfully even if the `tools/` directory is not present; for instance, the tests could pull the info from the installed package, as sketched below.
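A sketch of what the fix could look like from the test's point of view, assuming the illustrative MMLSPARK_INFO constant and an in-package import path (both are assumptions, not necessarily the actual change made in #1531):

```python
# Hypothetical: read the MMLSpark coordinates from the installed package
# instead of the tools/ directory. The import path is an assumption.
from reco_utils.common.spark_utils import MMLSPARK_INFO, start_or_get_spark

spark = start_or_get_spark(
    packages=[MMLSPARK_INFO["maven"]["coordinates"]],
    repositories=[MMLSPARK_INFO["maven"]["repo"]],
)
```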
Other Comments