Loading geni.core creates a default spark session #332
Comments
I have the impression as well that even requiring the various geni namespaces creates a session at some point.
So basically I found no way to get an uberjar with geni executing on Databricks.
I finally patched it to use an empty map for the session config. This (I believe) still creates a session during require (which in my opinion is wrong), but at least it lets the code continue, so I can set up my own session.
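For reference, a rough sketch of the "set up my own session" step under that patch; this is not geni's actual internals, and the namespace name and option keys passed to `g/create-spark-session` are assumptions for illustration:

```clojure
(ns my.uberjar.core
  (:gen-class)
  (:require [zero-one.geni.core :as g]))

(defn -main [& _args]
  ;; With the default session config patched to an empty map, the require above
  ;; no longer breaks the uberjar, and the job can build its own session here.
  (let [spark (g/create-spark-session {:app-name "my-databricks-job"})]
    (println "session ready:" spark)))
```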
Info
Problem / Steps to reproduce
If `zero-one.geni.core` has been required, it creates a default spark session, which impacts the behavior of calling `g/create-spark-session`.
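A minimal repro sketch, assuming a REPL with geni and Spark on the classpath; the option keys are illustrative, not a statement of geni's exact API, and the reuse behaviour assumes the session is built via `SparkSession.builder`/`getOrCreate`:

```clojure
;; Requiring core alone is enough to build the default SparkSession.
(require '[zero-one.geni.core :as g])

;; If the session is built with getOrCreate (an assumption here), the existing
;; default session is reused, so the settings passed below may not take effect.
(def spark
  (g/create-spark-session
    {:app-name "repro"
     :configs  {:spark.sql.shuffle.partitions "4"}}))
```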
Specifically, `geni.core` loads `geni.spark-context`, which loads `geni.defaults`, which creates a spark session in an `atom` which should probably be a `delay`.
If requiring `zero-one.geni.spark` directly instead (as `g`), the spark session is correctly configured. The incorrect behaviour takes effect if core is required at any point before creating the session, so it is a bit problematic. As above, maybe replacing the default with a `delay` will be sufficient to avoid this.
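A minimal sketch of the atom-vs-delay distinction, using hypothetical names rather than geni's actual internals:

```clojure
(ns example.defaults-sketch)

;; Stand-in for whatever actually builds the SparkSession; the point is only
;; *when* it runs, not what it builds.
(defn- build-default-session []
  (println "building default session...")
  ::session)

;; An atom initialised at load time runs the builder as soon as the namespace
;; is required:
(defonce eager-default (atom (build-default-session)))

;; A delay defers the builder until the first deref, so merely requiring the
;; namespace stays free of Spark side effects:
(defonce lazy-default (delay (build-default-session)))

(comment
  @eager-default   ;; session was already built during load
  @lazy-default)   ;; built here, on first deref
```

Call sites that currently deref the atom would keep working unchanged with the delay, since both are dereferenced with `@`, but users get a chance to configure their own session before the default is ever forced.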
Thanks for your work on this library!