This repository has been archived by the owner on Sep 18, 2023. It is now read-only.

gazelle failed to run with spark local #725

Closed
jackylee-ch opened this issue Jan 28, 2022 · 2 comments
Labels
bug Something isn't working

Comments

@jackylee-ch
Contributor

If we run Spark in local mode, ExecutorManager.scala#L46 throws an index-out-of-range exception.

This happens because there is no YarnCoarseGrainedExecutorBackend in a local-mode environment, so the code falls back to Random.nextInt to pick a random index.
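The failure mode can be sketched as follows. This is a hypothetical illustration, not the actual ExecutorManager code: `pickNumaNode` and its `candidates` parameter are invented names, standing in for the list derived from the executor backend, which is empty in local mode.

```scala
import scala.util.Random

object NumaBindSketch {
  // Sketch: when no YarnCoarseGrainedExecutorBackend is found (e.g. in
  // spark local mode), the candidate list is empty and indexing it with
  // a random value throws; guarding with Option avoids the crash.
  def pickNumaNode(candidates: Seq[Int]): Option[Int] =
    if (candidates.isEmpty) None // local mode: nothing to bind to
    else Some(candidates(Random.nextInt(candidates.length)))

  def main(args: Array[String]): Unit = {
    println(pickNumaNode(Seq(0, 1)))  // Some(0) or Some(1)
    println(pickNumaNode(Seq.empty))  // None, instead of an exception
  }
}
```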

@zhouyuan
Collaborator

@stczwd Thanks for reporting! The NUMA binding feature is not very robust; will look into this.

@jackylee-ch
Contributor Author

> @stczwd Thanks for reporting! The NUMA binding feature is not very robust; will look into this.

Thanks for your attention. spark.oap.sql.columnar.numaBinding=false worked for me. I have changed the code with some style changes, please check it again, thanks.
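For reference, the workaround mentioned above can be applied through Spark's standard configuration mechanism, e.g. in spark-defaults.conf (a sketch; the property name is taken verbatim from this thread):

```
# spark-defaults.conf — disable gazelle NUMA binding as a workaround
spark.oap.sql.columnar.numaBinding  false
```

The same setting can be passed per job with `--conf spark.oap.sql.columnar.numaBinding=false` on spark-submit.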
