
Reorganize Spark version specific code structure #76

Closed
jerrychenhf opened this issue Jun 10, 2021 · 0 comments · Fixed by #77
jerrychenhf commented Jun 10, 2021

To serve as a template for profile-based, compile-time source code organization, we need a more standard way to lay out version-specific source code. Here is the proposal:

```
project-root
├── src
│   ├── main         -> common code entry
│   └── test         -> common test code entry
├── spark-3.0.1      -> version-specific entry
│   ├── main         -> version-specific code entry
│   └── test         -> version-specific test code entry
├── spark-3.1.1
│   ├── main
│   └── test
...
```

We will use build-helper-maven-plugin to add the version-specific source directories to the build.
Additionally, as part of this refactor, switch from maven-scala-plugin (old) to scala-maven-plugin (new).
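A profile-based setup in `pom.xml` might look roughly like the sketch below, using the plugin's `add-source` and `add-test-source` goals. This is illustrative only: the profile id, paths, and plugin version are assumptions, not taken from the actual change in #77.

```xml
<!-- Sketch: one profile per Spark version; profile id, paths, and
     plugin version are assumed for illustration, not from PR #77. -->
<profiles>
  <profile>
    <id>spark-3.0.1</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.codehaus.mojo</groupId>
          <artifactId>build-helper-maven-plugin</artifactId>
          <version>3.2.0</version>
          <executions>
            <!-- Add version-specific main sources to the compile source roots -->
            <execution>
              <id>add-version-sources</id>
              <phase>generate-sources</phase>
              <goals><goal>add-source</goal></goals>
              <configuration>
                <sources>
                  <source>${project.basedir}/spark-3.0.1/main</source>
                </sources>
              </configuration>
            </execution>
            <!-- Add version-specific test sources to the test source roots -->
            <execution>
              <id>add-version-test-sources</id>
              <phase>generate-test-sources</phase>
              <goals><goal>add-test-source</goal></goals>
              <configuration>
                <sources>
                  <source>${project.basedir}/spark-3.0.1/test</source>
                </sources>
              </configuration>
            </execution>
          </executions>
        </plugin>
      </executions>
    </build>
  </profile>
  <!-- A spark-3.1.1 profile would follow the same pattern -->
</profiles>
```

Selecting a version at build time would then be, e.g., `mvn -Pspark-3.0.1 package`, with the common code under `src/` always compiled and the profile contributing only its version-specific directories.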
