Adjust parallelism in spark-tests script to reduce memory footprint [skip ci] #8871
Mitigates #8729.
We haven't figured out why the JDK11 test run consumes more memory than runs on other JDK versions and then gets OOM-killed by the system. (This could be related to different default GC strategies across JDK versions, but the issue persisted when we tried a non-default GC on JDK11.)
This change limits the parallelism of the integration tests in nightly CI.
We have confirmed that setting the parallelism to 5 does not increase the test run duration (verified with multiple GPU types),
and it significantly reduces the peak host memory footprint in recent CI runs (from over 50 GiB to just over 40 GiB).
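As a rough illustration of the idea (not the actual spark-tests script, whose contents are not shown here), capping parallelism in a shell-driven test loop can be done by feeding the work items through `xargs -P`. The suite names and runner command below are hypothetical; only the parallelism value of 5 comes from this PR.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Cap concurrent test processes; 5 is the value validated in this PR.
PARALLELISM=5

# Hypothetical list of test suites; the real script's inputs differ.
printf 'suite-%d\n' 1 2 3 4 5 6 7 8 |
  xargs -P "$PARALLELISM" -n 1 -I {} sh -c 'echo "running {}"'
```

Each extra worker holds its own JVM/test state in host memory, so lowering `-P` trades a little wall-clock headroom for a lower peak footprint; here the measurement showed no duration increase at 5 workers.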