[SPARK-5751] [SQL] Sets SPARK_HOME as SPARK_PID_DIR when running Thrift server test suites #4758
Conversation
Test build #27935 has started for PR 4758 at commit
"SPARK_TESTING" -> "0", | ||
// Points SPARK_PID_DIR to SPARK_HOME, otherwise only 1 Thrift server instance can be started | ||
// at a time, which is not Jenkins friendly. | ||
"SPARK_PID_DIR" -> "../../") |
I am wondering whether a better path would be `Utils.createTempDir(..)`, just in case we want to run the test suite simultaneously on the local machine as well.
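For reference, a minimal sketch of what that suggestion could look like in the suite's environment map (the map literal mirrors the snippet quoted above; `Utils.createTempDir` is Spark's existing test utility, but the exact wiring here is an assumption, not the suite's actual code):

```scala
import org.apache.spark.util.Utils

// Sketch only: swap the relative "../../" path for a fresh temporary directory,
// so each run of the suite gets its own PID directory.
val serverEnv = Map(
  "SPARK_TESTING" -> "0",
  // A unique directory per run keeps concurrent runs (Jenkins or local) from
  // sharing the same PID file.
  "SPARK_PID_DIR" -> Utils.createTempDir().getCanonicalPath)
```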
Good point.
Actually, can we just use `sys.props("java.io.tmpdir")` instead of creating a new directory?
We need a different directory for each run, otherwise the PID file will be overwritten when multiple Jenkins unit tests run in parallel.
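To make the concern concrete, a hypothetical illustration (the PID file name below is made up for the example; the real name is chosen by `spark-daemon.sh`):

```scala
import org.apache.spark.util.Utils

// With a shared directory such as java.io.tmpdir, every run resolves to the same
// path, so parallel runs overwrite each other's PID file:
val shared = s"${sys.props("java.io.tmpdir")}/spark-jenkins-thriftserver.pid"

// With a per-run temporary directory, each run gets a distinct path:
val perRun = s"${Utils.createTempDir().getCanonicalPath}/spark-jenkins-thriftserver.pid"
```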
Oh, you're right. Looks like we currently set it to `/tmp` already, which is probably what `java.io.tmpdir` is anyway in most cases.
Test build #27935 has finished for PR 4758 at commit
Test PASSed.
I said I was going to merge this, but instead I left one more comment inline.
Updated, you can merge it after Jenkins nods :)
Test build #28077 has started for PR 4758 at commit
Test build #28077 has finished for PR 4758 at commit
Test PASSed.
…ft server test suites

This is a follow-up of #4720. By default, `spark-daemon.sh` writes PID files under `/tmp`, which makes it impossible to start multiple server instances simultaneously. This PR sets `SPARK_PID_DIR` to the Spark home directory to work around this problem. Many thanks to chenghao-intel for pointing out this issue!

Author: Cheng Lian <[email protected]>

Closes #4758 from liancheng/thriftserver-pid-dir and squashes the following commits:

252fa0f [Cheng Lian] Uses temporary directory as Thrift server PID directory
1b3d1e3 [Cheng Lian] Sets SPARK_HOME as SPARK_PID_DIR when running Thrift server test suites

(cherry picked from commit 8c468a6)
Signed-off-by: Cheng Lian <[email protected]>
@andrewor14 I just merged it to master and branch-1.3.
Cool, thanks.
This is a follow-up of #4720. By default, `spark-daemon.sh` writes PID files under `/tmp`, which makes it impossible to start multiple server instances simultaneously. This PR sets `SPARK_PID_DIR` to the Spark home directory to work around this problem. Many thanks to @chenghao-intel for pointing out this issue!
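As a rough usage sketch of where the squashed commits ended up (a temporary directory as the PID directory); the launch command, arguments, and scaffolding here are assumptions, not the suite's actual code:

```scala
import java.io.File
import scala.sys.process._
import org.apache.spark.util.Utils

// Launch the Thrift server through the daemon script with SPARK_PID_DIR pointing at a
// per-run directory, so its PID file never lands in the shared /tmp default.
val sparkHome = sys.env("SPARK_HOME")
val pidDir = Utils.createTempDir().getCanonicalPath
val exitCode = Process(
  Seq(s"$sparkHome/sbin/start-thriftserver.sh", "--master", "local"),
  new File(sparkHome),
  "SPARK_PID_DIR" -> pidDir).!
```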