[Task]: Run some BigQuery integration tests only on one Dataflow Python3.X PostCommit #23573
Comments
.take-issue
.add-labels python, io, gcp
Considered a subtask of #20930.
.add-labels python, io, gcp
Label cannot be managed because it does not exist in the repo. Please check your spelling.
Currently on hold, as the GitHub migration involves splitting the PostCommit tests. Downgrade to P2 for tracking if the flake still occurs once the GitHub migration is completed.
.remove-labels P1 .add-labels P2
Let me close this. Superseded by #25970, which will reduce the run time of the Python PostCommit by an hour.
What needs to happen?
There is a Dataflow Python PostCommit flake caused by too many BigQuery tests running at the same time.
Encountered in a recent release process (#23200 (comment)) and also observed in #23014 (comment).
The Python 3.x PostCommit has been timing out (4-hour limit) with high probability. The most time-consuming test module is BigQuery, which takes about 2 hours of run time. As we are going to support Python 3.10, the quota issue will only get worse.
We should audit the current BigQuery integration tests on Dataflow to see whether running them on a single Python version suffices (as we did for spannerIOIT), and redistribute these tests accordingly. A rough sketch of one possible gating approach is shown below.
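As an illustration only, here is a minimal sketch of how an individual BigQuery integration test could be gated so it runs on just one designated interpreter version. The chosen version (3.8), the helper `runs_bigquery_its`, and the test class name are all hypothetical; the actual Beam suites are selected through the Gradle PostCommit task definitions, not a check like this.

```python
# Hypothetical sketch: run heavy BigQuery ITs only on one "canonical"
# Python version to cut PostCommit run time. The version below is an
# arbitrary assumption, not an actual Beam decision.
import sys
import unittest

_BIGQUERY_IT_PY_VERSION = (3, 8)  # assumed canonical version


def runs_bigquery_its():
  """Return True if this interpreter is the one chosen to run BigQuery ITs."""
  return sys.version_info[:2] == _BIGQUERY_IT_PY_VERSION


class BigQueryWriteIntegrationTest(unittest.TestCase):

  @unittest.skipUnless(
      runs_bigquery_its(),
      'BigQuery ITs run only on Python %d.%d to limit PostCommit time' %
      _BIGQUERY_IT_PY_VERSION)
  def test_write_to_bigquery(self):
    # Placeholder for the actual pipeline-under-test.
    self.assertTrue(runs_bigquery_its())


if __name__ == '__main__':
  unittest.main()
```

In practice the same effect could be achieved by moving the BigQuery module into the PostCommit suite of a single Python version in the Gradle/CI configuration, which keeps the test code itself unchanged.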
Issue Priority
Priority: 1
Issue Component
Component: test-failures