
fix setting project_id for gs to bq and bq to gs #30053

Merged: 8 commits merged into apache:main on Mar 20, 2023

Conversation

@Yaro1 (Contributor) commented Mar 12, 2023

closes: #29958

@boring-cyborg boring-cyborg bot added area:providers provider:google Google (including GCP) related issues labels Mar 12, 2023
boring-cyborg bot commented Mar 12, 2023

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst)
Here are some useful points:

  • Pay attention to the quality of your code (ruff, mypy and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide, and consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally; it's a heavy Docker setup, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
  • Please follow ASF Code of Conduct for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
  • Be sure to read the Airflow Coding style.
    Apache Airflow is a community-driven project and together we are making it better 🚀.
    In case of doubts contact the developers at:
    Mailing List: [email protected]
    Slack: https://s.apache.org/airflow-slack

@potiuk (Member) commented Mar 12, 2023

cc: @lwyszomi and team, could you please take a look and see if this is sound? (#29958 contains a nice explanation.)

@Yaro1 (Contributor, Author) commented Mar 12, 2023

Can't understand the error in the static checks:

An unexpected error has occurred: CalledProcessError: command: ('/opt/pipx/venvs/apache-airflow-breeze/bin/python', '-mnodeenv', '--prebuilt', '--clean-src', '/home/runner/.cache/pre-commit/repo7ocuinbu/node_env-18.6.0', '-n', '18.6.0')
return code: 1
stdout: (none)

@potiuk (Member) commented Mar 12, 2023

Likely an intermittent problem (node installation failed due to connectivity?).

@Yaro1 (Contributor, Author) commented Mar 12, 2023

oh, got it, thanks

@Yaro1 (Contributor, Author) commented Mar 19, 2023

:(

@potiuk potiuk merged commit af4627f into apache:main Mar 20, 2023
boring-cyborg bot commented Mar 20, 2023

Awesome work, congrats on your first merged pull request!

@@ -193,7 +193,7 @@ def _submit_job(

         return hook.insert_job(
             configuration=configuration,
-            project_id=hook.project_id,
+            project_id=configuration["extract"]["sourceTable"]["projectId"],
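
The effect of the changed line can be illustrated with a minimal sketch (this is not the actual Airflow source; the job configuration below is hypothetical, but its keys mirror the BigQuery extract-job configuration structure):

```python
# Minimal sketch: after the change, the project used to submit the
# extract job is read from the job configuration's sourceTable instead
# of the hook's default project.

def resolve_submit_project(configuration: dict) -> str:
    # Mirrors the changed line:
    # configuration["extract"]["sourceTable"]["projectId"]
    return configuration["extract"]["sourceTable"]["projectId"]

# Hypothetical job configuration for a BigQuery-to-GCS extract.
config = {
    "extract": {
        "sourceTable": {
            "projectId": "project-a",  # project where the table lives
            "datasetId": "my_dataset",
            "tableId": "my_table",
        },
        "destinationUris": ["gs://my-bucket/out-*.csv"],
    }
}

# Even if the connection's default project is "project-b", the job is
# now submitted under "project-a".
print(resolve_submit_project(config))  # -> project-a
```

This is exactly the behavior questioned in the review comments below: the submit project is tied to where the source table lives.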
sleepy-tiger (Contributor) commented on this line:
@Yaro1 May I know the reason why we hard code the project id from sourceTable? We got the issue when we try to extract data from Project A but we want to submit job by using our own Project B. This line does not allow us to use our default project id.

A Contributor replied:

@sleepy-tiger I agree, we also have this issue. I think the original bug report was based on a misunderstanding of the error, and while this fix does incidentally support the reporter's gcp configuration, I think that is mostly an accident, and it also breaks many other use cases.

see also https://github.com/apache/airflow/pull/30053/files#diff-875bf3d1bfbba7067dc754732c0e416b8ebe7a5b722bc9ac428b98934f04a16fR512 and https://github.com/apache/airflow/pull/30053/files#diff-875bf3d1bfbba7067dc754732c0e416b8ebe7a5b722bc9ac428b98934f04a16fR587, which override the project_id that the user passes in, making it impossible to use a project_id other than what is specified in the source or destination tables. In general, more clarity is needed in distinguishing between which projects are being used for storage, and which are being used for compute.

I plan on filing an issue about this later today if one doesn't already exist, and I'll update here
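
The precedence this commenter argues for could be sketched like so (illustrative only; the function and parameter names are made up, and this is not the fix that was eventually adopted in #32106):

```python
# Hypothetical sketch of the argued precedence: an explicitly passed
# project_id (the "compute"/billing project) should win over the project
# embedded in the table spec (the "storage" project), with the
# connection's default project as the last resort.

def pick_job_project(explicit_project_id, configuration, hook_default_project):
    if explicit_project_id:
        return explicit_project_id
    table_project = (
        configuration.get("extract", {})
        .get("sourceTable", {})
        .get("projectId")
    )
    return table_project or hook_default_project

config = {"extract": {"sourceTable": {"projectId": "project-a"}}}

# User explicitly asks to bill "project-b": that wins.
assert pick_job_project("project-b", config, "project-c") == "project-b"
# No explicit project: fall back to the table's project.
assert pick_job_project(None, config, "project-c") == "project-a"
# Neither given: use the connection's default.
assert pick_job_project(None, {}, "project-c") == "project-c"
```

The key design point is that storage location and billing/compute project are independent choices, so neither should silently override the other.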

The same Contributor followed up:

I was closing old tabs and realized I never updated here -- the issue I filed is here: #32106, and it has been resolved to my satisfaction. You can find links to the relevant conversations from that issue; it got kind of complicated, with multiple issues filed and such.

Labels: area:providers, provider:google (Google, including GCP, related issues)

Development
Successfully merging this pull request may close these issues.

GCSToBigQueryOperator does not respect the destination project ID
5 participants