
Modify transfer operators to handle more data #22495

Merged · 3 commits · Apr 4, 2022
Conversation

mwallace582 (Contributor)

This addresses an issue where large data imports could fill all available disk space and cause the task to fail.

Previously, all data was written out to disk before any of it was uploaded to GCS. Now each data chunk is written to GCS and immediately freed.
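
For illustration, a minimal sketch of the chunk-at-a-time pattern described above. This is not the operator code from this PR; `fetch_chunks`, the bucket/object names, and the helper function are hypothetical, and only the official `google-cloud-storage` client calls are real:

```python
# Hypothetical sketch: keep disk usage bounded by one chunk instead of the
# whole dataset by writing, uploading, and deleting each chunk in turn.
import os
import tempfile

from google.cloud import storage


def stream_chunks_to_gcs(fetch_chunks, bucket_name, object_prefix):
    """Upload each chunk to GCS and free its disk space immediately.

    ``fetch_chunks`` is an assumed callable yielding ``bytes`` chunks.
    """
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    for chunk_no, chunk_bytes in enumerate(fetch_chunks()):
        # Write only the current chunk to a temporary file ...
        with tempfile.NamedTemporaryFile(delete=False) as tmp:
            tmp.write(chunk_bytes)
            tmp_path = tmp.name
        try:
            # ... upload it to GCS ...
            blob = bucket.blob(f"{object_prefix}_{chunk_no}")
            blob.upload_from_filename(tmp_path)
        finally:
            # ... and delete it right away, so peak disk usage is one chunk
            # rather than the full export.
            os.unlink(tmp_path)
```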



mwallace582 requested a review from turbaszek as a code owner on March 23, 2022 at 22:29
boring-cyborg bot added the area:providers and provider:google labels on Mar 23, 2022
boring-cyborg bot commented Mar 23, 2022

Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contribution Guide (https://github.com/apache/airflow/blob/main/CONTRIBUTING.rst).
Here are some useful points:

  • Pay attention to the quality of your code (flake8, mypy and type annotations). Our pre-commits will help you with that.
  • In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide and consider adding an example DAG that shows how users should use it.
  • Consider using the Breeze environment for testing locally; it's a heavy Docker image, but it ships with a working Airflow and a lot of integrations.
  • Be patient and persistent. It might take some time to get a review or the final approval from Committers.
  • Please follow the ASF Code of Conduct for all communication, including (but not limited to) comments on Pull Requests, the mailing list, and Slack.
  • Be sure to read the Airflow Coding style.

Apache Airflow is a community-driven project and together we are making it better 🚀.
In case of doubts, contact the developers at:
Mailing List: [email protected]
Slack: https://s.apache.org/airflow-slack

potiuk (Member) left a comment


It looks nice, but I guess some tests will need to be updated too (let's see)

mwallace582 (Contributor, Author)

Thanks for taking a look and initiating the tests! I thought I had run through all the tests locally, but evidently not, as the failures appear to be related to my work. I'll get those tests fixed shortly.

potiuk merged commit 99b0211 into apache:main on Apr 4, 2022
boring-cyborg bot commented Apr 4, 2022

Awesome work, congrats on your first merged pull request!

potiuk (Member) commented Apr 4, 2022

Cool!
