
Various iNaturalist updates #3846

Merged 5 commits into main from fix/inaturalist-col-data on Mar 8, 2024

Conversation

@AetherUnbound (Collaborator) commented Feb 29, 2024

Fixes

Fixes #3631 by @rwidom

Description

This PR fixes a number of aspects of the iNaturalist ingestion, with the hope that we can turn it back on in production after this!

  • Addresses the issue in Update Catalog of Life data source url for iNaturalist DAG #3631 by pointing to a "latest" version of the COL data. This link doesn't include a build/date/version number, so it shouldn't need updating down the line.
  • Fixes an issue where the media type was not provided when running various common.sql methods directly.
  • Updates the iNaturalist schemas, which had changed since the last time this was successfully run.
  • Uses the minimum photo ID when creating the initial ranges, rather than starting from 0.
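As a rough illustration of that last change (a sketch only; the function name and batch size are made up, not the DAG's actual code), seeding the ranges from the minimum ID rather than 0 looks like:

```python
def initial_ranges(min_photo_id: int, max_photo_id: int, batch_size: int):
    """Yield inclusive (start, end) ID ranges beginning at the smallest photo ID.

    Starting from min_photo_id instead of 0 avoids generating batches of
    IDs that cannot exist when the provider's IDs begin well above zero.
    """
    start = min_photo_id
    while start <= max_photo_id:
        yield start, min(start + batch_size - 1, max_photo_id)
        start += batch_size
```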

It seems it had been a minute since we'd last gotten iNaturalist running 😅 There were quite a few changes, but I think I've covered them all and added references to the documentation to consult if they change again.

Testing Instructions

Due to our use of parameters rather than variables to control when to skip the removal of the source data, testing this is a little tricky:

  1. Enable the iNaturalist DAG, then disable it. A DAG run should start; immediately mark the entire DAG run as failed.
  2. Trigger a new run of the DAG, and uncheck sql_rm_source_data_after_ingesting in the parameter settings before running.
  3. Let the DAG execute and ingest the data! It won't be a whole lot of images (and the initial download of the COL data is about 600MB, so be advised it will take time!)

Checklist

  • My pull request has a descriptive title (not a vague title like Update index.md).
  • My pull request targets the default branch of the repository (main) or a parent feature branch.
  • My commit messages follow best practices.
  • My code follows the established code style of the repository.
  • I added or updated tests for the changes I made (if applicable).
  • I added or updated documentation (if applicable).
  • I tried running the project locally and verified that there are no visible errors.
  • I ran the DAG documentation generator (if applicable).

Developer Certificate of Origin
Version 1.1

Copyright (C) 2004, 2006 The Linux Foundation and its contributors.
1 Letterman Drive
Suite D4700
San Francisco, CA, 94129

Everyone is permitted to copy and distribute verbatim copies of this
license document, but changing it is not allowed.


Developer's Certificate of Origin 1.1

By making a contribution to this project, I certify that:

(a) The contribution was created in whole or in part by me and I
    have the right to submit it under the open source license
    indicated in the file; or

(b) The contribution is based upon previous work that, to the best
    of my knowledge, is covered under an appropriate open source
    license and I have the right under that license to submit that
    work with modifications, whether created in whole or in part
    by me, under the same open source license (unless I am
    permitted to submit under a different license), as indicated
    in the file; or

(c) The contribution was provided directly to me by some other
    person who certified (a), (b) or (c) and I have not modified
    it.

(d) I understand and agree that this project and the contribution
    are public and that a record of the contribution (including all
    personal information I submit with it, including my sign-off) is
    maintained indefinitely and may be redistributed consistent with
    this project or the open source license(s) involved.

@AetherUnbound AetherUnbound requested a review from a team as a code owner February 29, 2024 00:44
@AetherUnbound added labels on Feb 29, 2024: 🟨 priority: medium (not blocking but should be addressed soon), 🛠 goal: fix (bug fix), 💻 aspect: code (concerns the software code in the repository), 🧱 stack: catalog (related to the catalog and Airflow DAGs)
@obulat (Contributor) left a comment:

I can't test this PR because I get this error in the check_for_file_updates step:

[2024-03-04, 06:40:29 UTC] {base.py:83} INFO - Using connection ID 'aws_default' for task execution.
[2024-03-04, 06:40:29 UTC] {connection_wrapper.py:378} INFO - AWS Connection (conn_id='aws_default', conn_type='aws') credentials retrieved from login and password.
[2024-03-04, 06:40:29 UTC] {taskinstance.py:2698} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/taskinstance.py", line 428, in _execute_task
    result = execute_callable(context=context, **execute_callable_kwargs)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/operators/python.py", line 199, in execute
    return_value = self.execute_callable()
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/operators/python.py", line 216, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/opt/airflow/catalog/dags/providers/provider_api_scripts/inaturalist.py", line 205, in compare_update_dates
    last_modified = s3_client.head_object(
  File "/home/airflow/.local/lib/python3.10/site-packages/botocore/client.py", line 553, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/home/airflow/.local/lib/python3.10/site-packages/botocore/client.py", line 1009, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (404) when calling the HeadObject operation: Not Found

I tried setting the AWS_ACCESS_KEY and AWS_SECRET_KEY values (from infrastructure repo) in the .env file, but the error persists.
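For context on what the failing lookup does, here is a hedged sketch of a tolerant variant (the helper name is hypothetical, and `s3_client` is anything duck-typed like boto3's client) that maps a 404 from head_object to None instead of failing the task:

```python
def last_modified_or_none(s3_client, bucket: str, key: str):
    """Return the object's LastModified, or None when the key is absent.

    boto3's head_object raises a ClientError whose `response` payload
    carries the error code; a 404 here just means there is no remote
    file to compare against yet.
    """
    try:
        return s3_client.head_object(Bucket=bucket, Key=key)["LastModified"]
    except Exception as err:
        code = getattr(err, "response", {}).get("Error", {}).get("Code")
        if code in ("404", "NoSuchKey"):
            return None
        raise
```

The duck typing also makes the function easy to exercise with a stub client in tests, without a live S3 endpoint.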

@stacimc (Collaborator) left a comment:

🥳 This works for me!

I found it fiddly to try to stop the initial dagrun in time, so for testing it might be easier to instead just locally temporarily update the default for sql_rm_source_data_after_ingesting to False here. I got 14 images.

@obulat have you modified AIRFLOW_CONN_AWS_DEFAULT in your local .env, maybe from testing something else? Can you reset it to the default from env.template:

AIRFLOW_CONN_AWS_DEFAULT=aws://test_key:test_secret@?region_name=us-east-1&endpoint_url=http%3A%2F%2Fs3%3A5000
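If it helps with debugging a mangled .env value, the URI above decodes like this (a stdlib-only sketch; the value is the env.template default quoted above):

```python
from urllib.parse import parse_qs, urlsplit

uri = "aws://test_key:test_secret@?region_name=us-east-1&endpoint_url=http%3A%2F%2Fs3%3A5000"
parts = urlsplit(uri)
# parse_qs percent-decodes the values, revealing the real endpoint URL.
params = {k: v[0] for k, v in parse_qs(parts.query).items()}
print(parts.username)          # test_key
print(params["endpoint_url"])  # http://s3:5000, the local S3 mock endpoint
```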

@obulat (Contributor) commented Mar 7, 2024

@obulat have you modified AIRFLOW_CONN_AWS_DEFAULT in your local .env, maybe from testing something else? Can you reset it to the default from env.template:

Thank you, @stacimc, I think resetting the local .env fixed the issue.

Next up, an error in the load_catalog_of_life_names step 😆

[2024-03-07, 20:12:07 +03] {inaturalist.py:232} INFO - /var/workflow_output/COL_archive.zip exists, so no Catalog of Life download.
[2024-03-07, 20:12:07 +03] {taskinstance.py:2728} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/taskinstance.py", line 439, in _execute_task
    result = _execute_callable(context=context, **execute_callable_kwargs)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/models/taskinstance.py", line 414, in _execute_callable
    return execute_callable(context=context, **execute_callable_kwargs)
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/operators/python.py", line 200, in execute
    return_value = self.execute_callable()
  File "/home/airflow/.local/lib/python3.10/site-packages/airflow/operators/python.py", line 217, in execute_callable
    return self.python_callable(*self.op_args, **self.op_kwargs)
  File "/opt/airflow/catalog/dags/providers/provider_api_scripts/inaturalist.py", line 257, in load_catalog_of_life_names
    with zipfile.ZipFile(OUTPUT_DIR / local_zip_file) as z:
  File "/usr/local/lib/python3.10/zipfile.py", line 1269, in __init__
    self._RealGetContents()
  File "/usr/local/lib/python3.10/zipfile.py", line 1336, in _RealGetContents
    raise BadZipFile("File is not a zip file")

@AetherUnbound (Collaborator, Author) commented:
@obulat are you copying the COL file into the container? 😮 or just running the DAG?

@obulat (Contributor) commented Mar 7, 2024

@obulat are you copying the COL file into the container? 😮 or just running the DAG?

Just running the DAG... Where is the COL file supposed to be? I tried looking inside the catalog container, but couldn't find anything inside that folder.

@stacimc (Collaborator) commented Mar 7, 2024

@obulat I also received that error when I first tried. My guess was that it had something to do with not stopping the first, automatic dagrun in time and getting into a weird failure state. I wiped my local catalog and then retried it, setting the default for sql_rm_source_data_after_ingesting to False here instead so you can just turn the DAG on and test immediately.

@openverse-bot (Collaborator) commented:

Based on the medium urgency of this PR, the following reviewers are being gently reminded to review this PR:

@rwidom
This reminder is being automatically generated due to the urgency configuration.

Excluding weekend days [1], this PR was ready for review 5 day(s) ago. PRs labelled with medium urgency are expected to be reviewed within 4 weekday(s) [2].

@AetherUnbound, if this PR is not ready for a review, please draft it to prevent reviewers from getting further unnecessary pings.

Footnotes

  1. Specifically, Saturday and Sunday.

  2. For the purpose of these reminders we treat Monday - Friday as weekdays. Please note that the operation that generates these reminders runs at midnight UTC on Monday - Friday. This means that depending on your timezone, you may be pinged outside of the expected range.

@obulat (Contributor) commented Mar 8, 2024

@obulat I also received that error when I first tried. My guess was that it had something to do with not stopping the first, automatic dagrun in time and getting into a weird failure state. I wiped my local catalog and then retried it, setting the default for sql_rm_source_data_after_ingesting to False here instead so you can just turn the DAG on and test immediately.

Thank you, @stacimc! I also didn't stop the first DAG run before it went on to the load_catalog_of_life_names step, so it left a corrupt COL file behind. I added a check for the validity of the zip file that deletes the COL file if it's not valid:

        zpath = OUTPUT_DIR / local_zip_file
        if zpath.exists():
            try:
                # Opening the archive validates the header; a truncated
                # download raises BadZipFile. The context manager ensures
                # the file handle is closed either way.
                with zipfile.ZipFile(zpath):
                    pass
            except zipfile.BadZipFile:
                logger.info("COL file is corrupt, so deleting it.")
                zpath.unlink()
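An equivalent guard as a standalone sketch (the helper name is illustrative, not the DAG's actual code) can lean on zipfile.is_zipfile, which returns False instead of raising:

```python
import zipfile
from pathlib import Path

def ensure_valid_zip(zpath: Path) -> bool:
    """Delete zpath if it is not a readable zip; report whether it remains."""
    if zpath.exists() and not zipfile.is_zipfile(zpath):
        zpath.unlink()  # corrupt partial download; the DAG can re-fetch it
    return zpath.exists()
```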

I also had an error in the s3 steps. Turns out, I didn't have the AWS variables set in the root .env file. That was the reason why my load_to_s3 container always had an error in it.

After I added the variables, the whole DAG ran smoothly.

@obulat (Contributor) left a comment:

I managed to successfully run the DAG locally, and the code and test changes look great.
@AetherUnbound, do you think the error I had with the COL zip file can ever happen in production?

@AetherUnbound (Collaborator, Author) commented:

@AetherUnbound, do you think the error I had with the COL zip file can ever happen in production?

It definitely should not! iNaturalist, like our other provider DAGs, has "max active runs" set to 1. The only reason this happened locally was because we triggered a manual run, which overrides some of these rules and caused the race condition you mentioned. #3847 should help with this scenario in the future because we can just enable the DAG and let the scheduled workflow run instead of having to trigger it with a parameter.

Thanks for the reviews, and the assistance troubleshooting, @stacimc!

@AetherUnbound AetherUnbound merged commit bbea9e6 into main Mar 8, 2024
67 checks passed
@AetherUnbound AetherUnbound deleted the fix/inaturalist-col-data branch March 8, 2024 17:11
Successfully merging this pull request may close these issues.

Update Catalog of Life data source url for iNaturalist DAG