Use pytest-vcr to avoid network usage #1250
base: main
Conversation
How would you feel about this data being stored outside of the repo, perhaps via git-lfs?
That's a possibility; the trouble with git-lfs is the restrictive limits: you get 1 GB of storage and 1 GB of download bandwidth per month. Storage is fine given this is only about 17 MB, but bandwidth may not be, since download bandwidth is shared across all forks. At roughly 17 MB per clone, the 1 GB quota covers only about 60 full downloads a month, so if all 167 forks attempted to download these files, we'd go over pretty quickly. Either way works for me, but there are advantages and disadvantages to both.
Hmm, I think I may have had some files already cached, so the cassettes are missing some requests. It probably has to do with the order in which they're serviced. I wonder if there's a way to cache by URL instead, so we don't end up with a bunch of duplicates. It may also be useful to set CI to not cache anything, to ensure downloading works; it depends on what the test is for, I guess.
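(For the duplicate-caching question above, a minimal sketch, not from this PR: pytest-vcr exposes VCR.py options through a `vcr_config` fixture, and matching recorded interactions on URI alone makes playback independent of the order in which requests are made. The fixture body below is illustrative.)

```python
# conftest.py -- illustrative sketch; both options are standard VCR.py ones.
import pytest

@pytest.fixture(scope="module")
def vcr_config():
    return {
        # Match recorded interactions on URI alone, so playback does not
        # depend on the order in which tests issue their requests.
        "match_on": ["uri"],
        # "once" records the cassette on the first run and replays it on
        # later runs without touching the network.
        "record_mode": "once",
    }
```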
If you're using
I had no idea
This would be nice for reducing the number of failures we get from the OWS and NASA tile servers as well. It looks like you could try adding the git-lfs download to the Actions cache (some useful links/references in actions/checkout#834) to reduce the bandwidth concerns. Another idea/option might be to put the binary data at a GitHub gist URL or in another repo like
Rationale
This is an attempt to eliminate network usage in the test suite by using pytest-vcr to record all network activity into cassettes that can be replayed offline.
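As a minimal illustration of the mechanism (the test name and URL below are hypothetical, not from this PR), pytest-vcr's `@pytest.mark.vcr` marker records a test's HTTP traffic on the first run and replays it offline afterwards:

```python
import pytest
import requests

@pytest.mark.vcr
def test_fetch_srtm_tile():
    # First run: the request goes over the network and is recorded to a
    # cassette in a "cassettes" directory next to this test module.
    # Subsequent runs: the response is replayed from that cassette.
    response = requests.get("https://example.com/srtm/N00E006.hgt.zip")
    assert response.status_code == 200
```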
Implications
Tests will not require network access at all, except when explicitly enabled. However, there is a rather large cost: about 17 MB of recorded cassettes. They will probably compress well, but I'm not sure how much larger the repository download will become.
This also includes results for SRTM, so those code paths should now be at least smoke-tested on CI. I do not have an API key for OS, so I did not add those cassettes.
To ensure that things don't silently break against the live servers, the cron job on Travis is set to not use VCR at all.
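One way to approximate that behavior (a sketch, not this PR's actual implementation; `TRAVIS_EVENT_TYPE` is a real Travis environment variable, but the fixture logic here is hypothetical) is to force VCR's "all" record mode on cron builds, so every request hits the live servers instead of a cassette:

```python
# conftest.py -- hypothetical sketch of bypassing cassettes on cron builds.
import os
import pytest

@pytest.fixture(scope="module")
def vcr_config():
    if os.environ.get("TRAVIS_EVENT_TYPE") == "cron":
        # "all" ignores existing cassettes and re-records everything,
        # so the scheduled build exercises the real endpoints.
        return {"record_mode": "all"}
    # Regular CI runs replay the committed cassettes without network access.
    return {"record_mode": "once"}
```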