
Add new cf-xarray demo notebook #88

Merged
dcherian merged 5 commits from cf-x into master on Jan 14, 2022

Conversation

@dcherian (Contributor) commented on Apr 5, 2021

Same code almost works on CMOR, CESM-POP, and CESM-MOM6

@dcherian (Contributor, Author) commented on Apr 5, 2021

@andersy005 we're getting duplicate runs here
[screenshot: duplicate CI runs]

@klindsay28 (Collaborator) commented

In my own git repos with notebooks, I've found figure-generating notebooks to be a bit of a pain to deal with, and I'm wondering how pop-tools is going to handle this.

  1. When code that the notebooks rely on is updated, you want to confirm that the notebooks still work. (Out-of-date documentation can be worse than no documentation.) It looks like this notebook relies on the NCAR environment, so I don't see how it could be rerun automatically, much less with external CI; manual action seems necessary. That's a pain.

  2. In my own notebooks, I have found that updates to matplotlib can lead to slight changes in plots, which is a pain for notebooks under version control, and I haven't yet found a workflow I'm satisfied with for isolating such changes. This can be mitigated by pinning a version of matplotlib (see the sketch after this list), which pop-tools doesn't appear to be doing, but pinning also has the drawback of potentially running a matplotlib version with known bugs.

As the number of notebooks in pop-tools increases, I'm wondering if there is a recommended, not-so-painful workflow for navigating these issues/tensions.
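A minimal sketch of the pinning idea in point 2, assuming a conda environment file (the file name, environment name, and version numbers are illustrative, not what pop-tools actually uses):

```yaml
# environment.yml (illustrative): pin matplotlib exactly so notebook figures
# render the same way on every rebuild; bump the pin deliberately and
# regenerate figures in one commit.
name: pop-tools-docs
channels:
  - conda-forge
dependencies:
  - python=3.9
  - matplotlib=3.4.2
  - pop-tools
```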

@dcherian (Contributor, Author) commented on Jul 8, 2021

The code is slightly simpler now but I've opened more upstream issues to see if we can simplify further. Let's wait on merging for a while.

@klindsay28 one solution could be to use jupyter-book to build the notebooks rather than rendering notebooks with output. That will at least check that things run, but it won't check that the output figures are exactly the same (we'd need to make the required datasets public, though). matplotlib runs its own regression tests with a pytest plugin (I think) that checks for approximate equality between images.
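A minimal sketch of that kind of image comparison, assuming the plugin meant here is pytest-mpl (the test name, tolerance, and plotted data are illustrative):

```python
# Illustrative pytest-mpl test: the returned figure is compared against a
# stored baseline image, allowing small rendering differences via `tolerance`.
import matplotlib.pyplot as plt
import numpy as np
import pytest


@pytest.mark.mpl_image_compare(tolerance=5)
def test_demo_figure():
    x = np.linspace(0, 2 * np.pi, 100)
    fig, ax = plt.subplots()
    ax.plot(x, np.sin(x))
    return fig
```

Baselines would be generated once with `pytest --mpl-generate-path=baseline` and then checked on each run with `pytest --mpl`.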

@dcherian (Contributor, Author) commented

Merging since it's useful. I've disabled execution so Keith's point still stands.

dcherian merged commit e6c3682 into master on Jan 14, 2022
dcherian deleted the cf-x branch on Jan 14, 2022