Commit

Merge remote-tracking branch 'upstream/main' into feedstock_rc
trexfeathers committed Oct 25, 2023
2 parents cbee22f + f30db0d commit 75d2c8e
Showing 45 changed files with 1,373 additions and 352 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/benchmarks_run.yml
@@ -29,7 +29,7 @@ jobs:
env:
IRIS_TEST_DATA_LOC_PATH: benchmarks
IRIS_TEST_DATA_PATH: benchmarks/iris-test-data
IRIS_TEST_DATA_VERSION: "2.19"
IRIS_TEST_DATA_VERSION: "2.21"
# Lets us manually bump the cache to rebuild
ENV_CACHE_BUILD: "0"
TEST_DATA_CACHE_BUILD: "2"
2 changes: 1 addition & 1 deletion .github/workflows/ci-manifest.yml
@@ -23,4 +23,4 @@ concurrency:
jobs:
manifest:
name: "check-manifest"
-uses: scitools/workflows/.github/workflows/ci-manifest.yml@2023.09.1
+uses: scitools/workflows/.github/workflows/ci-manifest.yml@2023.10.0
2 changes: 1 addition & 1 deletion .github/workflows/ci-tests.yml
@@ -50,7 +50,7 @@ jobs:
session: "tests"

env:
IRIS_TEST_DATA_VERSION: "2.19"
IRIS_TEST_DATA_VERSION: "2.21"
ENV_NAME: "ci-tests"

steps:
2 changes: 1 addition & 1 deletion .github/workflows/refresh-lockfiles.yml
@@ -14,5 +14,5 @@ on:

jobs:
refresh_lockfiles:
-uses: scitools/workflows/.github/workflows/refresh-lockfiles.yml@2023.09.1
+uses: scitools/workflows/.github/workflows/refresh-lockfiles.yml@2023.10.0
secrets: inherit
2 changes: 1 addition & 1 deletion .github/workflows/stale.yml
@@ -46,7 +46,7 @@ jobs:
If you still care about this issue, then please either:
* Re-open this issue, if you have sufficient permissions, or
-* Add a comment pinging `@SciTools/iris-devs` who will re-open on your behalf.
+* Add a comment stating that this is still relevant and someone will re-open it on your behalf.
# Comment on the staled prs while closed.
close-pr-message: |
6 changes: 3 additions & 3 deletions .pre-commit-config.yaml
@@ -13,7 +13,7 @@ minimum_pre_commit_version: 1.21.0

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
-rev: v4.4.0
+rev: v4.5.0
hooks:
# Prevent giant files from being committed.
- id: check-added-large-files
@@ -29,14 +29,14 @@ repos:
- id: no-commit-to-branch

- repo: https://github.com/codespell-project/codespell
rev: "v2.2.5"
rev: "v2.2.6"
hooks:
- id: codespell
types_or: [asciidoc, python, markdown, rst]
additional_dependencies: [tomli]

- repo: https://github.com/psf/black
-rev: 23.9.1
+rev: 23.10.0
hooks:
- id: black
pass_filenames: false
2 changes: 1 addition & 1 deletion benchmarks/benchmarks/cperf/__init__.py
@@ -53,7 +53,7 @@ def setup(self, file_type, three_d, three_times):
if three_d:
create_kwargs["n_levels"] = 71

-# Will re-use a file if already present.
+# Will reuse a file if already present.
file_path = make_cubesphere_testfile(**create_kwargs)

else:
8 changes: 7 additions & 1 deletion benchmarks/benchmarks/experimental/ugrid/regions_combine.py
@@ -23,7 +23,7 @@
from iris.experimental.ugrid import PARSE_UGRID_ON_LOAD
from iris.experimental.ugrid.utils import recombine_submeshes

-from ... import TrackAddedMemoryAllocation
+from ... import TrackAddedMemoryAllocation, on_demand_benchmark
from ...generate_data.ugrid import make_cube_like_2d_cubesphere


@@ -200,6 +200,8 @@ class CombineRegionsComputeRealData(MixinCombineRegions):
def time_compute_data(self, n_cubesphere):
_ = self.recombined_cube.data

+# Vulnerable to noise, so disabled by default.
+@on_demand_benchmark
@TrackAddedMemoryAllocation.decorator
def track_addedmem_compute_data(self, n_cubesphere):
_ = self.recombined_cube.data
@@ -217,6 +219,8 @@ def time_save(self, n_cubesphere):
# Save to disk, which must compute data + stream it to file.
save(self.recombined_cube, "tmp.nc")

+# Vulnerable to noise, so disabled by default.
+@on_demand_benchmark
@TrackAddedMemoryAllocation.decorator
def track_addedmem_save(self, n_cubesphere):
save(self.recombined_cube, "tmp.nc")
@@ -245,6 +249,8 @@ def time_stream_file2file(self, n_cubesphere):
# Save to disk, which must compute data + stream it to file.
save(self.recombined_cube, "tmp.nc")

+# Vulnerable to noise, so disabled by default.
+@on_demand_benchmark
@TrackAddedMemoryAllocation.decorator
def track_addedmem_stream_file2file(self, n_cubesphere):
save(self.recombined_cube, "tmp.nc")
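
The track_addedmem_* benchmarks above all gain the same opt-in gate. As a rough sketch only (the real helper lives in the Iris benchmarks package, and the ON_DEMAND_BENCHMARKS environment variable name here is an assumption), such a decorator can simply hide the benchmark unless the run explicitly asks for it:

import os

def on_demand_benchmark(benchmark_object):
    # Hypothetical opt-in gate for noise-prone benchmarks: expose the
    # decorated class or method only when explicitly requested.
    if "ON_DEMAND_BENCHMARKS" in os.environ:
        return benchmark_object
    # Otherwise return None, so the benchmark runner has nothing to collect.
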
2 changes: 1 addition & 1 deletion benchmarks/benchmarks/generate_data/stock.py
@@ -39,7 +39,7 @@ def _external(func_name_, temp_file_dir, **kwargs_):
)
if not REUSE_DATA or not save_path.is_file():
# The xios functions take control of save location so need to move to
-# a more specific name that allows re-use.
+# a more specific name that allows reuse.
actual_path = run_function_elsewhere(
_external,
func_name_=func_name,
2 changes: 1 addition & 1 deletion benchmarks/benchmarks/load/__init__.py
@@ -69,7 +69,7 @@ def time_realise(self, _, __, ___, ____) -> None:


class STASHConstraint:
-# xyz sizes mimic LoadAndRealise to maximise file re-use.
+# xyz sizes mimic LoadAndRealise to maximise file reuse.
params = [[(2, 2, 2), (1280, 960, 5), (2, 2, 1000)], ["FF", "PP"]]
param_names = ["xyz", "file_format"]

4 changes: 3 additions & 1 deletion benchmarks/benchmarks/save.py
@@ -16,7 +16,7 @@
from iris import save
from iris.experimental.ugrid import save_mesh

-from . import TrackAddedMemoryAllocation
+from . import TrackAddedMemoryAllocation, on_demand_benchmark
from .generate_data.ugrid import make_cube_like_2d_cubesphere


@@ -47,6 +47,8 @@ def time_netcdf_save_mesh(self, n_cubesphere, is_unstructured):
if is_unstructured:
self._save_mesh(self.cube)

+# Vulnerable to noise, so disabled by default.
+@on_demand_benchmark
@TrackAddedMemoryAllocation.decorator
def track_addedmem_netcdf_save(self, n_cubesphere, is_unstructured):
# Don't need to copy the cube here since track_ benchmarks don't
2 changes: 1 addition & 1 deletion benchmarks/bm_runner.py
@@ -82,7 +82,7 @@ def _prep_data_gen_env() -> None:
else:
echo("Setting up the data generation environment ...")
# Get Nox to build an environment for the `tests` session, but don't
-# run the session. Will re-use a cached environment if appropriate.
+# run the session. Will reuse a cached environment if appropriate.
_subprocess_runner(
[
"nox",
2 changes: 1 addition & 1 deletion docs/gallery_code/meteorology/plot_COP_1d.py
@@ -54,7 +54,7 @@ def main():
)

# Generate area-weights array. As e1 and a1b are on the same grid we can
-# do this just once and re-use. This method requires bounds on lat/lon
+# do this just once and reuse. This method requires bounds on lat/lon
# coords, so let's add some in sensible locations using the "guess_bounds"
# method.
e1.coord("latitude").guess_bounds()
22 changes: 6 additions & 16 deletions docs/gallery_code/meteorology/plot_COP_maps.py
@@ -171,23 +171,13 @@ def main():
)
plt.gca().coastlines()

-# Now add a colourbar who's leftmost point is the same as the leftmost
-# point of the left hand plot and rightmost point is the rightmost
-# point of the right hand plot.
-
-# Get the positions of the 2nd plot and the left position of the 1st plot.
-left, bottom, width, height = ax_array[1].get_position().bounds
-first_plot_left = ax_array[0].get_position().bounds[0]
-
-# The width of the colorbar should now be simple.
-width = left - first_plot_left + width
-
-# Add axes to the figure, to place the colour bar.
-colorbar_axes = fig.add_axes([first_plot_left, 0.18, width, 0.03])
-
-# Add the colour bar.
+# Now add a colour bar which spans the two plots. Here we pass Figure.axes
+# which is a list of all (two) axes currently on the figure. Note that
+# these are different to the contents of ax_array, because those were
+# standard Matplotlib Axes that Iris automatically replaced with Cartopy
+# GeoAxes.
cbar = plt.colorbar(
-contour_result, colorbar_axes, orientation="horizontal"
+contour_result, ax=fig.axes, aspect=60, orientation="horizontal"
)

# Label the colour bar and add ticks.
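
The replacement call relies on a standard Matplotlib feature: when colorbar is given a list of Axes via ax=, it steals space from all of them, which is what makes the manual positioning arithmetic above unnecessary. A minimal standalone example of the same call, with invented data in place of the gallery cubes:

import matplotlib.pyplot as plt
import numpy as np

fig, ax_array = plt.subplots(nrows=1, ncols=2, figsize=(8, 4))
data = np.random.random((10, 10))
for ax in ax_array:
    mappable = ax.pcolormesh(data, vmin=0, vmax=1)

# One horizontal colour bar spanning both plots; Matplotlib reserves the
# space automatically when a list of Axes is passed via ax=.
fig.colorbar(mappable, ax=list(ax_array), orientation="horizontal", aspect=60)
plt.show()
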
6 changes: 3 additions & 3 deletions docs/src/developers_guide/contributing_documentation_easy.rst
@@ -81,9 +81,9 @@ Describing what you've changed and why will help the person who reviews your cha
.. tip::

If you're not sure that you're making your pull request right, or have a
-question, then make it anyway! You can then comment on it tagging
-``@SciTools/iris-devs`` to ask your question (then edit your pull request if
-you need to).
+question, then make it anyway! You can then comment on it to ask your
+question, then someone from the dev team will be happy to help you out (then
+edit your pull request if you need to).

What Happens Next?
^^^^^^^^^^^^^^^^^^