💄

dionhaefner committed Nov 4, 2021
1 parent 7851309 commit f3be41b
Showing 3 changed files with 71 additions and 52 deletions.
2 changes: 1 addition & 1 deletion .gitignore
@@ -76,7 +76,7 @@ instance/
 
 # Sphinx documentation
 doc/_build/
-doc/savefig
+doc/_generated/
 
 # PyBuilder
 target/
4 changes: 2 additions & 2 deletions doc/requirements.txt
@@ -4,7 +4,7 @@ Pillow==8.4.0
 Sphinx==4.2.0
 sphinx-rtd-theme==1.0.0
 ipython==7.29.0
-numpy==1.21.3
 netCDF4==1.5.8
 xarray==0.19.0
 matplotlib==3.4.3
+cmocean==2.0
117 changes: 68 additions & 49 deletions doc/tutorial/analysis.rst

Analysis of Veros output
========================

In this tutorial, we will use `xarray <http://xarray.pydata.org/en/stable/>`__ and `matplotlib <https://matplotlib.org>`__ to load, analyze, and plot the model output. We will also use the `cmocean colormaps <https://matplotlib.org/cmocean/>`__. You can run these commands in `IPython <https://ipython.readthedocs.io/en/stable/>`__ or a `Jupyter Notebook <https://jupyter.org>`__. Just make sure to install the dependencies first::

    $ pip install xarray matplotlib netcdf4 cmocean

The analysis below is performed for a 100-year integration of the :doc:`global_4deg </reference/setups/4deg>` setup from the :doc:`setup gallery </reference/setup-gallery>`.

If you want to run this analysis yourself, you can `download the data here <https://sid.erda.dk/cgi-sid/ls.py?share_id=CD8UzHCj2Q;current_dir=inputdata/tutorial_analysis;flags=f>`__. We access the files through the dictionary ``OUTPUT_FILES``, which contains the paths to the 4 different files:

.. ipython:: python

    OUTPUT_FILES = {
        "snapshot": "4deg.snapshot.nc",
        "averages": "4deg.averages.nc",
        "overturning": "4deg.overturning.nc",
        "energy": "4deg.energy.nc",
    }


.. ipython:: python
    :suppress:

    # actually, we are loading input files through the Veros asset mechanism
    import os
    from veros import tools

    OUTPUT_FILES = tools.get_assets("tutorial_analysis", os.path.join("tutorial", "analysis-assets.json"))

Let's start by importing some packages:

.. ipython:: python

    import xarray as xr
    import numpy as np
    import cmocean

Most of the heavy lifting will be done by ``xarray``, which provides a data structure and API for working with labeled N-dimensional arrays. ``xarray`` datasets automatically keep track of how the values of the underlying arrays map to locations in space and time, which makes them immensely useful for analyzing model output.
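
As a minimal sketch of what these labels buy us (a hypothetical toy array, not part of the Veros output), we can select data by coordinate value instead of by integer position:

```python
import numpy as np
import xarray as xr

# toy temperature field: 2 time steps on a 3-point latitude axis
temp = xr.DataArray(
    np.array([[10.0, 15.0, 20.0], [11.0, 16.0, 21.0]]),
    dims=("Time", "yt"),
    coords={"Time": [0, 1], "yt": [-30.0, 0.0, 30.0]},
    name="temp",
)

# select by coordinate label (latitude 30°N) rather than by integer index
print(float(temp.sel(yt=30.0).isel(Time=-1)))  # 21.0
```

The same ``.sel`` / ``.isel`` pattern is used throughout the rest of this tutorial.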

Load and manipulate averages
----------------------------

In order to load our first output file and display its contents, execute the following:

.. ipython:: python

    ds_avg = xr.open_dataset(OUTPUT_FILES["averages"])
    ds_avg

We can easily access and modify individual data variables and their attributes. To demonstrate this, let's convert the units of the barotropic stream function from :math:`\frac{m^{3}}{s}` to :math:`Sv`:

.. ipython:: python

    ds_avg["psi"] = ds_avg.psi / 1e6
    ds_avg["psi"].attrs["units"] = "Sv"

Now, we select the last time slice of ``psi`` and plot it:

.. ipython:: python
    :okwarning:

    @savefig psi.png width=5in
    ds_avg["psi"].isel(Time=-1).plot.contourf(levels=50, cmap="cmo.balance")

In order to compute the decadal mean (over the last 10 years) of the zonal-mean ocean salinity, use the following command:

.. ipython:: python
    :okwarning:

    @savefig salt.png width=5in
    (
        ds_avg["salt"]
        .isel(Time=slice(-10, None))
        .mean(dim=("Time", "xt"))
        .plot.contourf(levels=50, cmap="cmo.haline")
    )

One can also compute the meridional mean temperature. Since the model output is defined on a regular latitude / longitude grid, the grid cell area decreases towards the poles.
To get an accurate mean value, we need to weight each cell by its area:

.. ipython:: python

    ds_snap = xr.open_dataset(OUTPUT_FILES["snapshot"])

    # use cell area as weights, replace missing values (land) with 0
    weights = ds_snap["area_t"].fillna(0)

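
To see why the weighting matters, here is a small sketch with made-up numbers: on a regular latitude / longitude grid, the cell area is proportional to the cosine of latitude, so an equatorial cell should count more than a polar one.

```python
import numpy as np
import xarray as xr

# hypothetical field on a regular latitude grid: nonzero only at the equator
lat = np.array([-60.0, -30.0, 0.0, 30.0, 60.0])
field = xr.DataArray(np.array([0.0, 0.0, 1.0, 0.0, 0.0]), dims="yt", coords={"yt": lat})

# on a regular lat/lon grid, cell area is proportional to cos(latitude)
weights = np.cos(np.deg2rad(field.yt))
weights.name = "weights"

unweighted = float(field.mean())
weighted = float(field.weighted(weights).mean())

# the equatorial cell covers more area, so the weighted mean is larger here
print(unweighted, weighted)
```

In the tutorial itself we use the exact cell areas (``area_t``) from the model snapshot instead of this cosine approximation.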
Now, we can calculate the meridional mean temperature (via ``xarray``'s ``.weighted`` method) and plot it:

.. ipython:: python
    :okwarning:

    @savefig temp.png width=5in
    (
        ds_avg["temp"]
        .isel(Time=-1)
        .weighted(weights)
        .mean(dim="yt")
        .plot.contourf(vmin=-2, vmax=22, levels=25, cmap="cmo.thermal")
    )

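
A zero weight removes a cell from the average entirely, which is exactly what we want for land points (``xarray`` does not allow NaNs in the weights themselves, hence the ``fillna(0)`` above). A small sketch with made-up values:

```python
import numpy as np
import xarray as xr

# hypothetical field with one missing (land) cell
field = xr.DataArray([10.0, 20.0, np.nan], dims="x")
area = xr.DataArray([1.0, 1.0, np.nan], dims="x")

# replace missing areas with 0 so the land cell drops out of the average
weights = area.fillna(0)
mean = float(field.weighted(weights).mean())
print(mean)  # 15.0
```
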
Explore overturning circulation
-------------------------------

.. ipython:: python

    ds_ovr = xr.open_dataset(OUTPUT_FILES["overturning"])
    ds_ovr

Let's convert the units of the meridional overturning circulation (MOC) from :math:`\frac{m^{3}}{s}` to :math:`Sv` and plot it:

.. ipython:: python
    :okwarning:

    ds_ovr["vsf_depth"] = ds_ovr.vsf_depth / 1e6
    ds_ovr.vsf_depth.attrs["long_name"] = "MOC"
    ds_ovr.vsf_depth.attrs["units"] = "Sv"

    @savefig vsf_depth_2d.png width=5in
    ds_ovr.vsf_depth.isel(Time=-1).plot.contourf(levels=50, cmap="cmo.balance")

Plot time series
----------------

Let's have a look at the ``Time`` coordinate of the dataset:

.. ipython:: python

    ds_ovr["Time"].isel(Time=slice(10,))

We can see that it has the type ``np.timedelta64``, which by default has a resolution of nanoseconds. In order to have a more meaningful x-axis in our figures, we add another coordinate ``years`` by dividing ``Time`` by the length of a year (360 days in Veros):

.. ipython:: python

    years = ds_ovr["Time"] / np.timedelta64(360, "D")
    ds_ovr = ds_ovr.assign_coords(years=("Time", years.data))

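
Dividing one ``timedelta64`` value by another yields a plain floating-point ratio, which is what makes this trick work; a quick standalone check:

```python
import numpy as np

one_year = np.timedelta64(360, "D")  # one model year in Veros
t = np.timedelta64(900, "D")         # some model time

# dividing two timedelta64 values returns a dimensionless float
years = t / one_year
print(years)  # 2.5
```
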
Let's select values of the array by labels instead of integer locations and plot a time series of the overturning minimum between 40°N and 60°N and 550-1800 m depth, with years on the x-axis:

.. ipython:: python

    @savefig vsf_depth_min.png width=5in
    (
        ds_ovr.vsf_depth
        .sel(zw=slice(-1810., -550.), yu=slice(40., 60.))
        .min(dim=("yu", "zw"))
        .plot(x="years")
    )
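
The same label-based slicing pattern can be checked on a toy array (hypothetical values, mimicking the dimensions of ``vsf_depth``); note that the ``zw`` slice bounds follow the coordinate's own deep-to-shallow ordering:

```python
import numpy as np
import xarray as xr

# toy overturning array with dims (Time, yu, zw)
vsf = xr.DataArray(
    np.arange(24.0).reshape(2, 3, 4) - 12.0,
    dims=("Time", "yu", "zw"),
    coords={
        "Time": [0, 1],
        "yu": [35.0, 45.0, 55.0],
        "zw": [-2000.0, -1500.0, -1000.0, -500.0],
    },
)

# label-based selection keeps only 40°N-60°N and the chosen depth range,
# then the minimum over both spatial dims leaves a time series
series = vsf.sel(yu=slice(40.0, 60.0), zw=slice(-1810.0, -550.0)).min(dim=("yu", "zw"))
print(series.values)  # [-7.  5.]
```
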
