Restructure documentation and add or streamline beginner-friendly examples (#265)

* fix deramp residual and optimizer

* Restructure documentation sections and headers

* Capitalize xDEM

* Update what_is_xdem.rst

* Update install_xdem.rst

* Update what_is_xdem.rst

* Update index.rst

* Update first_steps.rst

* Point Quickstart to minigallery example

* Update docs/source/first_steps.rst

* Update plot_dem_subtraction.py example

* Update docs/source/first_steps.rst

* Renamed first steps into quick start

* Update plot_dem_subtraction.py text

* Update coregistration.py example to follow latest dev

* Small update on previous commit

* Update .rst according to previous commits

* Minor update to comparison.py example and associated rst

* Update plot_nuth_kaab example

* Showcase how to plot outlines

* Update plot_nuth_kaab example

* Update plot_terrain_attributes example

* Update ICP example

* Update blockwise coreg example

* Update norm regional hypso example

* Easy fixes following review comments

* Modify section 'About xdem' and rename sections

* Fix plot failure due to inconsistent Raster or array object

* Fix FutureWarning on array indexing

* Add temp file of saving example to gitignore

* Homogenize usage of xDEM and existing TODOs

* Change objective to modularity to match term of OGGM, PyTorch, etc

* Get geoutils in development

* Fix to get geoutils directly from git

Co-authored-by: Amaury Dehecq <[email protected]>
Co-authored-by: adehecq <[email protected]>
3 people authored May 25, 2022
1 parent beeaf86 commit c809245
Showing 23 changed files with 312 additions and 150 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -143,3 +143,4 @@ examples/*/data/
auto_examples/
gen_modules/
examples/*/processed
examples/temp.tif
5 changes: 4 additions & 1 deletion dev-environment.yml
@@ -17,7 +17,6 @@ dependencies:
- proj-data
- scikit-gstat>=0.6.8
- pytransform3d
- geoutils>=0.0.5

# Development-specific
- pytest
@@ -29,3 +28,7 @@ dependencies:
- flake8
- sphinx-autodoc-typehints
- sphinx-gallery

- pip:
- git+https://github.com/GlacioHack/GeoUtils.git

17 changes: 17 additions & 0 deletions docs/source/about_xdem.rst
@@ -0,0 +1,17 @@
.. _about_xdem:

About xDEM
==========

xDEM is a set of open-source Python tools to facilitate the postprocessing of DEMs, and more generally of georeferenced rasters. It is designed by geoscientists for geoscientists, although our group currently has a strong focus on glaciological applications.

We are not software developers, but we try our best to offer tools that can be useful to a larger group, that are well documented, reliable and maintained. All development and maintenance is done on a voluntary basis and we welcome new contributors. Information on how to contribute can be found in the dedicated page of our `GitHub repository <https://github.com/GlacioHack/xdem/blob/main/CONTRIBUTING.md>`_.

The core concepts behind *xDEM* are:

**Ease of use**: Most operations require only a few lines of code. The module was developed with a focus on remote sensing and geoscience applications.

**Modularity**: We offer a set of options, rather than favoring a single method. Everyone should be able to contribute to xDEM and extend its functionality.

**Reproducibility**: Version control, releases archived with a DOI, and test-based development ensure that our code always performs as expected.

4 changes: 2 additions & 2 deletions docs/source/biascorr.rst
@@ -8,9 +8,9 @@ Bias corrections correspond to transformations that cannot be described as a 3-d
Directional biases
------------------

TODO
TODO: In construction

Terrain biases
--------------

TODO
TODO: In construction
5 changes: 3 additions & 2 deletions docs/source/code/comparison.py
@@ -55,9 +55,10 @@
)

# The example DEMs are void-free, so let's make some random voids.
ddem.data.mask = np.zeros_like(ddem.data, dtype=bool) # Reset the mask
# Introduce 50000 nans randomly throughout the dDEM.
ddem.data.mask.ravel()[np.random.choice(ddem.data.size, 50000, replace=False)] = True
mask = np.zeros_like(ddem.data, dtype=bool)
mask.ravel()[(np.random.choice(ddem.data.size, 50000, replace=False))] = True
ddem.set_mask(mask)

# SUBSECTION: Linear spatial interpolation

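The interpolation calls referenced by ``comparison.rst`` (lines 65, 69 and 73 of this file) fall outside this hunk. A rough sketch of filling the voids created above, assuming a ``dDEM.interpolate()`` method with these method and keyword names (to be checked against the current API), and where ``ref_dem`` and ``glacier_outlines`` stand for the reference DEM and glacier vector used elsewhere in the examples:

.. code-block:: python

    # Simple linear (bilinear) infilling of the voids (sketch only).
    ddem_linear = ddem.interpolate(method="linear")

    # Hypsometric approaches additionally need a reference elevation and outlines
    # delimiting the areas to treat (keyword names are assumptions).
    ddem_local_hypso = ddem.interpolate(
        method="local_hypsometric",
        reference_elevation=ref_dem,
        mask=glacier_outlines,
    )
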
26 changes: 11 additions & 15 deletions docs/source/code/coregistration.py
@@ -11,29 +11,25 @@

# Load the data using xdem and geoutils (could be with rasterio and geopandas instead)
# Load a reference DEM from 2009
reference_dem = xdem.DEM(xdem.examples.get_path("longyearbyen_ref_dem"))
ref_dem = xdem.DEM(xdem.examples.get_path("longyearbyen_ref_dem"))
# Load a moderately well aligned DEM from 1990
dem_to_be_aligned = xdem.DEM(xdem.examples.get_path("longyearbyen_tba_dem")).reproject(reference_dem, silent=True)
tba_dem = xdem.DEM(xdem.examples.get_path("longyearbyen_tba_dem")).reproject(ref_dem, silent=True)
# Load glacier outlines from 1990. This will act as the unstable ground.
glacier_outlines = gu.Vector(xdem.examples.get_path("longyearbyen_glacier_outlines"))

# Prepare the inputs for coregistration.
ref_data = reference_dem.data.squeeze() # This is a numpy 2D array/masked_array
tba_data = dem_to_be_aligned.data.squeeze() # This is a numpy 2D array/masked_array
# This is a boolean numpy 2D array. Note the bitwise not (~) symbol
inlier_mask = ~glacier_outlines.create_mask(reference_dem)
transform = reference_dem.transform # This is a rio.transform.Affine object.
inlier_mask = ~glacier_outlines.create_mask(ref_dem)

########################
# SECTION: Nuth and Kääb
########################

nuth_kaab = coreg.NuthKaab()
# Fit the data to a suitable x/y/z offset.
nuth_kaab.fit(ref_data, tba_data, transform=transform, inlier_mask=inlier_mask)
nuth_kaab.fit(ref_dem, tba_dem, inlier_mask=inlier_mask)

# Apply the transformation to the data (or any other data)
aligned_dem = nuth_kaab.apply(tba_data, transform=transform)
aligned_dem = nuth_kaab.apply(tba_dem)

####################
# SECTION: Deramping
@@ -42,21 +38,21 @@
# Instantiate a 1st order deramping object.
deramp = coreg.Deramp(degree=1)
# Fit the data to a suitable polynomial solution.
deramp.fit(ref_data, tba_data, transform=transform, inlier_mask=inlier_mask)
deramp.fit(ref_dem, tba_dem, inlier_mask=inlier_mask)

# Apply the transformation to the data (or any other data)
deramped_dem = deramp.apply(dem_to_be_aligned.data, transform=dem_to_be_aligned.transform)
deramped_dem = deramp.apply(tba_dem)

##########################
# SECTION: Bias correction
##########################

bias_corr = coreg.BiasCorr()
# Note that the transform argument is not needed, since it is a simple vertical correction.
bias_corr.fit(ref_data, tba_data, inlier_mask=inlier_mask, transform=reference_dem.transform)
bias_corr.fit(ref_dem, tba_dem, inlier_mask=inlier_mask)

# Apply the bias to a DEM
corrected_dem = bias_corr.apply(tba_data, transform=dem_to_be_aligned.transform)
corrected_dem = bias_corr.apply(tba_dem)

# Use median bias instead
bias_median = coreg.BiasCorr(bias_func=np.median)
@@ -70,10 +66,10 @@
# Instantiate the object with default parameters
icp = coreg.ICP()
# Fit the data to a suitable transformation.
icp.fit(ref_data, tba_data, transform=transform, inlier_mask=inlier_mask)
icp.fit(ref_dem, tba_dem, inlier_mask=inlier_mask)

# Apply the transformation matrix to the data (or any other data)
aligned_dem = icp.apply(tba_data, transform=transform)
aligned_dem = icp.apply(tba_dem)

###################
# SECTION: Pipeline
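
Taken together, the edits above move the example from array-plus-transform inputs to passing DEM objects directly to ``fit()`` and ``apply()``. A minimal end-to-end sketch assembled from the updated lines (and therefore tied to the development versions of xdem and geoutils that this commit targets) would be:

.. code-block:: python

    import geoutils as gu
    import xdem
    from xdem import coreg

    # Reference DEM (2009) and DEM to be aligned (1990), reprojected onto the same grid.
    ref_dem = xdem.DEM(xdem.examples.get_path("longyearbyen_ref_dem"))
    tba_dem = xdem.DEM(xdem.examples.get_path("longyearbyen_tba_dem")).reproject(ref_dem, silent=True)

    # Glacier outlines mark unstable ground; the inlier mask is everything outside them.
    glacier_outlines = gu.Vector(xdem.examples.get_path("longyearbyen_glacier_outlines"))
    inlier_mask = ~glacier_outlines.create_mask(ref_dem)

    # Estimate a horizontal and vertical shift (Nuth and Kääb), then apply it.
    nuth_kaab = coreg.NuthKaab()
    nuth_kaab.fit(ref_dem, tba_dem, inlier_mask=inlier_mask)
    aligned_dem = nuth_kaab.apply(tba_dem)
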
12 changes: 6 additions & 6 deletions docs/source/comparison.rst
@@ -17,7 +17,7 @@ dDEM interpolation
------------------
There are many approaches to interpolate a dDEM.
A good comparison study for glaciers is McNabb et al., (`2019 <https://doi.org/10.5194/tc-13-895-2019>`_).
So far, ``xdem`` has three types of interpolation:
So far, xDEM has three types of interpolation:

- Linear spatial interpolation
- Local hypsometric interpolation
@@ -26,7 +26,7 @@ So far, ``xdem`` has three types of interpolation:
Let's first create a :class:`xdem.ddem.dDEM` object to experiment on:

.. literalinclude:: code/comparison.py
:lines: 51-60
:lines: 51-61


Linear spatial interpolation
@@ -35,7 +35,7 @@ Linear spatial interpolation (also often called bilinear interpolation) of dDEMs


.. literalinclude:: code/comparison.py
:lines: 64
:lines: 65

.. plot:: code/comparison_plot_spatial_interpolation.py

@@ -51,7 +51,7 @@ Then, voids are interpolated by replacing them with what "should be there" at th


.. literalinclude:: code/comparison.py
:lines: 68
:lines: 69

.. plot:: code/comparison_plot_local_hypsometric_interpolation.py

@@ -67,7 +67,7 @@ This is advantageous in respect to areas where voids are frequent, as not even a
Of course, the accuracy of such an averaging is much lower than if the local hypsometric approach is used (assuming it is possible).

.. literalinclude:: code/comparison.py
:lines: 72
:lines: 73

.. plot:: code/comparison_plot_regional_hypsometric_interpolation.py

@@ -76,7 +76,7 @@

The DEMCollection object
------------------------
Keeping track of multiple DEMs can be difficult when many different extents, resolutions and CRSs are involved, and :class:`xdem.demcollection.DEMCollection` is ``xdem``'s answer to make this simple.
Keeping track of multiple DEMs can be difficult when many different extents, resolutions and CRSs are involved, and :class:`xdem.demcollection.DEMCollection` is xDEM's answer to make this simple.
We need metadata on the timing of these products.
The DEMs can be provided with the ``datetime=`` argument upon instantiation, or the attribute could be set later.
Multiple outlines are provided as a dictionary in the shape of ``{datetime: outline}``.
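
As a rough sketch of the ``DEMCollection`` pattern described above (the ``datetime=`` argument and the outline dictionary follow the text of the documentation; the ``reference_dem`` keyword, the exact dates and the ``subtract_dems()`` helper are assumptions to verify against the API reference):

.. code-block:: python

    from datetime import datetime

    import geoutils as gu
    import xdem

    # DEMs carry their (placeholder) acquisition dates; outlines are keyed by datetime.
    dem_2009 = xdem.DEM(xdem.examples.get_path("longyearbyen_ref_dem"), datetime=datetime(2009, 8, 1))
    dem_1990 = xdem.DEM(xdem.examples.get_path("longyearbyen_tba_dem"), datetime=datetime(1990, 8, 1))
    outlines_1990 = gu.Vector(xdem.examples.get_path("longyearbyen_glacier_outlines"))

    dems = xdem.DEMCollection(
        [dem_1990, dem_2009],
        outlines={datetime(1990, 8, 1): outlines_1990},
        reference_dem=dem_2009,
    )

    # Hypothetical helper name: compute dDEMs between successive DEMs in the collection.
    ddems = dems.subtract_dems()
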
24 changes: 12 additions & 12 deletions docs/source/coregistration.rst
@@ -36,13 +36,13 @@ Examples are given using data close to Longyearbyen on Svalbard. These can be lo


.. literalinclude:: code/coregistration.py
:lines: 5-27
:lines: 5-21

The Coreg object
----------------
:class:`xdem.coreg.Coreg`

Each of the coregistration approaches in ``xdem`` inherit their interface from the ``Coreg`` class.
Each of the coregistration approaches in xDEM inherits its interface from the ``Coreg`` class.
It is written in a style that should resemble that of ``scikit-learn`` (see their `LinearRegression <https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LinearRegression.html#sklearn-linear-model-linearregression>`_ class for example).
Each coregistration approach has the methods:

@@ -53,6 +53,8 @@ Each coregistration approach has the methods:

First, ``.fit()`` is called to estimate the transform, and then this transform can be used or exported using the subsequent methods.

**Figure illustrating the different coregistration methods implemented**

.. inheritance-diagram:: xdem.coreg
:top-classes: xdem.coreg.Coreg

@@ -90,7 +92,7 @@ Example
^^^^^^^

.. literalinclude:: code/coregistration.py
:lines: 31-36
:lines: 27-32


.. minigallery:: xdem.coreg.NuthKaab
@@ -100,13 +102,13 @@
---------
:class:`xdem.coreg.Deramp`

- **Performs:** Bias, linear or nonlinear height corrections.
- **Performs:** Bias, linear or nonlinear vertical corrections.
- **Supports weights** (soon)
- **Recommended for:** Data with no horizontal offset and low to moderate rotational differences.

Deramping works by estimating and correcting for an N-degree polynomial over the entire dDEM between a reference and the DEM to be aligned.
This may be useful for correcting small rotations in the dataset, or nonlinear errors that for example often occur in structure-from-motion derived optical DEMs (e.g. Rosnell and Honkavaara `2012 <https://doi.org/10.3390/s120100453>`_; Javernick et al. `2014 <https://doi.org/10.1016/j.geomorph.2014.01.006>`_; Girod et al. `2017 <https://doi.org/10.5194/tc-11827-2017>`_).
Applying a "0 degree deramping" is equivalent to a simple bias correction, and is recommended for e.g. vertical datum corrections.
Applying a "0 degree deramping" is equivalent to a simple bias correction.

Limitations
^^^^^^^^^^^
@@ -119,7 +121,7 @@ Example
^^^^^^^

.. literalinclude:: code/coregistration.py
:lines: 42-48
:lines: 38-44


Bias correction
@@ -141,7 +143,7 @@ Only performs vertical corrections, so it should be combined with another approa
Example
^^^^^^^
.. literalinclude:: code/coregistration.py
:lines: 54-64
:lines: 50-60

ICP
---
@@ -171,7 +173,7 @@ Due to the repeated nearest neighbour calculations, ICP is often the slowest cor
Example
^^^^^^^
.. literalinclude:: code/coregistration.py
:lines: 70-76
:lines: 66-72

.. minigallery:: xdem.coreg.ICP
:add-heading:
@@ -184,7 +186,7 @@ Often, more than one coregistration approach is necessary to obtain the best res
For example, ICP works poorly with large initial biases, so a ``CoregPipeline`` can be constructed to perform both sequentially:

.. literalinclude:: code/coregistration.py
:lines: 82-87
:lines: 78-83

The ``CoregPipeline`` object exposes the same interface as the ``Coreg`` object.
The results of a pipeline can be used in other programs by exporting the combined transformation matrix:
@@ -221,6 +223,4 @@ For large biases, rotations and high amounts of noise:

.. code-block:: python
coreg.VerticalShift() + coreg.ICP() + coreg.NuthKaab()
coreg.BiasCorr() + coreg.ICP() + coreg.NuthKaab()
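
The pipeline example referenced above (``code/coregistration.py``, lines 78-83) is not shown in this diff. As a sketch of the pattern, reusing ``ref_dem``, ``tba_dem`` and ``inlier_mask`` from the coregistration example earlier in this commit, chaining steps with ``+`` could look like:

.. code-block:: python

    from xdem import coreg

    # Chaining Coreg objects with ``+`` builds a CoregPipeline, as recommended above
    # for data with large biases, rotations and noise.
    pipeline = coreg.BiasCorr() + coreg.ICP() + coreg.NuthKaab()

    # A CoregPipeline exposes the same interface as a single Coreg object.
    pipeline.fit(ref_dem, tba_dem, inlier_mask=inlier_mask)
    aligned_dem = pipeline.apply(tba_dem)
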
2 changes: 1 addition & 1 deletion docs/source/filters.rst
@@ -3,4 +3,4 @@
Filtering
=========

In construction
TODO: In construction
60 changes: 60 additions & 0 deletions docs/source/how_to_install.rst
@@ -0,0 +1,60 @@
.. _how_to_install:

How to install
==============

Installing with conda (recommended)
-----------------------------------

.. code-block:: bash

    conda install -c conda-forge --strict-channel-priority xdem

**Notes**

- The ``--strict-channel-priority`` flag seems essential for Windows installs to function correctly, and is recommended for UNIX-based systems as well.

- Solving dependencies can take a long time with ``conda``. To speed this up, consider installing ``mamba``:

.. code-block:: bash

    conda install mamba -n base -c conda-forge

Once installed, the same commands can be run by simply replacing ``conda`` with ``mamba``. More details are available through the `mamba project <https://github.com/mamba-org/mamba>`_.

- If running into the ``sklearn`` error ``ImportError: dlopen: cannot load any more object with static TLS``, your system
needs to update its ``glibc`` (see details `here <https://github.com/scikit-learn/scikit-learn/issues/14485#issuecomment-822678559>`_).
If you have no administrator rights on the system, you might be able to circumvent this issue by installing a working
environment with specific downgraded versions of ``scikit-learn`` and ``numpy``:

.. code-block:: bash

    conda create -n xdem-env -c conda-forge xdem scikit-learn==0.20.3 numpy==1.19.*

On very old systems, if the above install results in segmentation faults, try setting more specifically
``numpy==1.19.2=py37h54aff64_0`` (worked with Debian 8.11, GLIBC 2.19).

Installing with pip
-------------------

.. code-block:: bash

    pip install xdem

**NOTE**: Setting up GDAL and PROJ may need some extra steps, depending on your operating system and configuration.


Installing for contributors
---------------------------
Recommended: Use conda for dependency solving.

.. code-block:: shell

    git clone https://github.com/GlacioHack/xdem.git
    cd ./xdem
    conda env create -f dev-environment.yml
    conda activate xdem
    pip install -e .

After installing, we recommend checking that everything works by running the tests:

``pytest -rA``
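
As an extra sanity check after any of the installs above (a suggestion, not part of the official instructions), importing the package and opening one of the bundled example DEMs should succeed:

.. code-block:: python

    import xdem

    # Download (on first use) and open one of the example datasets shipped with xDEM.
    dem = xdem.DEM(xdem.examples.get_path("longyearbyen_ref_dem"))
    print(dem)
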
