
Fix import errors, add solution for TLS issue on older systems #200

Merged
merged 14 commits on Aug 11, 2021
17 changes: 17 additions & 0 deletions README.md
@@ -19,6 +19,23 @@ conda install -c conda-forge --strict-channel-priority xdem
```
The `--strict-channel-priority` flag seems essential for Windows installs to function correctly, and is recommended for UNIX-based systems as well.

Solving dependencies can take a long time with `conda`. To speed this up, consider installing `mamba`:

```bash
conda install mamba -n base -c conda-forge
```

Once installed, the same commands can be run by simply replacing `conda` with `mamba`. More details are available from the [mamba project](https://github.com/mamba-org/mamba).
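
For example, the `xdem` install command above becomes:

```bash
mamba install -c conda-forge --strict-channel-priority xdem
```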

If you run into the `sklearn` error `ImportError: dlopen: cannot load any more object with static TLS`, your system
needs to update its `glibc` (see details [here](https://github.com/scikit-learn/scikit-learn/issues/14485#issuecomment-822678559)).
If you have no administrator rights on the system, you might be able to circumvent this issue by installing a working
environment with specific downgraded versions of `scikit-learn` and `numpy`:
```bash
conda create -n xdem-env -c conda-forge xdem scikit-learn==0.20.3 numpy==1.19.*
```
On very old systems, if the above install results in segmentation faults, try pinning `numpy` more specifically to
`numpy==1.19.2=py37h54aff64_0` (this worked on Debian 8.11 with GLIBC 2.19), as in the example below.
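
For instance, reusing the environment creation command above with that more specific pin:

```bash
conda create -n xdem-env -c conda-forge xdem scikit-learn==0.20.3 numpy==1.19.2=py37h54aff64_0
```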

### Installing with pip
**NOTE**: Setting up GDAL and PROJ may need some extra steps, depending on your operating system and configuration.
17 changes: 11 additions & 6 deletions xdem/coreg.py
@@ -11,6 +11,11 @@
from enum import Enum
from typing import Any, Callable, Optional, overload, Union, Sequence, TypeVar

try:
    import cv2
    _has_cv2 = True
except ImportError:
    _has_cv2 = False
import fiona
import geoutils as gu
from geoutils.georaster import RasterType
@@ -35,12 +40,6 @@
except ImportError:
    _has_rd = False

try:
    import cv2
    _has_cv2 = True
except ImportError:
    _has_cv2 = False

try:
    from pytransform3d.transform_manager import TransformManager
    import pytransform3d.transformations
@@ -917,9 +916,11 @@ def __init__(self, max_iterations=100, tolerance=0.05, rejection_scale=2.5, num_
    def _fit_func(self, ref_dem: np.ndarray, tba_dem: np.ndarray, transform: Optional[rio.transform.Affine],
                  weights: Optional[np.ndarray], verbose: bool = False):
        """Estimate the rigid transform from tba_dem to ref_dem."""

        if weights is not None:
            warnings.warn("ICP was given weights, but does not support it.")


        bounds, resolution = _transform_to_bounds_and_res(ref_dem.shape, transform)
        points: dict[str, np.ndarray] = {}
        # Generate the x and y coordinates for the reference_dem
@@ -1392,6 +1393,10 @@ def apply_matrix(dem: np.ndarray, transform: rio.transform.Affine, matrix: np.nd
    if np.mean(np.abs(empty_matrix - matrix)) == 0.0:
        return demc + matrix[2, 3]

    # OpenCV is required from here on
    if not _has_cv2:
        raise ValueError("Optional dependency needed. Install 'opencv'")

    nan_mask = xdem.spatial_tools.get_mask(dem)
    assert np.count_nonzero(~nan_mask) > 0, "Given DEM had all nans."
    # Create a filled version of the DEM. (skimage doesn't like nans)
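The same guarded-import pattern is applied in both files: wrap the optional import in `try`/`except`, record availability in a module-level flag, and raise an informative error only where the dependency is actually needed. A minimal standalone sketch of that pattern follows, using a hypothetical `heavy_module` as a stand-in for `cv2`; this is an illustration, not the xdem code itself.

```python
# Minimal sketch of the guarded optional-import pattern used in this PR.
# "heavy_module" is a hypothetical optional dependency standing in for cv2.
try:
    import heavy_module
    _has_heavy_module = True
except ImportError:
    _has_heavy_module = False


def function_needing_heavy_module() -> None:
    """Fail with a clear message at call time if the optional dependency is missing."""
    if not _has_heavy_module:
        raise ValueError("Optional dependency needed. Install 'heavy_module'")
    heavy_module.do_something()  # hypothetical call, only reached if the import succeeded
```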
11 changes: 10 additions & 1 deletion xdem/misc.py
@@ -5,7 +5,12 @@
import warnings
from typing import Any, Callable

import cv2
try:
    import cv2
    _has_cv2 = True
except ImportError:
    _has_cv2 = False

import numpy as np

import xdem.version
@@ -28,6 +33,10 @@ def generate_random_field(shape: tuple[int, int], corr_size: int) -> np.ndarray:

    :returns: A numpy array of semi-random values from 0 to 1
    """

    if not _has_cv2:
        raise ValueError("Optional dependency needed. Install 'opencv'")

    field = cv2.resize(
        cv2.GaussianBlur(
            np.repeat(
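
With the guard in place, a missing OpenCV surfaces as a clear error when `generate_random_field` is called, rather than breaking `xdem.misc` at import time. A hypothetical usage sketch (parameter values are illustrative only):

```python
import xdem.misc

# Generate a semi-random field on a 100 x 100 grid with a correlation size of
# 10 pixels (illustrative values). If OpenCV is not installed, this call raises
# ValueError("Optional dependency needed. Install 'opencv'")
# instead of the import of xdem.misc failing outright.
field = xdem.misc.generate_random_field((100, 100), corr_size=10)
print(field.shape, field.dtype)
```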