🔀 Merge branch 'geo_to_spatiotemporal' (#40)

Closes #40 Reusable SpatioTemporal functions.

weiji14 committed May 30, 2020
2 parents e6ada22 + d7ee65a commit 69174ad
Showing 20 changed files with 402 additions and 253 deletions.
4 changes: 2 additions & 2 deletions .github/workflows/python-app.yml
@@ -38,7 +38,7 @@ jobs:
with:
path: |
/usr/share/miniconda3/envs/deepicedrain
key: cache-venv-${{ github.ref }}-${{ hashFiles('**/environment.yml') }}-${{ hashFiles('**/poetry.lock') }}
key: cache-venv-${{ github.ref }}-${{ hashFiles('**/environment.yml') }}-${{ hashFiles('**/poetry.lock') }}-${{ hashFiles('**/deepicedrain/*.py') }}
restore-keys: |
cache-venv-refs/heads/master-
@@ -66,4 +66,4 @@ jobs:

- name: Test with pytest
shell: bash -l {0}
run: poetry run pytest --verbose tests/
run: poetry run pytest --verbose deepicedrain/
3 changes: 3 additions & 0 deletions .gitignore
@@ -26,3 +26,6 @@ MANIFEST

# Jupyter Notebook
.ipynb_checkpoints

# Data files
**/*.h5
35 changes: 35 additions & 0 deletions README.md
@@ -11,6 +11,8 @@ in Antarctica using remote sensing and machine learning.

![ATL11 Cycle 6 minus Cycle 5 height change over Antarctica](https://user-images.githubusercontent.com/23487320/83100017-ffb0ba00-a102-11ea-9603-ac469f09e58b.png)

![DeepIceDrain Pipeline](https://yuml.me/diagram/scruffy;dir:LR/class/[Land-Ice-Elevation|atl06_play.ipynb]->[Convert|atl06_to_atl11.ipynb],[Convert]->[Ice-Sheet-H(t)-Series|atl11_play.ipynb])

# Getting started

## Quickstart
@@ -66,3 +68,36 @@ Finally, double-check that the libraries have been installed.
python -m ipykernel install --user --name deepicedrain # to install conda env properly
jupyter kernelspec list --json # see if kernel is installed
jupyter lab &

## Usage

Once you've properly installed the `deepicedrain` package,
you can use it to do some quick calculations on ICESat-2 datasets.
The example below shows how to calculate ice surface elevation change
on a sample ATL11 dataset between ICESat-2's Cycle 3 and Cycle 4.

import deepicedrain
import xarray as xr

# Loads a sample ATL11 file from the intake catalog into xarray
atl11_dataset: xr.Dataset = deepicedrain.catalog.test_data.atl11_test_case.read()

# Calculate elevation change in metres from ICESat-2 Cycle 3 to Cycle 4
delta_height: xr.DataArray = deepicedrain.calculate_delta(
dataset=atl11_dataset, oldcyclenum=3, newcyclenum=4, variable="h_corr"
)

# Quick plot of delta_height along the ICESat-2 track
delta_height.plot()

![ATL11 delta_height along ref_pt track](https://user-images.githubusercontent.com/23487320/83319030-bf7e4280-a28e-11ea-9bed-331e35dbc266.png)

## Related Projects

This work would not be possible without inspiration
from the following cool open source projects!
Go check them out if you have time.

- [ATL11](https://github.com/suzanne64/ATL11)
- [ICESAT-2 HackWeek](https://github.com/ICESAT-2HackWeek)
- [icepyx](https://github.com/icesat2py/icepyx)
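
The `calculate_delta` function used in the README example above is one of the reusable functions this commit introduces. Its body is not part of this diff; a minimal sketch of how it might work, assuming the ATL11 height variables are indexed by a `cycle_number` dimension (that dimension name is an assumption):

```python
import xarray as xr


def calculate_delta(
    dataset: xr.Dataset, oldcyclenum: int, newcyclenum: int, variable: str = "h_corr"
) -> xr.DataArray:
    """Sketch: difference one variable between two ICESat-2 cycles."""
    # Select the same variable at two cycles, assuming a 'cycle_number' dimension
    oldcycle: xr.DataArray = dataset[variable].sel(cycle_number=oldcyclenum)
    newcycle: xr.DataArray = dataset[variable].sel(cycle_number=newcyclenum)
    # e.g. height change in metres from Cycle 3 to Cycle 4
    return newcycle - oldcycle
```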
41 changes: 12 additions & 29 deletions atl06_play.ipynb
@@ -48,7 +48,7 @@
"import tqdm\n",
"import xarray as xr\n",
"\n",
"# %matplotlib inline"
"import deepicedrain"
]
},
{
@@ -109,23 +109,19 @@
"\n",
"Use our [intake catalog](https://intake.readthedocs.io/en/latest/catalog.html) to get some sample ATL06 data\n",
"(while making sure we have our Earthdata credentials set up properly),\n",
"and view it using [xarray](https://xarray.pydata.org) and [hvplot](https://hvplot.pyviz.org)."
"and view it using [xarray](https://xarray.pydata.org) and [hvplot](https://hvplot.pyviz.org).\n",
"\n",
"open the local intake data catalog file containing ICESat-2 stuff\n",
"catalog = intake.open_catalog(\"deepicedrain/atlas_catalog.yaml\")\n",
"or if the deepicedrain python package is installed, you can use either of the below:\n",
"catalog = deepicedrain.catalog\n",
"catalog = intake.cat.atlas_cat"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {},
"outputs": [],
"source": [
"# open the local catalog file containing ICESat-2 stuff\n",
"catalog = intake.open_catalog(uri=\"catalog.yaml\")"
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {},
"outputs": [
{
"data": {
Expand Down Expand Up @@ -1035,7 +1031,7 @@
" data_rate: Data within this group are sparse. Data values are provide..."
]
},
"execution_count": 4,
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
@@ -1051,7 +1047,7 @@
" )\n",
" raise\n",
"\n",
"# depends on .netrc file in home folder\n",
"# data download will depend on having a .netrc file in home folder\n",
"dataset = catalog.icesat2atl06.to_dask().unify_chunks()\n",
"dataset"
]
@@ -1752,21 +1748,8 @@
"metadata": {},
"outputs": [],
"source": [
"transformer = pyproj.Transformer.from_crs(\n",
" crs_from=pyproj.CRS.from_epsg(4326),\n",
" crs_to=pyproj.CRS.from_epsg(3031),\n",
" always_xy=True,\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 21,
"metadata": {},
"outputs": [],
"source": [
"dfs[\"x\"], dfs[\"y\"] = transformer.transform(\n",
" xx=dfs.longitude.values, yy=dfs.latitude.values\n",
"dfs[\"x\"], dfs[\"y\"] = deepicedrain.lonlat_to_xy(\n",
" longitude=dfs.longitude, latitude=dfs.latitude\n",
")"
]
},
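The notebook now reads its data catalog from `deepicedrain/atlas_catalog.yaml`, or via the new `deepicedrain.catalog` attribute. How that attribute is exposed is not shown in this diff; one hypothetical sketch (the path handling here is an assumption):

```python
# deepicedrain/__init__.py (hypothetical sketch, not from this diff)
import os

import intake

# Open the intake catalog YAML that ships inside the package,
# so that `deepicedrain.catalog` works after installation
_catalog_file = os.path.join(os.path.dirname(__file__), "atlas_catalog.yaml")
catalog = intake.open_catalog(_catalog_file)
```

Exposing it as `intake.cat.atlas_cat` would additionally require registering the catalog through intake's plugin mechanism.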
23 changes: 9 additions & 14 deletions atl06_play.py
@@ -52,7 +52,7 @@
import tqdm
import xarray as xr

# %matplotlib inline
import deepicedrain

# %%
# Configure intake and set number of compute cores for data download
@@ -73,9 +73,11 @@
# (while making sure we have our Earthdata credentials set up properly),
# and view it using [xarray](https://xarray.pydata.org) and [hvplot](https://hvplot.pyviz.org).

# %%
# open the local catalog file containing ICESat-2 stuff
catalog = intake.open_catalog(uri="catalog.yaml")
# open the local intake data catalog file containing ICESat-2 stuff
catalog = intake.open_catalog("deepicedrain/atlas_catalog.yaml")
# or if the deepicedrain python package is installed, you can use either of the below:
# catalog = deepicedrain.catalog
# catalog = intake.cat.atlas_cat

# %%
try:
@@ -88,7 +90,7 @@
)
raise

# depends on .netrc file in home folder
# data download will depend on having a .netrc file in home folder
dataset = catalog.icesat2atl06.to_dask().unify_chunks()
dataset

@@ -345,15 +347,8 @@ def six_laser_beams(filepaths: list) -> dask.dataframe.DataFrame:
# ### Transform from EPSG:4326 (lat/lon) to EPSG:3031 (Antarctic Polar Stereographic)

# %%
transformer = pyproj.Transformer.from_crs(
crs_from=pyproj.CRS.from_epsg(4326),
crs_to=pyproj.CRS.from_epsg(3031),
always_xy=True,
)

# %%
dfs["x"], dfs["y"] = transformer.transform(
xx=dfs.longitude.values, yy=dfs.latitude.values
dfs["x"], dfs["y"] = deepicedrain.lonlat_to_xy(
longitude=dfs.longitude, latitude=dfs.latitude
)

# %%
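Both the notebook and this script swap their inline `pyproj` transformer for `deepicedrain.lonlat_to_xy`. Its body is not shown in this diff; judging from the code it replaces, a minimal sketch could be (the `epsg` keyword is an assumed parameter, and the real function may also wrap results back into xarray or dask containers):

```python
import pyproj


def lonlat_to_xy(longitude, latitude, epsg: int = 3031):
    """Sketch: reproject EPSG:4326 longitude/latitude to x/y coordinates."""
    # Calling a pyproj.Proj instance transforms lon/lat into the target
    # projection, Antarctic Polar Stereographic (EPSG:3031) by default
    return pyproj.Proj(projparams=epsg)(longitude, latitude)
```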
47 changes: 12 additions & 35 deletions atl11_play.ipynb
@@ -20,7 +20,6 @@
"import glob\n",
"\n",
"import deepicedrain\n",
"import pointCollection.is2_calendar\n",
"\n",
"import dask\n",
"import dask.array\n",
@@ -160,24 +159,11 @@
},
"outputs": [],
"source": [
"lonlat_to_xy = lambda longitude, latitude: pyproj.Proj(projparams=3031)(\n",
" longitude, latitude\n",
"ds[\"x\"], ds[\"y\"] = deepicedrain.lonlat_to_xy(\n",
" longitude=ds.longitude, latitude=ds.latitude\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 6,
"metadata": {
"lines_to_next_cell": 2
},
"outputs": [],
"source": [
"x, y = lonlat_to_xy(ds.longitude.values, ds.latitude.values)\n",
"ds[\"x\"] = xr.DataArray(data=x, coords=ds.longitude.coords)\n",
"ds[\"y\"] = xr.DataArray(data=y, coords=ds.latitude.coords)"
]
},
{
"cell_type": "code",
"execution_count": 7,
@@ -214,18 +200,7 @@
"metadata": {},
"outputs": [],
"source": [
"ICESAT2_EPOCH = np.datetime64(pointCollection.is2_calendar.t_0())\n",
"# ICESAT2_EPOCH = np.datetime64(datetime.datetime(2018, 1, 1, 0, 0, 0))"
]
},
{
"cell_type": "code",
"execution_count": 9,
"metadata": {},
"outputs": [],
"source": [
"utc_time = dask.array.asarray(ICESAT2_EPOCH) + ds.delta_time.data\n",
"ds[\"utc_time\"] = xr.DataArray(data=utc_time, coords=ds.delta_time.coords)"
"ds[\"utc_time\"] = deepicedrain.deltatime_to_utctime(dataarray=ds.delta_time)"
]
},
{
@@ -277,19 +252,21 @@
"source": [
"# Dictionary of Antarctic bounding box locations with EPSG:3031 coordinates\n",
"regions = {\n",
" \"kamb\": deepicedrain.BBox(\n",
" \"kamb\": deepicedrain.Region(\n",
" name=\"Kamb Ice Stream\",\n",
" xmin=-739741.7702261859,\n",
" xmax=-411054.19240523444,\n",
" ymin=-699564.516934089,\n",
" ymax=-365489.6822096751,\n",
" ),\n",
" \"antarctica\": deepicedrain.BBox(\"Antarctica\", -2700000, 2800000, -2200000, 2300000),\n",
" \"siple_coast\": deepicedrain.BBox(\n",
" \"antarctica\": deepicedrain.Region(\n",
" \"Antarctica\", -2700000, 2800000, -2200000, 2300000\n",
" ),\n",
" \"siple_coast\": deepicedrain.Region(\n",
" \"Siple Coast\", -1000000, 250000, -1000000, -100000\n",
" ),\n",
" \"kamb2\": deepicedrain.BBox(\"Kamb Ice Stream\", -500000, -400000, -600000, -500000),\n",
" \"whillans\": deepicedrain.BBox(\n",
" \"kamb2\": deepicedrain.Region(\"Kamb Ice Stream\", -500000, -400000, -600000, -500000),\n",
" \"whillans\": deepicedrain.Region(\n",
" \"Whillans Ice Stream\", -350000, -100000, -700000, -450000\n",
" ),\n",
"}"
@@ -303,7 +280,7 @@
"source": [
"# Do the actual computation to find data points within region of interest\n",
"region = regions[\"kamb\"] # Select Kamb Ice Stream region\n",
"ds_subset = ds.where(cond=region.subset(ds=ds), drop=True)\n",
"ds_subset = region.subset(ds=ds)\n",
"ds_subset = ds_subset.unify_chunks()\n",
"ds_subset = ds_subset.compute()"
]
@@ -671,7 +648,7 @@
"source": [
"# Select region here, see dictionary of regions at top\n",
"placename: str = \"antarctica\"\n",
"region: deepicedrain.BBox = regions[placename]"
"region: deepicedrain.Region = regions[placename]"
]
},
{
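`deepicedrain.deltatime_to_utctime` replaces the removed `ICESAT2_EPOCH` arithmetic in this notebook. Based on that removed code, a minimal sketch might be (the `start_epoch` default mirrors the commented-out 2018-01-01 epoch, so treat it as an assumption):

```python
import numpy as np
import xarray as xr


def deltatime_to_utctime(
    dataarray: xr.DataArray,
    start_epoch: np.datetime64 = np.datetime64("2018-01-01T00:00:00"),
) -> xr.DataArray:
    """Sketch: convert ICESat-2 delta_time timedeltas to absolute UTC time."""
    # Adding a datetime64 epoch to timedelta64 values yields datetime64 values
    return start_epoch + dataarray
```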
35 changes: 12 additions & 23 deletions atl11_play.py
@@ -24,7 +24,6 @@
import glob

import deepicedrain
import pointCollection.is2_calendar

import dask
import dask.array
@@ -87,17 +86,10 @@
# to the Antarctic Polar Stereographic (EPSG:3031) projection.

# %%
lonlat_to_xy = lambda longitude, latitude: pyproj.Proj(projparams=3031)(
longitude, latitude
ds["x"], ds["y"] = deepicedrain.lonlat_to_xy(
longitude=ds.longitude, latitude=ds.latitude
)


# %%
x, y = lonlat_to_xy(ds.longitude.values, ds.latitude.values)
ds["x"] = xr.DataArray(data=x, coords=ds.longitude.coords)
ds["y"] = xr.DataArray(data=y, coords=ds.latitude.coords)


# %%
# Also set x, y as coordinates in xarray.Dataset
ds = ds.set_coords(names=["x", "y"])
@@ -118,12 +110,7 @@
# in the future.

# %%
ICESAT2_EPOCH = np.datetime64(pointCollection.is2_calendar.t_0())
# ICESAT2_EPOCH = np.datetime64(datetime.datetime(2018, 1, 1, 0, 0, 0))

# %%
utc_time = dask.array.asarray(ICESAT2_EPOCH) + ds.delta_time.data
ds["utc_time"] = xr.DataArray(data=utc_time, coords=ds.delta_time.coords)
ds["utc_time"] = deepicedrain.deltatime_to_utctime(dataarray=ds.delta_time)

# %% [markdown]
# ## Mask out low quality height data
@@ -148,27 +135,29 @@
# %%
# Dictionary of Antarctic bounding box locations with EPSG:3031 coordinates
regions = {
"kamb": deepicedrain.BBox(
"kamb": deepicedrain.Region(
name="Kamb Ice Stream",
xmin=-739741.7702261859,
xmax=-411054.19240523444,
ymin=-699564.516934089,
ymax=-365489.6822096751,
),
"antarctica": deepicedrain.BBox("Antarctica", -2700000, 2800000, -2200000, 2300000),
"siple_coast": deepicedrain.BBox(
"antarctica": deepicedrain.Region(
"Antarctica", -2700000, 2800000, -2200000, 2300000
),
"siple_coast": deepicedrain.Region(
"Siple Coast", -1000000, 250000, -1000000, -100000
),
"kamb2": deepicedrain.BBox("Kamb Ice Stream", -500000, -400000, -600000, -500000),
"whillans": deepicedrain.BBox(
"kamb2": deepicedrain.Region("Kamb Ice Stream", -500000, -400000, -600000, -500000),
"whillans": deepicedrain.Region(
"Whillans Ice Stream", -350000, -100000, -700000, -450000
),
}

# %%
# Do the actual computation to find data points within region of interest
region = regions["kamb"] # Select Kamb Ice Stream region
ds_subset = ds.where(cond=region.subset(ds=ds), drop=True)
ds_subset = region.subset(ds=ds)
ds_subset = ds_subset.unify_chunks()
ds_subset = ds_subset.compute()

Expand Down Expand Up @@ -317,7 +306,7 @@
# %%
# Select region here, see dictionary of regions at top
placename: str = "antarctica"
region: deepicedrain.BBox = regions[placename]
region: deepicedrain.Region = regions[placename]

# %%
# Find subglacial lakes (Smith et al., 2009) within region of interest
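The `BBox` class has been renamed to `Region`, and its `subset` method now returns the subsetted dataset directly (compare the removed `ds.where(cond=region.subset(ds=ds), drop=True)` with the new `region.subset(ds=ds)`). A sketch consistent with that usage, assuming a plain dataclass:

```python
from dataclasses import dataclass

import xarray as xr


@dataclass(frozen=True)
class Region:
    """Sketch: a named rectangular bounding box in EPSG:3031 coordinates."""

    name: str
    xmin: float
    xmax: float
    ymin: float
    ymax: float

    def subset(self, ds: xr.Dataset) -> xr.Dataset:
        # Keep only points whose x/y coordinates fall inside the bounding box
        cond = (
            (ds.x >= self.xmin)
            & (ds.x <= self.xmax)
            & (ds.y >= self.ymin)
            & (ds.y <= self.ymax)
        )
        return ds.where(cond=cond, drop=True)
```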