ported to python3 #5

Open · wants to merge 4 commits into master
159 changes: 159 additions & 0 deletions .gitignore
@@ -3,3 +3,162 @@
cpxlooks
look_double
rilooks

*.pyc
__pycache__
settings.conf

# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don’t work, or not
# install all needed dependencies.
#Pipfile.lock

# celery beat schedule file
celerybeat-schedule

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/

# General
.DS_Store
.AppleDouble
.LSOverride

# Icon must end with two \r
Icon


# Thumbnails
._*

# Files that might appear in the root of a volume
.DocumentRevisions-V100
.fseventsd
.Spotlight-V100
.TemporaryItems
.Trashes
.VolumeIcon.icns
.com.apple.timemachine.donotpresent

# Directories potentially created on remote AFP share
.AppleDB
.AppleDesktop
Network Trash Folder
Temporary Items
.apdisk

.idea/

62 changes: 57 additions & 5 deletions README.md
@@ -1,14 +1,66 @@
# slcp2pm - SLC Pair to Proxy Map
### slcp2pm (LAR)

## How to create log amplitude ratio images from ARIA SLC_PAIR products
#### SLC Pair to Log Amplitude Ratio

1. go to src
This repository contains jobs to convert S1-SLCP products to Log Amplitude Ratios (LARs) for creating Flood Proxy Maps v1 (FPMv1).

A LAR is created from the pairing scheme below, using 2 SLCs --> 1 SLCP --> 1 FPM:
![image](https://user-images.githubusercontent.com/6346909/77985672-9f6cea00-7304-11ea-807d-861833d4b1a3.png)

_Note: Since SLCP:LAR is 1:1, pairing of SLCPs is **not required** to make LARs. Hence, the user simply needs to facet on the target SLCPs and run the LAR job on them._


### Job 1: S1 Log Amplitude Ratio
- Type: **Iterative**
- Facet: **SLCPs to create LARs from**
- User inputs:

| Fields | Description | Type |Example |
| ------------- |-------------| :---------:| :---------:|
| `lar_range_looks` | Range looks to create LARs. (Overrides looks in SLCP's metadata)| int | 7 |
| `lar_azimuth_looks` | Azimuth looks to create LARs. (Overrides looks in SLCP's metadata) | int | 2 |

- Important outputs:

| Product | Description | Example |
| ------------- |-------------| :-----|
| Log Amplitude Ratio Products | Geocoded, multilooked LARs | logr_[burst]_[range_lks]_[az_lks].float.geo|
| Amplitude Products | Geocoded, multilooked amplitudes stored in 2 bands. Band 1 - Master scene's amp. Band 2 - Slave scene's amp. | amp_[subswath]_[range_lks]_[az_lks].amp.geo|
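
The geocoded products can be inspected with standard raster tooling. Below is a minimal sketch using the GDAL Python bindings, assuming the product is GDAL-readable (e.g., via its accompanying metadata); the filename is illustrative and simply follows the naming pattern above:

```python
# Illustrative inspection of the two-band amplitude product; the filename is
# hypothetical, following the amp_[subswath]_[range_lks]_[az_lks].amp.geo pattern.
from osgeo import gdal

ds = gdal.Open("amp_IW1_7_2.amp.geo")
master_amp = ds.GetRasterBand(1).ReadAsArray()  # band 1: master scene amplitude
slave_amp = ds.GetRasterBand(2).ReadAsArray()   # band 2: slave scene amplitude
print(master_amp.shape, slave_amp.shape)
```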


##### Notes on S1 Log Amplitude Ratio
The LARs in this PGE are computed as follows (from `log_ratio.py`):

* (**Latest**) From dataset `v2.0` onwards:

![formula](https://render.githubusercontent.com/render/math?math=LAR=\log_{10}{\frac{A_{post-event}}{A_{pre-event}}})
![formula](https://render.githubusercontent.com/render/math?math==\log_{10}{\frac{A_{slave}}{A_{master}}})

where _A_ = amplitude of the SLC of the given date in the co-registered SLCP

=> Negative values / darker pixels correspond to decreased amplitude in the post-event scene and possible open-water flooding.

* Before dataset `v2.0` (`v1.x` etc):

![formula](https://render.githubusercontent.com/render/math?math=LAR=\log_{10}{\frac{A_{pre-event}}{A_{post-event}}})
![formula](https://render.githubusercontent.com/render/math?math==\log_{10}{\frac{A_{master}}{A_{slave}}})

where _A_ = amplitude of the SLC of the given date in the co-registered SLCP

=> Positive values / brighter pixels correspond to decreased amplitude in the post-event scene and possible open-water flooding.
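
The following is a minimal sketch of the two sign conventions above, assuming the pre- and post-event amplitudes are available as NumPy arrays of equal shape; it is illustrative only, not the actual `log_ratio.py` implementation:

```python
import numpy as np

def log_amplitude_ratio(amp_pre, amp_post, v2_convention=True):
    """Compute the LAR under either dataset sign convention (sketch only)."""
    eps = 1e-10  # illustrative guard against taking log of zero
    if v2_convention:
        # v2.0 onwards: LAR = log10(A_post / A_pre); negative values (darker
        # pixels) flag decreased post-event amplitude and possible flooding
        return np.log10((amp_post + eps) / (amp_pre + eps))
    # v1.x: LAR = log10(A_pre / A_post); positive values (brighter pixels)
    # flag decreased post-event amplitude and possible flooding
    return np.log10((amp_pre + eps) / (amp_post + eps))
```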


### How to use the raw code standalone

**How to create log amplitude ratio images from ARIA SLC_PAIR products:**

1. Go to `src` and run the compile script:
```
./compile.sh
```
2. set environment variables. see set_env_variable.sh for details
2. Set environment variables. See set_env_variable.sh for details.
3. Modify slcp2lar_S1.sh as needed and run it.

## Authors
### Authors

* **Sang-Ho Yun** - *initial work*
8 changes: 1 addition & 7 deletions docker/Dockerfile
@@ -1,13 +1,7 @@
FROM hysds/pge-isce_giant:latest
FROM hysds/isce2:latest

LABEL description="PGE container for LAR product generation"

# create work directory
RUN set -ex \
&& mkdir -p /data/work \
&& chmod -R 755 /data \
&& chown -R ops:ops /data

USER ops

# copy code, ensure proper permissions, and move dependencies to final locations
4 changes: 2 additions & 2 deletions docker/hysds-io.json.s1-lar
@@ -5,12 +5,12 @@
{
"name": "localize_url",
"from": "dataset_jpath:_source",
"lambda": "lambda ds: filter(lambda x: x.startswith('s3://'), ds['urls'])[0]"
"lambda": "lambda ds: list(filter(lambda x: x.startswith('s3://'), ds['urls']))[0]"
},
{
"name": "path",
"from": "dataset_jpath:_source",
"lambda": "lambda ds: __import__('os').path.basename(filter(lambda x: x.startswith('s3://'), ds['urls'])[0])"
"lambda": "lambda ds: __import__('os').path.basename(list(filter(lambda x: x.startswith('s3://'), ds['urls']))[0])"
},
{
"name": "lar_range_looks",
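For context on the lambda changes above (not part of the diff): in Python 3, `filter()` returns a lazy iterator rather than a list, so subscripting its result raises a `TypeError`; wrapping it in `list()` restores the Python 2 behaviour. A minimal illustration:

```python
urls = ["http://example.com/a", "s3://bucket/prefix/product"]  # illustrative URLs

# Python 2: filter() returned a list, so [0] worked directly.
# Python 3: filter() returns an iterator, so [0] raises TypeError.
s3_only = filter(lambda x: x.startswith("s3://"), urls)
# s3_only[0]  # TypeError: 'filter' object is not subscriptable

# Wrapping in list(), as the updated lambdas do, restores the indexing:
first_s3 = list(filter(lambda x: x.startswith("s3://"), urls))[0]
print(first_s3)  # s3://bucket/prefix/product
```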
2 changes: 2 additions & 0 deletions docker/job-spec.json.s1-lar
@@ -16,6 +16,8 @@
"urgent-response-job_worker-large"
],
"disk_usage":"100GB",
"soft_time_limit": 3300,
"time_limit": 3600,
"params" : [
{
"name": "localize_url",
5 changes: 3 additions & 2 deletions script/create_lar.py
@@ -1,4 +1,5 @@
#!/usr/bin/env python3
from builtins import str
import os, sys, re, json, shutil, traceback, logging
from subprocess import check_call, check_output
from datetime import datetime
@@ -89,7 +90,7 @@ def main(slcp_dir):
# get dataset version, set dataset ID and met/dataset JSON files
match = SLCP_RE.search(slcp_id)
if not match:
raise(RuntimeError("Failed to recognize SLCP id: {}".format(slcp_id)))
raise RuntimeError("Failed to recognize SLCP id: {}".format(slcp_id))
id_base = "S1-LAR_{}".format(match.group(1))
swath = match.group(2)
slcp_version = match.group(3)
@@ -145,7 +146,7 @@ def main(slcp_dir):

if __name__ == '__main__':
try: status = main(sys.argv[1])
except Exception as e:
except (Exception, SystemExit) as e:
with open('_alt_error.txt', 'w') as f:
f.write("%s\n" % str(e))
with open('_alt_traceback.txt', 'w') as f:
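For context on the broadened exception handler above (not part of the diff): `sys.exit()` raises `SystemExit`, which does not inherit from `Exception`, so catching `(Exception, SystemExit)` ensures the error files are still written when the script exits explicitly. A minimal sketch of the same pattern, with a stand-in `main()`:

```python
#!/usr/bin/env python3
# Minimal illustration of the error-reporting pattern in create_lar.py;
# main() here is a stand-in that simply exits with a message.
import sys
import traceback

def main(arg):
    # an explicit exit raises SystemExit, which `except Exception` alone would miss
    sys.exit("fatal: bad input: {}".format(arg))

if __name__ == "__main__":
    try:
        status = main("dummy")
    except (Exception, SystemExit) as e:
        with open("_alt_error.txt", "w") as f:
            f.write("%s\n" % str(e))
        with open("_alt_traceback.txt", "w") as f:
            f.write("%s\n" % traceback.format_exc())
        sys.exit(1)  # propagate a nonzero exit status
```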
18 changes: 1 addition & 17 deletions script/create_lar.sh
@@ -1,29 +1,13 @@
#!/bin/bash
BASE_PATH=$(dirname "${BASH_SOURCE}")
BASE_PATH=$(cd "${BASE_PATH}"; pwd)

# source ISCE env
export GMT_HOME=/usr/local/gmt
export PYTHONPATH=/usr/local/isce:$PYTHONPATH
export ISCE_HOME=/usr/local/isce/isce
export PATH=$ISCE_HOME/applications:$ISCE_HOME/bin:/usr/local/gdal/bin:$PATH
export LD_LIBRARY_PATH=/usr/local/gdal/lib:$LD_LIBRARY_PATH
export GDAL_DATA=/usr/local/gdal/share/gdal

export TROPMAP_HOME=$HOME/tropmap
export GIANT_HOME=/usr/local/giant/GIAnT
export PYTHONPATH=$ISCE_HOME/applications:$ISCE_HOME/components:$BASE_PATH:$ARIAMH_HOME:$TROPMAP_HOME:$GIANT_HOME:$PYTHONPATH
export PATH=$BASE_PATH:$TROPMAP_HOME:$GMT_HOME/bin:$PATH


SLCP_PROD=$1
SWATH=$2


echo "##########################################" 1>&2
echo -n "Running S1 log amp ratio generation: " 1>&2
date 1>&2
python3 $BASE_PATH/create_lar.py $SLCP_PROD > create_lar.log 2>&1
source /opt/isce2/isce_env.sh && python3 $BASE_PATH/create_lar.py $SLCP_PROD > create_lar.log 2>&1
STATUS=$?
echo -n "Finished running S1 log amp ratio generation: " 1>&2
date 1>&2
1 change: 1 addition & 0 deletions script/geo_with_ll.py
@@ -3,6 +3,7 @@
#Cunren Liang, JPL/Caltech


from builtins import range
import os
import sys
import glob