TODO: Madeline Lambert, Alex Dunn

This page provides a user-centric overview of the displacement time-series processing pipeline using MintPy. Developer-centric information is on the Developer Information page.

Use cases

Use Case 1: System continues forward “keep-up” production of S1-GUNWs for a volcano.

  • User adds an AOI for the system to keep up processing on, sets the type of AOI as “volcano”, and enters the enumeration strategy.
    Note to Developer: The method of inputting the AOI could be a bounding box drawn on a Google Earth map, entered lat/long coordinates, or an uploaded vector file.
    Science story: AOIs will encompass volcanic areas to be monitored for anomalies. The AOI boundaries should be defined such that they completely fill a track/frame.

  • System automatically pulls ancillary data for AOI (L1 SLCs, orbit data, calibration data, etc.).

  • System automatically processes from the L1 SLCs to L2 S1-GUNWs.

Use Case 2: System performs on-demand production of MintPy time-series for a volcano.

  • User adds a volcano polygon encompassing the area around the volcano, and inputs track number and start/end dates for processing.
    Note to Developer: This is separate from the AOI definition - the AOI will cover a larger area, and the volcano polygon will contain a subset of that area, closer to the volcano. This use case assumes that an AOI has already been defined around the volcanic polygon, and S1-GUNW production is occurring or has completed.

  • System processes MintPy time-series for area defined by polygon, and publishes to S3.

  • User logs in to system to download time-series.

Use Case 3: System updates MintPy time-series for a volcano.

  • User defines the frequency of displacement time-series processing for the defined AOI (automatically upon new data acquisition, monthly, every N months, etc.).
    Note to Developer: The enumeration strategy for the volcano will be included in the larger AOI definition.

  • System performs forward keep-up production of the time-series - either creates a new time-series product or updates the previous time-series for the defined AOI with each new acquisition.
    Note to Developer: The science team has specified a preference for output of the raw displacement time-series (i.e., without atmospheric correction), rather than filtered data, rolling means, or velocity maps.

  • User logs in to system to download time-series.
    Note to Developer: Will need to add facet to allow for user search of displacement time-series.
    Science story: Displacement time-series generation will allow scientists to quickly check on behavior and status of volcanoes of interest.

...

  • The MintPy code still needs to be updated to allow for updating a prior time-series - for now, this option would have to re-process all old data and create a new time-series product. Once the ability to update a time-series is developed, we will have to determine at what frequency to update the time-series.


Use Case 4: Machine learning applied to output of MintPy for potential anomaly detection.

  • System applies ML to detect potential anomalies in displacement time-series, and publishes detected results back to GRQ catalog.

  • User logs in to system to browse potential anomalies.
    Science story: Anomaly detection will inform scientists of possible volcanoes to monitor closely.

System Requirements

Dependencies for ARIA-tools:

Installation instructions for ARIA-tools are outlined here: https://github.com/aria-tools/ARIA-tools#installation

Packages:

Code Block
Python >= 3.5
[PROJ 4](https://github.com/OSGeo/proj) >= 6.0
[GDAL](https://www.gdal.org/) and its Python bindings >= 3.0

Python dependencies:

Code Block
[SciPy](https://scipy.org/)
[netcdf4](http://unidata.github.io/netcdf4-python/netCDF4/index.html)
[requests](https://2.python-requests.org/en/master/)
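
If dependencies are managed with conda, a minimal sketch for satisfying the packages and Python dependencies above from conda-forge might look like the following (the environment name and Python version are illustrative assumptions; the ARIA-tools installation instructions linked above remain the authoritative procedure):

Code Block
# Illustrative only - follow the official ARIA-tools installation instructions for the supported procedure
conda create -n aria-tools -c conda-forge python=3.8 gdal proj scipy netcdf4 requests
conda activate aria-tools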

Python Jupyter dependencies

Code Block
py3X-jupyter
py3X-jupyter_client
py3X-jupyter_contrib_nbextensions
py3X-jupyter_nbextensions_configurator
py3X-hide_code
py3X-RISE

Additionally, you need to clone the ARIA-tools documentation repository:

Code Block
git clone https://github.com/aria-tools/ARIA-tools-docs.git

Dependencies and/or requirements for MintPy:

Installation instructions for various operating systems (as well as notes for Docker users) can be found here: https://github.com/insarlab/MintPy/blob/master/docs/installation.md

The following requirements are listed in the requirements.txt file in the MintPy GitHub repository:

Code Block
cvxopt
dask>=1.0
dask-jobqueue>=0.3
defusedxml
h5py
lxml
matplotlib
numpy
pyproj
pykml
pyresample
scikit-image
scikit-learn
scipy
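
Assuming the MintPy repository has already been cloned (see the repository links below), one simple way to install these requirements is with pip from the repository root; this is a sketch, and some packages (e.g. cvxopt, pyresample) may install more smoothly from conda-forge, as described in the MintPy installation guide linked above:

Code Block
# Run from the root of the cloned MintPy repository (sketch; conda-forge may be preferable for some packages)
pip install -r requirements.txt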

MintPy and PyAPS repositories:

Code Block
https://github.com/insarlab/MintPy.git
https://github.com/yunjunz/pyaps3.git

The configuration file necessary to run ARIA data through MintPy is attached below (can I attach it to this page?), and can also be found in the MintPy GitHub repository: https://github.com/insarlab/MintPy/blob/master/docs/examples/input_files/SanFranSenDT42.txt

Running ARIA data through MintPy (the below section is in progress)

This section gives an overview of the necessary steps to run S1-GUNW products from a complete AOI track through the MintPy displacement time-series calculations.

Nominally, for a user outside of HySDS, the first step would be to download the data products of interest using ariaDownload.py from ARIA-tools (https://nbviewer.jupyter.org/github/aria-tools/ARIA-tools-docs/blob/master/JupyterDocs/ariaDownload/ariaDownload_tutorial.ipynb). This command allows the user to specify a data range either spatially (with a bounding box or a link to a shapefile) or temporally (with start/stop dates or a temporal baseline). However, for the purposes of automation, it would be far more efficient not to have to download S1-GUNW files at all.
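
For reference, a download call for the San Francisco example used later on this page might look roughly like the sketch below; the bounding box, track number, and date range are placeholders, and the exact flag names should be confirmed against ariaDownload.py --help for the installed ARIA-tools version:

Code Block
# Placeholder bbox/track/dates; confirm available flags with: ariaDownload.py --help
ariaDownload.py --bbox '37.25 38.1 -122.6 -121.75' --track 42 --start 20190101 --end 20191231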

Step 1:

When an S1-GUNW product is created within a specified AOI for the purpose of monitoring volcanic activity, it triggers the AOI track evaluator. Once the full track for the AOI has been completed, the AOI track evaluator creates a JSON file containing, among other metadata, a list of the GUNW products.

...

  • Is the JSON file creation hard-coded into the current AOI-track-evaluator? – yes

Step 2:

...

  • Should we create the text file from the metadata in the JSON file (contains list of urls for GUNWs), or should we use Elasticsearch?

    • Depends on how much information we want to pass into MintPy; are we re-processing past data as well? How are we handling the forward keep-up processing, and how will we update the time-series?

    • Could potentially edit AOI-track-evaluator directly, but may be more complex than other options and could potentially lead to issues with other current capabilities of the system.

  • Need to clarify which products we want to do the time-series calculation on - just the most recent pair, or annual/seasonal pairs and nominal nearest 2-neighbors? Would Elasticsearch work better in this case, to compile all relevant data, rather than just using data currently in JSON output of AOI-track-evaluator?

Step 3:

The ariaTSsetup.py code can take a text file containing the URLs of each data product as input. The text file contains one URL per line and no other information (no headers, footers, labels, etc.).
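
For illustration, the contents of such a file would look like the sketch below - nothing but one product URL per line (the URLs shown are placeholders, not real products):

Code Block
https://<data-host>/path/to/S1-GUNW-product-1.nc
https://<data-host>/path/to/S1-GUNW-product-2.nc
https://<data-host>/path/to/S1-GUNW-product-3.nc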

The bounding box of the AOI can be input using (-b 'coordinates in SNWE'). To extract metadata layers from the input data, the user needs to download a DEM (--dem Download). There is also functionality to download a mask (--mask Download) to remove any water bodies from the data.

Calling ariaTSsetup.py should look like:

Code Block
ariaTSsetup.py -f 'nameOfTextFile.txt' -b '37.25 38.1 -122.6 -121.75' --mask Download --dem Download

There is also an option to specify a working directory (-w) in which the intermediate products and final outputs are saved. If not otherwise specified, the default working directory is the current directory.
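
For example, the same call as above with an explicit working directory would look like this (the directory path is a placeholder):

Code Block
ariaTSsetup.py -f 'nameOfTextFile.txt' -b '37.25 38.1 -122.6 -121.75' --mask Download --dem Download -w /path/to/workdir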

Questions to answer:

  • Where do we want the data to be saved? Just in the working directory (PGE)? Or do we want the stack of interferograms output from ariaTSsetup to be saved in S3 for potential future use?

  • ariaTSsetup.py also takes minimum overlap (-mo) as an input, defined in units of km^2. Do we want to include this? If so, do we want a single defined value, or should this change depending on the AOI? Who should decide this value?

Step 4:

Once the data has been prepared, it can be run through the main MintPy application, smallbaselineApp.py. The input to this command is the custom configuration file (here named SanFranSenDT42.txt). The custom configuration file is attached to this page (can I do this? Any concerns attaching the config file to this page?), and further information about the format is on the MintPy website (https://mintpy.readthedocs.io/en/latest/examples/input_files/). The figure below shows the processing flow of smallbaselineApp.py.

...

(Figure from Yunjun et al., 2019)

An example call to smallbaselineApp is shown below:

Code Block
smallbaselineApp.py -t SanFranSenDT42.txt

The output product is an HDF-EOS5 format file containing the displacement time-series with geometry information.
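
As a quick sanity check, the groups and datasets in the output file can be listed with the standard HDF5 command-line tools (the filename below is a placeholder); MintPy's info.py utility can also be used for a similar overview:

Code Block
# Recursively list the HDF5 groups/datasets in the MintPy output (filename is a placeholder)
h5ls -r displacement_timeseries.he5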

Questions to answer:

  • The scientists have indicated a preference for raw displacement time-series, without atmospheric correction - is there a way to accomplish this using the various capabilities of smallbaselineApp? Currently the user can define a starting (--start STEP) and ending (--end STEP, --stop STEP) processing step, as well as just one step to perform (--dostep STEP); a syntax sketch is shown after this list. The individual processing steps are all defined at https://nbviewer.jupyter.org/github/insarlab/MintPy-tutorial/blob/master/smallbaselineApp_aria.ipynb

    • Does the functionality of smallbaselineApp allow for stopping and starting again while skipping steps, without needing to reformat data structure?

  • After initial calculation of displacement time-series, we want to update the time-series at a defined cadence. This brings up a few decisions:

    • Do we want to have the time-series update at a user input cadence (individual for each AOI), or just have the cadence pre-designated for all AOI, then design the system to update at that cadence?

    • How do we update the time-series? Re-process all old data? Or combine time-series HDF-EOS5 files after they’re created by MintPy?

      • If combining, how to accomplish this?

      • If re-processing, how/where to store or call to old data products for re-processing?
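
To make the step-control syntax referenced above concrete, an invocation restricted to a subset of steps would look like the sketch below; STEP_NAME is a placeholder for one of the step names defined in the smallbaselineApp_aria notebook linked above, and stopping before the tropospheric correction step is one possible route to a time-series without atmospheric correction:

Code Block
# Syntax sketch only - replace STEP_NAME with a valid smallbaselineApp.py step name
smallbaselineApp.py -t SanFranSenDT42.txt --start STEP_NAME --end STEP_NAME
smallbaselineApp.py -t SanFranSenDT42.txt --dostep STEP_NAME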

Processing Pipeline

  • create a new diagram of (a) the processing steps and (b) the data

  • what are the key steps needed from S1-GUNW -> aria-tools -> MintPy -> L3 displacement time series

  • for each step, find out what the inputs and outputs are, and what condition(s) trigger that step.

  • what are the key datasets

    • type

    • dataset naming convention

  • identify source code of each step.

Draft diagram of Processing Pipeline - to be refined as more information is learned:

...

This diagram branches off from the larger system diagram on Standard Product S1-GUNW Processing Pipeline.

Implementation Notes (Alex Dunn)

User Guide

...

Processing Pipeline

This diagram branches off from the larger system diagram on Standard Product S1-GUNW Processing Pipeline.

...

At a high level, the main steps of this pipeline are:

  1. The production of an S1-GUNW from within a defined volcano polygon triggers the MintPy PGE to run. Note: a MintPy volcano polygon will be separate from the larger defined AOI. The enumeration strategy for the encompassed volcano(es) will be entered upon creation of the larger volcanic AOI, which should follow track boundaries.

  2. The PGE takes a start and end time, track number and polygon as inputs (this information is all stored within the trigger rule), and outputs an HDF5 format file containing the displacement time-series for the AOI.

  3. The output of the MintPy PGE is then input for the volcano anomaly detection ML code.

Implementation Notes (Alex Dunn)

References

Video walkthrough of ARIA-Tools and Time Series InSAR (Discussion of how to prepare ARIA data products for use in MintPy begins at around 3:00:23, and all following material relates to MintPy): https://www.youtube.com/playlist?list=PLzmugeDoplFP-Ju8LwWfALyIKLrPWDfbY

...