...

Additionally, you need to download the ARIA-tools documentation repository:

Code Block
ARIA-tools-docs: https://github.com/aria-tools/ARIA-tools-docs.git
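As a minimal sketch, the repository can be cloned directly from the URL above:

Code Block
git clone https://github.com/aria-tools/ARIA-tools-docs.git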

...

MintPy and PyAPS repositories:

Code Block
MintPy: https://github.com/insarlab/MintPy.git
PyAPS: https://github.com/yunjunz/pyaps3.git
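These can likewise be cloned from the URLs listed above:

Code Block
git clone https://github.com/insarlab/MintPy.git
git clone https://github.com/yunjunz/pyaps3.git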

...

When an S1-GUNW product is created within a specified AOI (for example, one defined for monitoring volcanic activity), it triggers the AOI track evaluator. Once the full track for the AOI has been completed, the AOI track evaluator creates a JSON file containing, among other metadata, a list of the GUNW products.
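As an illustration only, the evaluator output might look something like the sketch below; the field names (aoi, track, gunw_urls) and the placeholder URLs are hypothetical, and the actual schema of the AOI-track-evaluator output still needs to be confirmed.

Code Block
{
  "aoi": "example_volcano_aoi",
  "track": 42,
  "gunw_urls": [
    "https://<host>/<path>/S1-GUNW_product_1.nc",
    "https://<host>/<path>/S1-GUNW_product_2.nc"
  ]
}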

Alternatively, S1-GUNW production for a specified AOI track could also be triggered when the user first defines the AOI. The system would pull the relevant SLCs and other data to create the S1-GUNWs for the AOI if GUNWs for that area do not already exist.

Questions to answer:

  • Is the JSON file creation hard-coded into the current AOI-track-evaluator? – yes

...

  • Should we create the text file from the metadata in the JSON file (which contains the list of URLs for the GUNWs), or should we use Elasticsearch? (See the sketch after this list.)

    • Depends on how much information we want to pass into MintPy; are we re-processing past data as well? How are we handling the forward keep-up processing, and how will we update the time-series?

    • We could potentially edit the AOI-track-evaluator directly, but this may be more complex than the other options and could lead to issues with other current capabilities of the system.

  • Need to clarify which products we want to run the time-series calculation on: just the most recent pair, or annual/seasonal pairs plus the nominal nearest two neighbors? Would Elasticsearch work better in this case to compile all the relevant data, rather than using only the data currently in the JSON output of the AOI-track-evaluator?
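If the JSON route is chosen, one possible sketch of producing the input text file is to pull the URL list straight out of the evaluator output, for example with jq (the gunw_urls field name follows the hypothetical sketch above, and the file names are placeholders); an Elasticsearch query would replace this step if that route is chosen instead.

Code Block
jq -r '.gunw_urls[]' aoi_track_evaluator_output.json > products.txt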

Step 3:

The ariaTSsetup.py code can take as input a text file containing the URLs of the data products. The text file contains one URL per line and no other information (no headers, footers, labels, etc.).
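For example, the entire contents of the text file would be a bare list of URLs (placeholders shown here; the real URLs would come from the AOI track evaluator output or Elasticsearch):

Code Block
https://<host>/<path>/S1-GUNW_product_1.nc
https://<host>/<path>/S1-GUNW_product_2.nc
https://<host>/<path>/S1-GUNW_product_3.nc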

The bounding box of the AOI can be supplied with -b ‘coordinates in SNWE’. To extract the metadata layers from the input data, the user needs to download a DEM (--dem Download). There is also functionality to download a mask (--mask Download) to remove any water bodies from the data.
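Putting these together, a full invocation might look like the sketch below. The -f flag for the product list and the example SNWE bounding box are assumptions and should be checked against the ARIA-tools documentation.

Code Block
ariaTSsetup.py -f products.txt -b '37.25 38.1 -122.6 -121.75' --dem Download --mask Download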

...

Once the data has been prepared, it can be run through the main MintPy application, smallbaselineApp.py. The input to this command is the custom configuration file (here named SanFranSenDT42.txt). The custom configuration file is attached to this page (can I do this? Are there any concerns with attaching the config file to this page?), and further information about the format is on the MintPy website: https://mintpy.readthedocs.io/en/latest/examples/input_files/. The figure below shows the flow of smallbaselineApp.py.
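With the configuration file in the working directory, the run itself would be along the lines of:

Code Block
smallbaselineApp.py SanFranSenDT42.txt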

...

  • The scientists have indicated a preference for the raw displacement time-series, without atmospheric correction. Is there a way to accomplish this using the various capabilities of smallbaselineApp? Currently the user can define a starting (--start STEP) and ending (--end STEP, --stop STEP) processing step, as well as a single step to perform (--dostep STEP); see the sketch after this list. The individual processing steps are all defined at https://nbviewer.jupyter.org/github/insarlab/MintPy-tutorial/blob/master/smallbaselineApp_aria.ipynb

    • Does the functionality of smallbaselineApp allow for stopping and starting again while skipping steps, without needing to reformat data structure?

  • After initial calculation of displacement time-series, we want to update the time-series at a defined cadence. This brings up a few decisions:

    • Do we want to have the time-series update at a user-input cadence (individual for each AOI), or have a pre-defined cadence designated for all AOIs and design the system to update at that cadence?

    • How do we update the time-series? Re-process all old data? Or combine time-series HDF-EOS5 files after they’re created by MintPy?

      • If combining, how to accomplish this?

      • If re-processing, how/where to store or call to old data products for re-processing?
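To illustrate the step-control options mentioned above, a few sketched invocations are shown below. The step name invert_network is taken from the MintPy documentation, but which steps to run (or skip) to get an uncorrected displacement time-series is still to be decided, so treat these as examples of the mechanism rather than the final recipe.

Code Block
# run everything up to and including the network inversion, skipping later correction steps
smallbaselineApp.py SanFranSenDT42.txt --end invert_network

# re-run a single step only
smallbaselineApp.py SanFranSenDT42.txt --dostep invert_network

# resume processing from a given step onward
smallbaselineApp.py SanFranSenDT42.txt --start invert_network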

...

  • create a new diagram of (a) the processing steps and (b) the data

  • what are the key steps needed to go from S1-GUNW -> ARIA-tools -> MintPy -> L3 displacement time series

  • for each step, find out what the inputs and outputs are, and what condition(s) trigger that step.

  • what are the key datasets

    • type

    • dataset naming convention

  • identify source code of each step.

Draft diagram of Processing Pipeline - to be refined as more information is learned:

...

This diagram branches off from the larger system diagram on Standard Product S1-GUNW Processing Pipeline.

Implementation Notes (Alex Dunn)

...