...
Download the latest topsApp docker image archive and unzip it.
Create a .ssh/ folder in your home directory, then ask a team member to securely share their .pem SSH keys and config file with you, and put them in that folder.
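A minimal sketch of that setup, assuming the usual OpenSSH permission requirements (the key and config filenames are placeholders for whatever your teammate shares):

```shell
# Create ~/.ssh with the permissions OpenSSH expects.
mkdir -p "$HOME/.ssh"
chmod 700 "$HOME/.ssh"
# After copying in the shared files, lock them down too, e.g.:
# chmod 600 "$HOME/.ssh/config" "$HOME/.ssh/"*.pem
```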
...
Now use docker run to create the container. The simplest docker run command without any mapping is:
```
docker run -ti --rm <docker_image_id> bash
```
Replace <docker_image_id> with the image ID you copied in the previous step. This will take you inside the docker container. The ariamh directory should be in the home directory. Feel free to go inside it and explore.
...
I used input and output files from this job. I also created a _job.json file with these contents in the ~/data/work/ directory:
```
{ "retry_count": 1 }
```
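A sketch of creating that file from the host side, assuming your host work directory mirrors the container's ~/data/work (adjust the path if you bind-mount from /Users/YOURPATH/ARIA/data/work instead):

```shell
# Create the work directory and write the minimal _job.json shown above.
mkdir -p "$HOME/data/work"
printf '{ "retry_count": 1 }\n' > "$HOME/data/work/_job.json"
cat "$HOME/data/work/_job.json"
```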
Download the ariamh repository from GitHub; I put it in /Users/YOURPATH/ARIA/repos/ariamh. It doesn’t matter where you put the ariamh directory on your machine, as long as you update that part of the full docker run command below to match. The full docker run command much further down this page binds that directory to ~/ariamh/ inside the docker container when you run it, replacing the container’s ~/ariamh/ directory with the ariamh directory you downloaded. As a result, you’ll only have to make the following changes once rather than having to repeat them every time you run the docker container.
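A sketch of that step; the remote URL isn’t given in this guide, so substitute whichever ariamh remote your team actually uses (the repos path here mirrors the layout above):

```shell
# Make a repos directory and clone ariamh into it.
mkdir -p "$HOME/ARIA/repos"
cd "$HOME/ARIA/repos"
# Uncomment and fill in your team's remote:
# git clone <your-ariamh-remote-url> ariamh
```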
...
```
GRQ_URL=https://c-datasets.aria.hysds.io/es
ARIA_DAV_URL=https://aria-dav.jpl.nasa.gov
ARIA_DAV_U=USERNAME
ARIA_DAV_P=PASSWORD
ARIA_DEM_URL=http://aria-ops-dataset-bucket.s3-website-us-west-2.amazonaws.com/datasets/dem/SRTM1_v3/
ARIA_NED1_DEM_URL=http://aria-ops-dataset-bucket.s3-website-us-west-2.amazonaws.com/datasets/dem/ned1/
ARIA_NED13_DEM_URL=http://aria-ops-dataset-bucket.s3-website-us-west-2.amazonaws.com/datasets/dem/ned13/
ARIA_WBD_URL=http://aria-ops-dataset-bucket.s3-website-us-west-2.amazonaws.com/datasets/dem/usgs_mirror/SRTMSWBD.003/2000.02.11/
ARIA_DEM_U=USERNAME
ARIA_DEM_P=PASSWORD
ARIA_WBD_U=USERNAME
ARIA_WBD_P=PASSWORD
ARIA_PRD_URL=http://aria-ops-dataset-bucket.s3-website-us-west-2.amazonaws.com/datasets/
ARIA_DB_VERSION=v1.1
GRQ_INDEX_PREFIX=grq
MOZART_URL=amqp://USERNAME:PASSWORD@IP_ADDRESS:PORT//
```
If you’re offsite, you’ll need to use Pulse Secure to set up a JPL full tunnel. Even so, the unmodified docker container will time out while requesting some files, so you’ll have to make some changes, starting in ~/ariamh/interferogram/sentinel/create_standard_product_s1.py. In the function get_dataset_by_hash(), change this line:
```
r = requests.post(search_url, data=json.dumps(query))
```
to this:
```
r = requests.post(search_url, data=json.dumps(query), verify=False)
```
In the function check_ifg_status_by_hash(), comment out the second line shown here:
```
logger.info("Duplicate dataset found: %s" % found_id)
# sys.exit(0)
```
...
)
Then open the file ~/ariamh/utils/queryBuilder.py and scroll down to look at the function postQuery(). Look for this line:
```
index, es_url = getIndexAndUrl(sv,conf)
```
Immediately after that line, insert this line (making sure the indentation matches the above line):
```
es_url = "https://c-datasets.aria.hysds.io/es"
```
Then change this line:
```
r = requests.post('%s/%s/_search?search_type=scan&scroll=10m&size=100' %
                  (es_url, index), data=json.dumps(query))
```
to this:
```
r = requests.post('%s/%s/_search?search_type=scan&scroll=10m&size=100' %
                  (es_url, index), data=json.dumps(query), verify=False)
```
Then open ~/ariamh/interferogram/sentinel/fetchCalES.py and look for two lines that look like this:
```
r = requests.post(search_url, data=json.dumps(query))
```
Change them both to this:
```
r = requests.post(search_url, data=json.dumps(query), verify=False)
```
Finally, here’s my full docker run command. You’ll have to replace “YOURPATH” with your path:
```
docker run -ti --rm \
  -v /Users/YOURPATH/.ssh:/home/ops/.ssh \
  -v /Users/YOURPATH/ARIA/data/work:/home/ops/data/work \
  -v /Users/YOURPATH/ARIA/data/work/jobs:/home/ops/data/work/jobs \
  -v /Users/YOURPATH/ARIA/data/work/tasks:/home/ops/data/work/tasks \
  -v /Users/YOURPATH/ARIA/data/work/workers:/home/ops/data/work/workers \
  -v /Users/YOURPATH/ARIA/data/work/cache:/home/ops/data/work/cache:ro \
  -v /Users/YOURPATH/ARIA/.netrc:/home/ops/.netrc:ro \
  -v /Users/YOURPATH/ARIA/.aws:/home/ops/.aws:ro \
  -v /Users/YOURPATH/ARIA/topsapp/settings.conf:/home/ops/ariamh/conf/settings.conf:ro \
  -v /Users/YOURPATH/ARIA/repos/ariamh:/home/ops/ariamh \
  -w /home/ops/data/work <docker_image_id>
```
Once all this works, make sure your ~/data/work directory contains the SLC zip files, the orbit files, and the _job.json and _context.json files. For instance, my work directory has these contents:
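A quick sanity check before launching, assuming the host-side work directory mirrors the container's ~/data/work; it only verifies the two JSON job files here, since the exact SLC and orbit filenames vary per job:

```shell
# Report which of the required job files are present in the work directory.
missing=0
for f in "$HOME/data/work/_job.json" "$HOME/data/work/_context.json"; do
  if [ -f "$f" ]; then
    echo "found:   $f"
  else
    echo "MISSING: $f"
    missing=$((missing+1))
  fi
done
echo "$missing required file(s) missing"
```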
...
Then run: ~/ariamh/interferogram/sentinel/create_standard_product_s1.sh
Then wait. For a while. Grab a coffee or three. If you’ve done everything correctly, this should run for a few hours (on a new MacBook Pro in late 2019). A correctly executed topsApp run fills the ~/data/work/ directory with files, including a directory with a name that looks like S1-GUNW-A-R-064-tops-20191008_20190908-015107-37379N_35506N-PP-4488-v2_0_2. That directory contains the output interferogram in netCDF format and a PNG preview called S1-GUNW-A-R-064-tops-20191008_20190908-015107-37379N_35506N-PP-4488-v2_0_2.interferogram.browse_full.png.
OPTIONAL BUT USEFUL:
While that is running, get details by tailing the log in a new tab in the same docker terminal:
...