Backend
...
BOS Sarcat scraper (stand-alone CLI tool)
https://github.com/aria-jpl/bos_sarcat_scraper
https://github.com/aria-jpl/bos_sarcat_scraper/blob/master/bos_sarcat_scraper/bosart_scrape.py
This script queries BOS and outputs a JSON file of the result set
Inputs
start/end time
OR
Since last ingest time on BOS
BOS expects WKT format for the spatial query (see the sketch below)
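For illustration, a minimal sketch of building a WKT string for the spatial query, assuming shapely is available; the bounding-box coordinates are illustrative only and not tied to any real query:

# Minimal sketch: building a WKT polygon for the BOS spatial query.
# Coordinates are illustrative; shapely is an assumed dependency.
from shapely.geometry import box

# Bounding box given as (lon_min, lat_min, lon_max, lat_max)
aoi = box(-118.5, 33.5, -117.5, 34.5)
print(aoi.wkt)
# prints something like:
# POLYGON ((-117.5 33.5, -117.5 34.5, -118.5 34.5, -118.5 33.5, -117.5 33.5))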
Front end GUI
Inside the facetview/ directory of the bos_sarcat_scraper repo: https://github.com/aria-jpl/bos_sarcat_scraper/blob/master/facetview/facetview-saravail.html
The front-end faceted search is hosted on aria-puccini
To edit the template:
...
Instructions: https://github.com/aria-jpl/bos_sarcat_scraper/tree/master/dataset
PGE: bos_ingest
https://github.com/aria-jpl/bos_sarcat_scraper/blob/master/docker/job-spec.json.bos_ingest
Runs hourly over a 2-hour sliding window
This is the HySDS PGE wrapper for https://github.com/aria-jpl/bos_sarcat_scraper/blob/master/bos_sarcat_scraper/bosart_scrape.py
Avoids ingesting PLANNED and PREDICTED acquisitions whose times are already in the past
Calls ingest from inside the PGE
If the PGE forgets to delete the folder, verdi will try to ingest it again because the dataset directory is still present.
Known bug: when the directory cannot be removed, verdi re-attempts the ingest.
Currently there are no retries: if a job fails for any reason (e.g. an ES timeout), it simply fails, and we rely on the next scraper run to pick the data up. But that only gives a 3-hour sliding window, so about 3 tries.
A daily back-filler that runs bos_ingest over a 5-day window is still needed.
If manual publishing succeeds, or the dataset already exists, the PGE deletes the directory (named by the acq_id).
If publishing fails, the directory is left in place so Verdi post-processing can take another crack at it.
job-spec
{
  "required_queues": ["factotum-job_worker-small"],
  "container": "container-aria-jpl_bos_sarcat_scraper:master",
  "command": "/home/ops/verdi/ops/bos_sarcat_scraper/create_acquisitions.sh",
  "imported_worker_files": {
    "/home/ops/verdi/etc/datasets.json": "/home/ops/verdi/etc/datasets.json"
  },
  "disk_usage": "1GB",
  "params": [
    {
      "name": "bos_ingest_time",
      "destination": "context"
    },
    {
      "name": "from_time",
      "destination": "context"
    },
    {
      "name": "end_time",
      "destination": "context"
    }
  ]
}
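All three params have "destination": "context", so HySDS writes them into the _context.json file in the job's working directory. A minimal sketch of how the PGE wrapper can read them (variable handling is illustrative):

# Minimal sketch: reading job-spec params routed to "context".
# HySDS writes such params into _context.json in the job work dir.
import json

with open("_context.json") as f:
    ctx = json.load(f)

bos_ingest_time = ctx.get("bos_ingest_time")  # optional: since-last-ingest mode
from_time = ctx.get("from_time")              # optional: start of acquisition window
end_time = ctx.get("end_time")                # optional: end of acquisition window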
hysds_io
{
  "submission_type": "individual",
  "component": "tosca",
  "label": "Ingest acquisitions from Bos SARCAT",
  "allowed_accounts": ["ops"],
  "params": [
    {
      "name": "bos_ingest_time",
      "from": "submitter",
      "type": "text",
      "optional": true,
      "placeholder": "start of bos_ingest_timestamp in format yyyy-mm-ddThh:mm:ss.sssZ"
    },
    {
      "name": "from_time",
      "from": "submitter",
      "type": "text",
      "optional": true,
      "placeholder": "start of acquisition time in format yyyy-mm-ddThh:mm:ss.sssZ"
    },
    {
      "name": "end_time",
      "from": "submitter",
      "type": "text",
      "optional": true,
      "placeholder": "end of acquisition time in format yyyy-mm-ddThh:mm:ss.sssZ"
    }
  ]
}
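For reference, a hedged sketch of submitting this job through the Mozart REST API. The endpoint path, payload fields, job type string, and host below are all assumptions and may differ between HySDS releases; check the deployed Mozart swagger UI for the real API.

# Hedged sketch: submitting a bos_ingest job via the Mozart REST API.
# MOZART_URL, the endpoint path, and the job type string are assumptions.
import json
import requests

MOZART_URL = "https://<mozart-host>/mozart"  # hypothetical host

payload = {
    "type": "job-bos_ingest:master",  # assumed job type name
    "queue": "factotum-job_worker-small",
    "params": json.dumps({
        "from_time": "2019-01-01T00:00:00.000Z",
        "end_time": "2019-01-02T00:00:00.000Z",
    }),
}
resp = requests.post("%s/api/v0.1/job/submit" % MOZART_URL, data=payload)
resp.raise_for_status()
print(resp.json())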
PGE: scrub_outdated_bos_acqs
Runs daily
Scrubs outdated PLANNED and PREDICTED acquisitions older than 2 days (sketch of the logic below)
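A hedged sketch of the scrub logic, assuming the GRQ Elasticsearch host and that acquisition docs carry a metadata.status field and an endtime field; these names are assumptions, and the real implementation lives in the bos_sarcat_scraper repo:

# Hedged sketch of the scrub: delete PLANNED/PREDICTED acquisitions older
# than 2 days. The index/alias name and field paths are assumptions.
from datetime import datetime, timedelta
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://<grq-host>:9200"])  # hypothetical GRQ host
cutoff = (datetime.utcnow() - timedelta(days=2)).strftime("%Y-%m-%dT%H:%M:%S.%fZ")

query = {
    "query": {
        "bool": {
            "must": [
                {"terms": {"metadata.status": ["PLANNED", "PREDICTED"]}},
                {"range": {"endtime": {"lt": cutoff}}},
            ]
        }
    }
}
resp = es.delete_by_query(index="acquisition", body=query)
print("scrubbed %d outdated acquisitions" % resp["deleted"])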
Cron scripts
Crontab settings: https://github.com/aria-jpl/bos_sarcat_scraper/blob/master/crons/crontab-setting.txt
...
https://github.com/aria-jpl/bos_sarcat_scraper/blob/master/crons/ingest-cron.py
0 0 * * * /home/ops/verdi/bin/python /home/ops/ingest-cron.py --days 2 --tag master > /home/ops/verdi/log/ingest-cron.log 2>&1
0 * * * * /home/ops/verdi/bin/python /home/ops/ingest-cron.py --hours 2 --tag master > /home/ops/verdi/log/ingest-cron.log 2>&1
https://github.com/aria-jpl/bos_sarcat_scraper/blob/master/crons/scrub-cron.py
0 0 * * * /home/ops/verdi/bin/python /home/ops/verdi/ops/bos_sarcat_scraper/crons/scrub-cron.py > /home/ops/verdi/scrub-cron.log 2>&1
These cron jobs currently run on the b-cluster factotum
Flask App Services
ICS
KML
CSV (see the sketch below)
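A minimal Flask sketch of what one of these export services could look like, here a CSV endpoint; the route, field names, and data source are hypothetical, not the real service code:

# Hypothetical Flask sketch of a CSV export service.
import csv
import io

from flask import Flask, Response

app = Flask(__name__)

def fetch_acquisitions():
    # Placeholder for the real ES query against the acquisition alias.
    return [{"id": "acq-001", "starttime": "2019-01-01T00:00:00Z", "status": "PLANNED"}]

@app.route("/export/csv")
def export_csv():
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "starttime", "status"])
    writer.writeheader()
    writer.writerows(fetch_acquisitions())
    return Response(buf.getvalue(), mimetype="text/csv")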
Other
Location
Log files
Debugging process
Deployment
Watchdogs to check on the hourly scraper are already in place; the current check targets bos_ingest_:master
ES on b-cluster
Alias for sar-availability: acquisition
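A minimal sketch, assuming elasticsearch-py and the GRQ Elasticsearch host, of creating and checking that alias:

# Minimal sketch: point the "acquisition" alias at the sar-availability
# index on the (hypothetical) GRQ ES host, then verify it.
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://<grq-host>:9200"])  # hypothetical host
es.indices.put_alias(index="sar-availability", name="acquisition")

# Show which concrete indices the alias resolves to.
print(es.indices.get_alias(name="acquisition"))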
...