The ITS_LIVE monitoring stack provides the AWS architecture to support low-latency production of netCDF glacier velocity products generated from optical (Landsat 8/9, Sentinel-2) and SAR (Sentinel-1) image pairs.
ITS_LIVE Monitoring uses a pub-sub model for the optical missions. These Open Data on AWS datasets include SNS Topics to which messages are published for each new scene added to the dataset:
- Landsat: https://registry.opendata.aws/usgs-landsat/
- Sentinel-2: https://registry.opendata.aws/sentinel-2/
ITS_LIVE Monitoring subscribes to these messages and collects them in an SQS Queue. An AWS Lambda function consumes messages from the SQS Queue and:
- determines if the scene in the message should be processed
- searches the dataset's catalog for secondary scenes to form processing pairs
- ensures these pairs haven't already been processed
- submits the scene pairs to HyP3 for processing
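The list above is roughly the flow of the monitoring Lambda handler. A minimal sketch of that flow is below; the helper functions and the message field names are illustrative placeholders, not the project's actual implementation:

```python
import json


def should_process(scene: str) -> bool:
    """Placeholder for the real filtering logic (e.g., collection tier and over-ice checks)."""
    return True


def find_secondary_scenes(scene: str) -> list[str]:
    """Placeholder for the catalog search that pairs the new scene with earlier acquisitions."""
    return []


def already_processed(reference: str, secondary: str) -> bool:
    """Placeholder for deduplication against existing HyP3 jobs and published products."""
    return False


def submit_pair(reference: str, secondary: str) -> None:
    """Placeholder for submitting the pair to HyP3 for processing."""
    print(f'would submit {reference} x {secondary}')


def lambda_handler(event: dict, context: object) -> None:
    # Each SQS record wraps an SNS notification; the notification's Message describes the new scene.
    for record in event['Records']:
        notification = json.loads(record['body'])
        scene = json.loads(notification['Message'])['scene']  # 'scene' is a stand-in field name

        if not should_process(scene):
            continue

        for secondary in find_secondary_scenes(scene):
            if not already_processed(scene, secondary):
                submit_pair(scene, secondary)
```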
To create a development environment, run:
```shell
conda env update -f environment.yml
conda activate its-live-monitoring
```
A `Makefile` has been provided to run some common development steps:

- `make static` runs the static analysis suite, including `ruff` for linting and formatting of Python code, and `cfn-lint` for linting CloudFormation.
- `make test` runs the PyTest test suite.

Review the `Makefile` for a complete list of commands.
Many parts of this stack are controlled by environment variables. Refer to the `deploy-*.yml` GitHub Actions workflows to see which are set upon deployment. Below is a non-exhaustive list of some environment variables that you may want to set:

- `HYP3_API`: The HyP3 deployment to which jobs will be submitted, e.g. https://hyp3-its-live.asf.alaska.edu.
- `EARTHDATA_USERNAME`: Earthdata Login username for the account that will submit jobs to HyP3. In the production stack, this should be the ITS_LIVE operational user; in the test stack, this should be the team testing user.
- `EARTHDATA_PASSWORD`: Earthdata Login password for the account that will submit jobs to HyP3.
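As a rough illustration of how these variables might be consumed (a sketch, not the project's actual code), a HyP3 client can be constructed from them with the `hyp3_sdk` package:

```python
import os

from hyp3_sdk import HyP3

# Sketch only: read the deployment URL and Earthdata Login credentials from the environment.
hyp3 = HyP3(
    api_url=os.environ['HYP3_API'],
    username=os.environ['EARTHDATA_USERNAME'],
    password=os.environ['EARTHDATA_PASSWORD'],
)
print(hyp3.my_info())  # quick sanity check that the credentials are accepted
```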
The Lambda functions can be run locally from the command line, or by calling the appropriate function in the Python console.
> [!NOTE]
> To call the functions in the Python console, you'll need to add all the `src` directories to your `PYTHONPATH`. With PyCharm, you can accomplish this by marking all such directories as "Sources Root" and enabling the "Add source roots to PYTHONPATH" Python Console setting.
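Alternatively, in an ad-hoc console session started from the repository root, you can extend `sys.path` yourself; the directory below matches the commands that follow, and any other `src` directories can be added the same way:

```python
import sys

# Make the its_live_monitoring sources importable in this console session.
sys.path.append('its_live_monitoring/src')
```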
To show the help text for the `its_live_monitoring` Lambda function, which is used to submit new optical (Landsat 8/9, Sentinel-2) scenes for processing:

```shell
python its_live_monitoring/src/main.py -h
```
For example, processing a valid scene:
```shell
python its_live_monitoring/src/main.py LC08_L1TP_138041_20240128_20240207_02_T1
```
The `its_live_monitoring` Lambda can be tested by manually publishing messages to the test SNS topics, which were manually provisioned in the AWS Console:
```shell
aws sns publish \
    --topic-arn ${TOPIC_ARN} \
    --message file://${MESSAGE_FILE}
```
where `TOPIC_ARN` is the ARN of the test topic and `MESSAGE_FILE` is the path to a file containing the contents of the message you want published. Example message contents are provided in the `tests/integration` directory, two of which are described here:

- `landsat-l8-valid.json` - A message containing a Landsat 9 scene over ice that should be processed.
- `landsat-l9-wrong-tier.json` - A message containing a Landsat 9 scene not over ice that should be filtered out and not processed.
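If you'd rather publish from Python than the AWS CLI, a minimal boto3 equivalent (with a placeholder topic ARN) looks like:

```python
import boto3

TOPIC_ARN = 'arn:aws:sns:us-west-2:123456789012:example-test-topic'  # placeholder; use the real test topic ARN
MESSAGE_FILE = 'tests/integration/landsat-l8-valid.json'

sns = boto3.client('sns')
with open(MESSAGE_FILE) as f:
    sns.publish(TopicArn=TOPIC_ARN, Message=f.read())
```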
To submit all the integration test payloads to the default test SNS topics, run:
```shell
make integration
```
> [!IMPORTANT]
> The integration tests will submit jobs to `hyp3-its-live-test`, which will publish products to `s3://its-live-data-test`. Notably, `s3://its-live-data-test` has a lifecycle rule that will delete all products after 14 days. So to test deduplication against both HyP3 and S3, you'll need to:
>
> 1. disable `hyp3-its-live-test`'s compute environment or start execution manager
> 2. submit the integration tests and see the jobs submitted
> 3. submit the integration tests again to see all jobs deduplicate against the hung jobs from the previous step
> 4. re-enable the compute environment or start execution manager and wait for all jobs to finish
> 5. once all jobs are finished, submit the integration tests again to see the jobs deduplicate against the products in `s3://its-live-data-test`
>
> That means fully testing its-live-monitoring requires at least 3 rounds of integration testing!
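When checking the S3 deduplication step, it can help to confirm that products actually landed in the test bucket between rounds. A quick boto3 listing (a sketch; narrow it with a `Prefix` as needed) works:

```python
import boto3

# List a handful of objects in the test products bucket to confirm products exist.
s3 = boto3.client('s3')
response = s3.list_objects_v2(Bucket='its-live-data-test', MaxKeys=10)
for obj in response.get('Contents', []):
    print(obj['Key'], obj['LastModified'])
```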
To submit just the Landsat integration test payloads to the default Landsat test SNS topic, run:
```shell
make landsat-integration
```
Likewise, to submit just the Sentinel-2 integration test payloads to the default Sentinel-2 test SNS topic, run:

```shell
make sentinel2-integration
```
Or, you can submit to an alternative SNS topic like:

```shell
LANDSAT_TOPIC_ARN=foobar make landsat-integration
SENTINEL2_TOPIC_ARN=foobar make sentinel2-integration
```