All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to PEP 440 and uses Semantic Versioning.
- The static-analysis GitHub Actions workflow now uses `ruff` rather than `flake8` for linting.
- The logger is now configured in `process.main()` so that paths to reference/secondary scenes are logged.
- Fetch Sentinel-2 scenes from AWS S3 (if present); otherwise continue to fetch from Google Cloud Storage.
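
A minimal sketch of that fetch order, using an unsigned S3 existence check with a Google Cloud Storage fallback; the bucket name and URL layout here are illustrative, not the plugin's actual implementation:

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config
from botocore.exceptions import ClientError

S2_BUCKET = 'example-s2-scenes'  # hypothetical AWS bucket name

def scene_url(key: str) -> str:
    """Prefer AWS S3 when the object exists; otherwise fall back to GCS."""
    s3 = boto3.client('s3', config=Config(signature_version=UNSIGNED))
    try:
        s3.head_object(Bucket=S2_BUCKET, Key=key)  # cheap existence check
        return f'https://{S2_BUCKET}.s3.amazonaws.com/{key}'
    except ClientError:
        # Not present in S3; continue to fetch from Google Cloud Storage
        return f'https://storage.googleapis.com/gcp-public-data-sentinel-2/{key}'
```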
- The M11/M12 variables produced by the `hyp3_autorift` and `s1_correction` workflows will be written as `float32` instead of the previous compressed `int16` variables that did not take advantage of the full dynamic range and thus lost a significant amount of precision.
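
To illustrate the difference, here is a minimal netCDF4 sketch contrasting the two encodings; the variable names, grid size, and scale factor are made up for this example:

```python
import netCDF4
import numpy as np

# Example conversion-matrix data; the real values and shape will differ
m11 = np.random.uniform(-0.05, 0.05, (512, 512)).astype(np.float32)

with netCDF4.Dataset('example.nc', 'w') as nc:
    nc.createDimension('y', 512)
    nc.createDimension('x', 512)

    # Old-style packing: int16 with scale_factor/add_offset. Values are
    # quantized to multiples of scale_factor, so precision is lost whenever
    # the data doesn't span the full int16 range.
    packed = nc.createVariable('M11_packed', 'i2', ('y', 'x'))
    packed.scale_factor = np.float32(1e-4)
    packed.add_offset = np.float32(0.0)
    packed[:] = m11  # netCDF4 packs automatically when these attributes are set

    # New approach: store float32 directly, keeping the full dynamic range
    full = nc.createVariable('M11', 'f4', ('y', 'x'))
    full[:] = m11
```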
- Orbits are now downloaded using `s1_orbits` rather than `hyp3lib`.
- Removed support for the `--esa-username` and `--esa-password` command-line options and the `ESA_USERNAME` and `ESA_PASSWORD` environment variables.
- The conversion matrices netCDF file created by the S1 correction workflow is now called `conversion_matricies.nc` and no longer includes the scene name, per feedback from JPL.
- `s1_isce2.generate_correction_data` now returns a Path instead of a str, as expected by `hyp3lib.aws.upload_file_to_s3`.
- `s1_isce2.create_conversion_matricies` now uses the pixel center instead of the upper-left corner for the x,y dimensions.
- `s1_isce2.create_conversion_matricies` now explicitly syncs data to, and closes, the netCDF file to prevent corrupt files from being uploaded.
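
A short netCDF4 sketch of both fixes (pixel-center coordinates and an explicit sync/close); the grid size, posting, and origin below are placeholders:

```python
import netCDF4
import numpy as np

posting = 120.0                        # hypothetical pixel size in meters
x_min, y_max = 200_000.0, 1_000_000.0  # hypothetical upper-left grid corner
nx = ny = 100

nc = netCDF4.Dataset('conversion_matricies.nc', 'w')
nc.createDimension('x', nx)
nc.createDimension('y', ny)
x = nc.createVariable('x', 'f8', ('x',))
y = nc.createVariable('y', 'f8', ('y',))

# Pixel-center coordinates: shift the corner coordinates by half a pixel
x[:] = x_min + posting * (np.arange(nx) + 0.5)
y[:] = y_max - posting * (np.arange(ny) + 0.5)

# ... M11/M12 variables would be written here ...

nc.sync()   # flush buffered writes to disk
nc.close()  # an unclosed dataset can yield a truncated, corrupt upload
```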
- The Sentinel-1 correction workflow will now calculate and write the M11/M12 conversion matrices to a netCDF file.
- `hyp3_autorift.crop` will now preserve the `add_offset` and `scale_factor` encoding attributes for all variables, and in particular, for the M11/M12 conversion matrices.
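
One way to guarantee that behavior with xarray, assuming the product follows the usual y/x layout; a sketch, not the plugin's actual code:

```python
import xarray as xr

ds = xr.open_dataset('product.nc')
cropped = ds.isel(x=slice(16, 4096), y=slice(16, 4096))  # example crop window

# Copy the packing attributes back onto every variable so that writing the
# cropped dataset re-packs the data exactly like the original product
for name, var in ds.variables.items():
    for key in ('add_offset', 'scale_factor'):
        if key in var.encoding:
            cropped[name].encoding[key] = var.encoding[key]

cropped.to_netcdf('product_cropped.nc')
```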
- Support for Python 3.8 has been dropped.
- In preparation for a major update, the Sentinel-1 processing workflow has been isolated to a new `hyp3_autorift.s1_isce2` module.
- `hyp3_autorift` will no longer attempt to crop files with no valid data
- The unused `ASF` naming scheme has been removed from the `hyp3_autorift` CLI and the `hyp3_autorift.process` function
- Everything in `hyp3_autorift.io` has been moved into `hyp3_autorift.utils` to prevent shadowing the builtin `io` module
- `hyp3_autorift.process.process` now returns the product file, browse image, and (new) thumbnail image
- A `--publish-bucket` option has been added to the HyP3 entry point to additionally publish products to an AWS bucket, such as the ITS_LIVE AWS Open Data bucket, `s3://its-live-data`.
- Added `upload_file_to_s3_with_publish_access_keys` to perform S3 uploads using credentials from the `PUBLISH_ACCESS_KEY_ID` and `PUBLISH_SECRET_ACCESS_KEY` environment variables.
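
A minimal sketch of what such an upload helper could look like; the signature is inferred from the description above, not copied from the source:

```python
import os
from pathlib import Path

import boto3

def upload_file_to_s3_with_publish_access_keys(path: Path, bucket: str, prefix: str = '') -> None:
    """Upload a product file using dedicated publish credentials."""
    # Credentials come from the environment rather than the task's IAM role,
    # so products can be published to a bucket owned by another account
    s3 = boto3.client(
        's3',
        aws_access_key_id=os.environ['PUBLISH_ACCESS_KEY_ID'],
        aws_secret_access_key=os.environ['PUBLISH_SECRET_ACCESS_KEY'],
    )
    key = f'{prefix}/{path.name}' if prefix else path.name
    s3.upload_file(str(path), bucket, key)
```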
- Upgraded to `hyp3lib>=3,<4` from `>=2,<3`
- Added `utils.get_esa_credentials` to check for the existence of CDSE credentials before processing begins.
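
A sketch of how such a check might work, assuming credentials can come from `ESA_USERNAME`/`ESA_PASSWORD` or a netrc entry; the host name below is an assumption:

```python
import netrc
import os

ESA_HOST = 'dataspace.copernicus.eu'  # assumed netrc machine name for CDSE

def get_esa_credentials() -> tuple[str, str]:
    """Fail early if no CDSE credentials are available."""
    username, password = os.getenv('ESA_USERNAME'), os.getenv('ESA_PASSWORD')
    if username and password:
        return username, password

    try:
        auth = netrc.netrc().authenticators(ESA_HOST)
        if auth:
            return auth[0], auth[2]
    except FileNotFoundError:
        pass

    raise ValueError(
        'Please provide CDSE credentials via the ESA_USERNAME and ESA_PASSWORD '
        'environment variables or your netrc file.'
    )
```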
- Updated `hyp3lib` to v2.0.2+, which uses the new Copernicus Data Space Ecosystem (CDSE) API to download orbit files.
- Calls to `downloadSentinelOrbitFile` now specify the `esa_credentials` argument.
- Upgraded to ASFHyP3/actions v0.8.3
- `hyp3-autorift` now uses a `src` layout per this recommendation.
- `hyp3-autorift` now only uses `pyproject.toml` for package creation now that `setuptools` recommends not using `setup.py`.
- Patch 235 was applied to make it easier for users to correct for ionosphere streaks without needing to know the scale factor.
- `hyp3_autorift.crop` provides a `crop_netcdf_product` function to crop HyP3 AutoRIFT products to the extent of the valid `v` data
- HyP3 AutoRIFT products generated with the main workflow will be cropped to the extent of the valid `v` data
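
Cropping to the valid-`v` bounding box can be done in a few lines of xarray/numpy; this sketch assumes a dataset with `y`/`x` dimensions and at least one valid pixel:

```python
import numpy as np
import xarray as xr

def crop_netcdf_product(ds: xr.Dataset) -> xr.Dataset:
    """Crop a product to the bounding box of pixels where v is not NaN."""
    valid = ds['v'].notnull().values          # 2D boolean mask over (y, x)
    rows, cols = np.any(valid, axis=1), np.any(valid, axis=0)
    y0, y1 = np.where(rows)[0][[0, -1]]       # first/last rows with valid data
    x0, x1 = np.where(cols)[0][[0, -1]]
    return ds.isel(y=slice(y0, y1 + 1), x=slice(x0, x1 + 1))
```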
- Patch 227 was applied to align the S1 granules velocity description with the optical products
- Patch 223 was applied so that the polarization is correctly selected in the Sentinel-1 workflow
- `hyp3_autorift`'s main entrypoint now accepts `++process` arguments to support multiple workflows:
  - `++process hyp3_autorift` (default) will run the same autoRIFT pair processing workflow
  - `++process s1_correction` will run a Geogrid-only workflow to create the GeoTIFFs necessary for correcting the scale-projection issue in polar-stereographic products generated from Sentinel-1 pairs that were created using HyP3 autoRIFT versions < 0.9.0, which was released November 2, 2022
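
The `++` prefix can be handled by argparse by widening its prefix characters; a minimal sketch of such an entrypoint, with choices mirroring the workflows listed above:

```python
import argparse

# Allowing '+' as a prefix character lets '++process' coexist with normal '--' options
parser = argparse.ArgumentParser(prefix_chars='-+')
parser.add_argument(
    '++process',
    choices=['hyp3_autorift', 's1_correction'],
    default='hyp3_autorift',
    help='Select the workflow to run',
)

args = parser.parse_args(['++process', 's1_correction'])
print(args.process)  # -> 's1_correction'
```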
- Patch nasa-jpl/autorift#78 was applied from upstream to support the Sentinel-1 correction workflow
- The unused `autorift_proc_pair` console script entrypoint was removed
- The zero mask and nodata value for wallis-filtered Landsat-7 images are now set appropriately
- Early (SLC-On) Landsat-7 images are no longer incorrectly filtered a second time with the high-pass filter
- The search range and %-valid pixels are now correctly calculated for Landsat-7 images
- Landsat 7+8 pairs will be filtered appropriately; see #201
- An `--omp-num-threads` parameter has been added to the `main()` entrypoint to limit the number of threads used by ISCE during multiprocessing
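
Since ISCE's OpenMP code reads `OMP_NUM_THREADS` from the environment, the simplest way to enforce such a limit is to export it before processing starts; a sketch:

```python
import argparse
import os

def main() -> None:
    parser = argparse.ArgumentParser()
    parser.add_argument('--omp-num-threads', type=int,
                        help='Maximum number of threads ISCE may use')
    args = parser.parse_args()

    if args.omp_num_threads:
        # Must be set before the OpenMP runtime spins up its worker threads
        os.environ['OMP_NUM_THREADS'] = str(args.omp_num_threads)

    # ... continue with processing ...

if __name__ == '__main__':
    main()
```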
- `hyp3_autorift` will now ensure both scenes are in the same projection for Landsat missions
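
One way to implement that check with GDAL: compare the scenes' spatial references (a naive string comparison, for brevity) and warp the secondary if they differ. The file names are placeholders:

```python
from osgeo import gdal

def ensure_same_projection(reference: str, secondary: str) -> str:
    """Return a secondary scene path guaranteed to share the reference's projection."""
    ref_wkt = gdal.Info(reference, format='json')['coordinateSystem']['wkt']
    sec_wkt = gdal.Info(secondary, format='json')['coordinateSystem']['wkt']
    if ref_wkt != sec_wkt:
        # Reproject the secondary scene onto the reference's spatial reference
        warped = 'secondary_reprojected.tif'
        gdal.Warp(warped, secondary, dstSRS=ref_wkt)
        return warped
    return secondary
```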
- Patch 196 was applied to update the `flag_meanings` netCDF attribute to be in line with CF-Convention 1.8, as described in the vendored software README.md
- Patches 189, 191, and 194 were applied to update some netCDF variable attributes, as described in the vendored software README.md
- Sentinel-2 scenes are now retrieved from Google Cloud, rather than AWS.
- Sentinel-2 granules may no longer be specified using Element84 COG names, only ESA names.
- A patch was applied to update some netCDF variable attributes, as described in the vendored software README.md
- Added support for processing Landsat-4, -5, and -7 Collection 2 scenes
- `hyp3_autorift.process.get_lc2_stac_json_key` will now work for Landsat missions 4-9 and for all sensors
- Upgraded autoRIFT to v1.5.0 and ISCE2 v2.6.1 built with autoRIFT v1.5.0
- Pinned Python to `<3.10` as ISCE2 is currently incompatible with Python 3.10. This restriction will be lifted once the conda-forge distribution of ISCE2 is compatible with Python 3.10
- Updated the USGS STAC catalog API endpoint
- Datetime information can now be correctly extracted from 25-character S2 Earth Search names. Fixes #152
- The Earth Search STAC catalog is incomplete for Sentinel-2 L1C, with many more scenes in the AWS bucket than in the catalog. When an S2 scene cannot be found in the STAC catalog, `hyp3-autorift` will fall back to a bundled S2 metadata catalog derived from an inventory of scenes in the AWS bucket and the Google Earth catalog
- Fixed a GDAL issue, introduced in v0.8.1, that prevented Sentinel-1 processing
- Vendored `testGeogrid_ISCE.py` and `testGeogridOptical.py` scripts no longer raise an `Exception('Optical data NOT supported yet!')` when processing scenes in the `s3://s2-l1c-us-west-2` bucket
- When processing Sentinel-2 scenes, `hyp3_autorift` will now prefer scenes available in `s3://s2-l1c-us-west-2` over the canonical `s3://sentinel-s2-l1c` bucket in the `eu-central-1` region
- Upgraded to hyp3lib v1.7.0 from v1.6.8
- `hyp3_autorift` can now process Landsat 9 Collection 2 scenes
- It is now possible to inject an Earthdata username and password using the `EARTHDATA_USERNAME` and `EARTHDATA_PASSWORD` environment variables.
- The `opencv` conda package has been pinned to `4.5.3` due to a breaking change to its `libopencv_core.so.*` naming scheme in `4.5.5`.
- Patches were applied to clean up some netCDF variable attributes, as described in the vendored software README.md
- Default autoRIFT parameter file was updated to point at the new `its-live-data` AWS S3 bucket instead of `its-live-data.jpl.nasa.gov`
- In the netCDF product, the default autoRIFT parameter file URL will always be reported instead of reporting copies used to support custom HyP3 deployments
- A patch was applied to fix some Sentinel-1 and Sentinel-2 product metadata, as described in the vendored software README.md
- Updated the upgrade to autoRIFT `v1.4.0` to account for the autoRIFT source repo having moved the `v1.4.0` tag (between commits `67e4996..b6700f9`) and changed the conda-forge package accordingly (new sha256 and bumped build number)
- The autoRIFT workflow scripts are now based on the moved tag, and any still-necessary fixes were applied as described in the vendored software README.md
- For Sentinel-2 products, file names now include the full COG Id to ensure unique file names are produced and to be consistent with other products.
- Geogrid now points to `fine_coreg` for Sentinel-1 workflows, fixing common `IndexError` failures
- Applied a fix to the autoRIFT packaging script that updates the reference velocity fields for projected velocity, as described in the vendored software README.md
- Upgraded autoRIFT to v1.4.0 and ISCE2 v2.5.3 built with autoRIFT v1.4.0
- Applied some fixes to the autoRIFT workflow scripts as described in the vendored software README.md
- `hyp3_autorift.io.save_topsinsar_mat` has been renamed to `hyp3_autorift.io.get_topsinsar_config`
  - It no longer writes a config `.mat` file and instead returns the config dictionary
- The `topsinsar_filename.py` console script entrypoint has been removed (use `hyp3_autorift.io.get_topsinsar_config` instead)
- The `hyp3_autorift/netcdf_output.py` module has been removed in favor of the (new) vendored `hyp3_autorift/vend/netcdf_output.py`
- Applied some minor error estimate fixes and netCDF metadata attribute updates as described in the vendored software README.md
- `process.get_lc2_metadata()` now attempts to fetch STAC metadata from the https://landsatlook.usgs.gov/ API and falls back to the STAC json in the S3 bucket
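
The fallback pattern might look roughly like this; the API route and S3 key layout are assumptions for illustration (the `usgs-landsat` bucket is requester-pays):

```python
import json

import boto3
import requests

def get_lc2_metadata(scene_name: str) -> dict:
    """Try the landsatlook STAC API first; fall back to the STAC json in S3."""
    # Hypothetical API route; the real endpoint/collection may differ
    url = f'https://landsatlook.usgs.gov/stac-server/collections/landsat-c2l1/items/{scene_name}'
    response = requests.get(url)
    if response.ok:
        return response.json()

    s3 = boto3.client('s3')
    obj = s3.get_object(
        Bucket='usgs-landsat',
        Key=f'collection02/{scene_name}_stac.json',  # hypothetical key layout
        RequestPayer='requester',
    )
    return json.loads(obj['Body'].read())
```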
- `hyp3_autorift.process` and the associated `autorift_proc_pair` entrypoint no longer accept the `band` argument, which wasn't being used
- `conda-env.yml` has been renamed to `environment.yml`, in line with community practice
- Upgraded to hyp3lib v1.6.8 from v1.6.7
- Upgraded to ISCE2 v2.5.2 built with autoRIFT v1.3.1
- Upgraded autoRIFT to v1.3.1 and ISCE2 v2.5.1 built with autoRIFT v1.3.1
- Upgraded autoRIFT to v1.3.0 and ISCE2 v2.5.1 built with autoRIFT v1.3.0
- `process.get_lc2_metadata()` now fetches STAC metadata from the `usgs-landsat` S3 bucket instead of the https://landsatlook.usgs.gov/ API.
- Minor updates to the netCDF product metadata
- Pairs that end up having a 0 ROI (no valid data) will not fail at the end of processing, but instead will upload a netCDF product that reports 0 ROI
- autoRIFT products will now be generated with a 120 m pixel spacing instead of 240 m
- `hyp3_autorift` will directly access Landsat-8 and Sentinel-2 data in the cloud instead of downloading the scenes locally
- Upgraded autoRIFT to v1.2.0 and ISCE2 v2.5.1 built with autoRIFT v1.2.0
- Sentinel-2 L1C metadata is generated correctly
- Sentinel-2 search by ESA granule id
- Landsat-8 scene names are no longer truncated in the `ITS_LIVE` naming schemes
- Upgraded to hyp3lib v1.6.7 from v1.6.2
- Log message prior to downloading Sentinel-2 and Landsat 8 products
- Ability to specify the shapefile used to determine the correct search parameters by geographic location
  - a `--parameter-file` option has been added to `hyp3_autorift`
  - a `parameter_file` keyword argument has been added to `hyp3_autorift.process.process`
- Ability to specify a preferred product naming scheme
  - a `--naming-scheme` option has been added to `hyp3_autorift`
  - a `naming_scheme` keyword argument has been added to `hyp3_autorift.process.process`
- A partial fix was implemented to correct out of index errors when processing optical scenes (typically seen with Landsat-8 pairs) due to calculating different overlapping subset sizes when co-registering the images. Currently, only the smallest subset size is used, so the bounding box may be 1px too small in x and/or y, but there shouldn't be any pixel offsets. Full fix will need to be implemented upstream in autoRIFT.
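
The workaround amounts to taking the per-dimension minimum of the subset sizes the two images report; a toy illustration with hypothetical shapes:

```python
# Overlapping subset sizes (rows, cols) computed independently for each image;
# rounding during co-registration can make them differ by a pixel
reference_subset = (1023, 2048)
secondary_subset = (1024, 2047)

# Use the smallest size in each dimension so the subsets stay aligned, at the
# cost of a bounding box that may be 1 px too small in x and/or y
common = (
    min(reference_subset[0], secondary_subset[0]),
    min(reference_subset[1], secondary_subset[1]),
)
print(common)  # -> (1023, 2047)
```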
- `hyp3_autorift` will determine the polarization of Sentinel-1 scenes based on the reference scene to allow for VV in addition to HH processing
- The `autorift_proc_pair` entrypoint no longer accepts a `-p`/`--polarization` option
- `hyp3_autorift.process.process` no longer accepts a `polarization=` keyword argument
- Fixed a `ValueError` exception when processing scenes with short (23-character) Element 84 Sentinel-2 IDs
HyP3 v1 is no longer supported as of this release.
- Added support for global processing (previously only Greenland and Antarctica) by pointing at the new autoRIFT parameter files provided by JPL
- Added support for processing Landsat-8 Collection 2 scene pairs
- Example documentation for submitting autoRIFT jobs via the HyP3 SDK or HyP3 API
- Sentinel-2 support now targets level-1c products instead of level-2a products to remove baked in slope correction
- The `hyp3_autorift` entrypoint now kicks off HyP3 v2 processing (options have changed! See `--help`)
- Fixed a 1/2 pixel offset in the netCDF file due to GDAL and netCDF using different pixel reference points
- The `autorift` entrypoint and HyP3 v1 support have been removed
- The `hyp3_autorift_v2` entrypoint has been removed (now just `hyp3_autorift`)
- Browse images and thumbnails will be generated using the ITS_LIVE colormap
- Failures due to stable surface misclassification
- Installed autoRIFT v1.0.8 for processing optical scenes (in addition to v1.0.7 already installed as part of ISCE)
- Added support for processing Sentinel-2 scene pairs
- Removed the "include intermediate files" option when running jobs via HyP3 v1
- `hyp3_autorift` now requires Python >=3.8 and depends on ISCE >=2.4.1, which includes autoRIFT 1.0.7
- Upgraded to hyp3lib v1.6.1 from v1.5.0
- Output product names have changed to follow HyP3's standard pair-processing naming scheme
- Browse images are now uploaded for HyP3 v1 and will appear in email notifications
- NetCDF product files include a `source` and `reference` global attribute in line with CF-Conventions (see PR #20)
- A browse image of the ice velocity is produced for HyP3 v1 and v2, and a thumbnail of the browse image will be produced for HyP3 v2
- Restrict ISCE version to 2.4.0 which includes autoRIFT 1.0.6
Initial release of hyp3-autorift, a HyP3 plugin for feature tracking processing with AutoRIFT-ISCE. This plugin consists of:
- `hyp3_autorift`, a `pip` installable Python package that runs the autoRIFT process inside the HyP3 plugin. This package provides:
  - an `autorift` entrypoint used as the HyP3 plugin (container) entrypoint, which passes arguments down to the selected HyP3 version entrypoints:
    - `hyp3_autorift`
    - `hyp3_autorift_2`
  - `autorift_proc_pair` for running the autoRIFT process for Sentinel-1 image pairs (independent of HyP3)
- a `Dockerfile` to build the HyP3 plugin
- GitHub Actions workflows that will build and distribute hyp3-autorift's Python package and HyP3 plugin