+Previous OGC Testbeds 13-16 initiated the design of an application package for Earth Observation Applications in distributed Cloud Platforms.
+The application package provides information about the software item, its metadata, and its dependencies.
+It is deployed and executed within an Exploitation Platform in a service compliant with the OGC API Processes specification.
+OGC 20-089 defines the Best Practice to package and deploy Earth Observation Applications in an Exploitation Platform.
+OGC 20-089 defines guidance for three viewpoints:
+* For a Developer, to adapt an application
+* For an Integrator, to package an application
+* For a Platform, to deploy and execute the application
+12 submitting organisations:
+Decouple application developers from exploitation platform operators and from application consumers:
+Enable exploitation platforms to virtually support any type of packaged EO application
+The Application Package:
+Application developers:
+The Common Workflow Language (CWL) is an open standard for describing analysis workflows and tools in a way that makes them portable and scalable across a variety of software and hardware environments, from workstations to cluster, cloud, and high performance computing environments.
+CWL is designed to meet the needs of data-intensive science, such as Bioinformatics, Medical Imaging, Astronomy, Physics, and Chemistry.
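To make this concrete, a CWL CommandLineTool description for a simple command-line tool might look as follows. This is a minimal sketch: the identifier, container image, and option names are illustrative assumptions, not an actual Application Package entry.

```yaml
cwlVersion: v1.0
class: CommandLineTool
id: crop                                        # hypothetical identifier
requirements:
  DockerRequirement:
    dockerPull: docker.io/example/crop:latest   # hypothetical container image
baseCommand: ["python", "app.py"]
inputs:
  item:
    type: string
    inputBinding:
      prefix: --input-item
  aoi:
    type: string
    inputBinding:
      prefix: --aoi
  epsg:
    type: string
    inputBinding:
      prefix: --epsg
  band:
    type: string
    inputBinding:
      prefix: --band
outputs:
  cropped:
    type: File
    outputBinding:
      glob: "*.tif"
```

A CWL runner materializes the inputs as command-line options, runs the tool inside the declared container, and collects any `*.tif` file produced as the step output.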
+Purpose: to crop a particular band, defined as a common band name (such as the "green" or "nir" band), from a satellite image acquired by either Sentinel-2 or Landsat-9.
+This step is highlighted below:
+The `crop.py` script is a command-line tool that takes as input a STAC Item URL, an area of interest (AOI) expressed as a bounding box, an EPSG code, and a common band name, and then crops the specified band from the asset associated with the common band name to the specified AOI.
+It uses various Python libraries such as `pystac`, `rasterio`, `pyproj`, `shapely`, and `loguru`.
+Here is an overview of the script's functionality:
+It defines a function `aoi2box` to convert an AOI expressed as a bounding box string into a list of floats.
+It defines a function `get_asset` to retrieve the asset of a STAC Item that is defined with a common band name. It iterates through the assets and checks whether a band has the specified common name.
+It defines a command-line interface using `click`, with options for providing the input STAC Item URL, AOI, EPSG code, and common band name.
+The `crop` function is the main entry point. It reads the STAC Item specified by the input URL and retrieves the asset associated with the common band name. It then crops the asset to the specified AOI using the `rasterio` library.
+It transforms the bounding box coordinates to match the EPSG code provided.
+It performs the cropping using the `rasterio.mask.mask` function.
+It writes the cropped image to a GeoTIFF file with a filename like "crop_bandname.tif".
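For instance, the `aoi2box` helper described above could be sketched as follows (a minimal illustration; the actual implementation in `app.py` may differ):

```python
from typing import List


def aoi2box(aoi: str) -> List[float]:
    """Convert an AOI bounding-box string "min_lon,min_lat,max_lon,max_lat"
    into a list of floats."""
    return [float(coord) for coord in aoi.split(",")]


print(aoi2box("-121.399,39.834,-120.74,40.472"))
# [-121.399, 39.834, -120.74, 40.472]
```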
+The script is executable as a command-line tool; its usage is:
+Usage: app.py [OPTIONS]
+
+ Crops a STAC Item asset defined with its common band name
+
+Options:
+ --input-item TEXT STAC Item URL or staged STAC catalog [required]
+ --aoi TEXT Area of interest expressed as a bounding box [required]
+ --epsg TEXT EPSG code [required]
+ --band TEXT Common band name [required]
+ --help Show this message and exit.
+
To use this script, you would typically run it from the command line, providing the necessary input options such as the STAC Item URL, AOI, EPSG code, and common band name:
+python app.py \
+ --input-item "https://earth-search.aws.element84.com/v0/collections/sentinel-s2-l2a-cogs/items/S2B_10TFK_20210713_0_L2A" \
+ --aoi "-121.399,39.834,-120.74,40.472" \
+ --epsg "EPSG:4326" \
+ --band "green"
+
It will then crop the specified band from the STAC asset and save it as a GeoTIFF file.
+The full Python code is provided in `water-bodies/command-line-tools/crop/app.py`.
Purpose: to calculate the normalized difference between the "green" and "nir" bands.
+This step is highlighted below:
+The `norm_diff` script is a command-line tool for performing a normalized difference between two raster images.
+It uses the `click`, `rasterio`, and `numpy` libraries to perform the calculation and save the result as a GeoTIFF file.
+Here's an overview of what the script does:
+It defines a command-line interface using the `click` library, with two arguments for providing the file paths of the two raster images for which you want to calculate the normalized difference.
+The `normalized_difference` function is the main entry point. It opens the two input raster files specified as arguments.
+It reads the data from the first raster (specified by `rasters[0]`) using `rasterio`, and it also copies the metadata (e.g., projection, geotransform) of this raster to be used for the output.
+It then opens the second raster (specified by `rasters[1]`) and reads its data.
+It updates the data type in the metadata to `float32` because the normalized difference result is a floating-point image.
+It creates an output raster named "norm_diff.tif" using `rasterio`. This output raster has the same metadata as the first input raster, but with data type `float32`.
+It calculates the normalized difference between the two input arrays, (array1 - array2) / (array1 + array2), and writes it to the output raster using `dst_dataset.write()`.
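The core of the computation can be sketched with `numpy` alone (an illustrative fragment, not the full script):

```python
import numpy as np


def normalized_difference(array1: np.ndarray, array2: np.ndarray) -> np.ndarray:
    """Compute (array1 - array2) / (array1 + array2) as float32."""
    array1 = array1.astype("float32")
    array2 = array2.astype("float32")
    # guard against division by zero where both bands are 0
    with np.errstate(divide="ignore", invalid="ignore"):
        return (array1 - array2) / (array1 + array2)


green = np.array([[0.6, 0.2]], dtype="float32")
nir = np.array([[0.2, 0.6]], dtype="float32")
print(normalized_difference(green, nir))  # values 0.5 and -0.5
```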
+The script is executable as a command-line tool; its usage is:
+Usage: app.py [OPTIONS] RASTERS...
+
+ Performs a normalized difference
+
+Options:
+ --help Show this message and exit.
+
The full Python code is provided in `water-bodies/command-line-tools/norm_diff/app.py`.
This Python script is a command-line tool for performing a normalized difference operation on two input raster images. The normalized difference is commonly used in remote sensing and image processing to highlight certain features or properties in the images.
+Here's an overview of the script's functionality:
+It imports necessary libraries and modules, including `click` for defining a command-line interface, `numpy` for numerical operations, `gdal` for geospatial operations, and `loguru` for logging.
+The `normalized_difference` function is defined to perform the normalized difference operation. It takes two raster image file paths as input.
+Inside the `normalized_difference` function, it logs the names of the input raster files.
+It sets up `numpy` to allow division by zero and defines GDAL datasets for both input raster images.
+The script creates a new output GeoTIFF file named "norm_diff.tif" to store the result. It preserves the geospatial information (geotransform and projection) from the first input raster.
+It reads the pixel values from both input datasets and converts them to floating-point arrays.
+It calculates the normalized difference as (array1 - array2) / (array1 + array2) and writes the result to the output dataset.
+The output dataset and both input datasets are then closed.
+Finally, the script logs "Done!" to indicate that the operation is complete.
+To use this script, one provides two input raster images as arguments. The script calculates the normalized difference between the two images, saves the result as "norm_diff.tif", and preserves the geospatial information from the first input image.
Purpose: to apply the Otsu threshold to the normalized difference.
+This step is highlighted below:
+The `otsu` Python script is a command-line tool for applying the Otsu threshold to a single input raster image.
+It uses the `click`, `rasterio`, `numpy`, `skimage.filters`, and `loguru` libraries.
+Here's an overview of what the script does:
+It defines a command-line interface using the `click` library, with a single argument for providing the file path of the input raster image on which you want to apply the Otsu threshold.
+The `otsu` function is the main entry point. It opens the input raster file specified as the argument.
+It reads the data from the input raster using `rasterio` and also copies the metadata (e.g., projection, geotransform) of this raster to be used for the output.
+It applies the Otsu threshold to the input array by calling the `threshold` function, which uses `threshold_otsu` from `skimage.filters` to calculate the threshold value. The thresholding marks pixels as True or False depending on whether they exceed the calculated threshold.
+It creates an output raster named "otsu.tif" using `rasterio`. This output raster has the same metadata as the input raster.
+It writes the thresholded binary image to the output raster using `dst_dataset.write()`.
+The result, a binary image where pixel values are either True or False based on the thresholding, will be saved as "otsu.tif" in the same directory where the script is executed.
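The thresholding step itself reduces to a few lines with `skimage.filters` (a sketch of the logic described above, not the exact script):

```python
import numpy as np
from skimage.filters import threshold_otsu


def threshold(data: np.ndarray) -> np.ndarray:
    """Return a boolean mask: True where a pixel exceeds the Otsu threshold."""
    return data > threshold_otsu(data)


# a toy normalized-difference array with two well-separated classes
data = np.array([[-0.8, -0.7, 0.6], [-0.9, 0.7, 0.8]], dtype="float32")
mask = threshold(data)  # True for the positive (water-like) pixels
```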
+The script is executable as a command-line tool; its usage is:
+Usage: app.py [OPTIONS] RASTER
+
+ Applies the Otsu threshold
+
+Options:
+ --help Show this message and exit.
+
The full Python code is provided in `water-bodies/command-line-tools/otsu/app.py`.
This Python script is a command-line tool for applying the Otsu threshold to a raster image. The Otsu thresholding method is a technique used for image segmentation to separate objects of interest from the background. Here's an overview of the script's functionality:
+It imports necessary libraries and modules, including click for defining a command-line interface, numpy for numerical operations, gdal for geospatial operations, skimage.filters for Otsu thresholding, and loguru for logging.
+The threshold function takes a numpy array as input and returns the binary result of applying the Otsu threshold to the array. The Otsu threshold is calculated using the threshold_otsu function from the skimage.filters module.
+The otsu function is defined to apply the Otsu threshold to the input raster image. It takes the path to the input raster as an argument.
+Inside the otsu function, it logs the name of the input raster file.
+It opens the input raster as a GDAL dataset and creates a new output GeoTIFF file named "otsu.tif."
+The script sets the geospatial information (geotransform and projection) for the output dataset to match the input raster.
+It reads the pixel values from the input dataset and converts them to a floating-point array.
+The Otsu thresholding is applied to the array, and the binary result is written to the output dataset. The output dataset uses a data type of GDT_Byte.
+The NoData value of the output dataset is set to 0.
+The output and input datasets are then closed.
+Finally, the script logs "Done!" to indicate that the operation is complete.
+To use this script, you would run it from the command line and provide the path to the input raster image as an argument. The script will apply the Otsu thresholding method to the input image, save the binary result as "otsu.tif," and preserve the geospatial information from the input image.
+This script is useful for image segmentation tasks, such as separating objects from the background in remote sensing and image processing applications.
+Purpose: produce a STAC Catalog with a STAC Item describing the detected water body result.
+This step is highlighted below:
+The `stac` script is a command-line tool for creating a SpatioTemporal Asset Catalog (STAC) catalog containing the detected water bodies.
+It takes the STAC Item URLs and corresponding water body GeoTIFFs as input, creates a STAC Item for each water body, and organizes them into a STAC Catalog.
+The script utilizes the `click`, `pystac`, `rio_stac`, and `loguru` libraries.
+Here's an overview of what the script does:
+It defines a command-line interface using the `click` library. The script expects multiple input STAC Item URLs and corresponding water body GeoTIFFs as arguments.
+The `to_stac` function is the main entry point. It creates a new STAC catalog and iterates through the provided STAC Item URLs and water body GeoTIFFs.
+For each input pair (STAC Item URL and GeoTIFF), it reads the STAC Item, creates a directory with the same name as the item's ID, and copies the water body GeoTIFF into that directory.
+It uses the `rio_stac` library to create a STAC Item for the water body, specifying the source GeoTIFF, input date/time, asset roles, asset href, and more.
+The created STAC Items are added to the STAC catalog.
+After processing all input pairs, it saves the STAC catalog to the root directory as a self-contained catalog (catalog type `pystac.CatalogType.SELF_CONTAINED`).
The script will create a STAC catalog containing the detected water bodies and save it in the current directory.
+The script is executable as a command-line tool; its usage is:
+Usage: app.py [OPTIONS]
+
+ Creates a STAC catalog with the water bodies
+
+Options:
+ --input-item TEXT STAC Item URL [required]
+ --water-body TEXT Water body geotiff [required]
+ --help Show this message and exit.
+
+The full Python code is provided in `water-bodies/command-line-tools/stac/app.py`.
This application takes as input Copernicus Sentinel-2 or USGS Landsat-9 data and detects water bodies by applying the Otsu thresholding technique on the Normalized Difference Water Index (NDWI).
+The NDWI is calculated as NDWI = (Green - NIR) / (Green + NIR).
+Typically, NDWI values of water bodies are larger than 0.2, and built-up features have positive values between 0 and 0.2.
+Vegetation has much smaller NDWI values, which makes distinguishing vegetation from water bodies easier.
+The NDWI values correspond to the following ranges:
+| Range | Description |
+|---|---|
+| 0.2 to 1.0 | Water surface |
+| 0.0 to 0.2 | Flooding, humidity |
+| -0.3 to 0.0 | Moderate drought, non-aqueous surfaces |
+| -1.0 to -0.3 | Drought, non-aqueous surfaces |
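The NDWI interpretation ranges can be expressed as a small lookup helper (a hypothetical convenience function, not part of the application):

```python
def classify_ndwi(value: float) -> str:
    """Map an NDWI value in [-1, 1] to its interpretation range."""
    if value >= 0.2:
        return "Water surface"
    if value >= 0.0:
        return "Flooding, humidity"
    if value >= -0.3:
        return "Moderate drought, non-aqueous surfaces"
    return "Drought, non-aqueous surfaces"


print(classify_ndwi(0.45))  # Water surface
```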
+To ease the separation of water surfaces from non-water surfaces, the Otsu thresholding technique is used.
+In its simplest form, the Otsu algorithm returns a single intensity threshold that separates pixels into two classes, foreground and background. This threshold is determined by minimizing intra-class intensity variance or, equivalently, by maximizing the inter-class variance sigma_b^2(t) = w_0(t) * w_1(t) * (mu_0(t) - mu_1(t))^2, where w_0 and w_1 are the class probabilities and mu_0 and mu_1 the class means at threshold t.
+The application can be used in two modes:
+This scenario is depicted below:
+This scenario is depicted below:
+Alice packages the application as an Application Package that includes a macro workflow which reads the list of Sentinel-2 STAC Item references or Landsat-9 staged data, launches a sub-workflow to detect the water bodies, and creates the STAC catalog:
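Such a macro workflow could be sketched in CWL along these lines. This is a simplified, hypothetical outline: the identifiers, step names, and input names are assumptions, not the actual Application Package.

```yaml
cwlVersion: v1.0
class: Workflow
id: macro-water-bodies                 # illustrative identifier
requirements:
  ScatterFeatureRequirement: {}
  SubworkflowFeatureRequirement: {}
inputs:
  items: string[]                      # STAC Item references or staged data
  aoi: string
  epsg: string
  bands: string[]
outputs:
  stac_catalog:
    type: Directory
    outputSource: step_stac/stac_catalog
steps:
  step_detect:                         # sub-workflow: crop -> norm_diff -> otsu
    run: "#detect_water_body"          # reference within a packed CWL file
    scatter: item                      # one sub-workflow run per STAC Item
    in:
      item: items
      aoi: aoi
      epsg: epsg
      bands: bands
    out: [detected_water_body]
  step_stac:                           # assembles the output STAC catalog
    run: "#stac"
    in:
      item: items
      rasters: step_detect/detected_water_body
    out: [stac_catalog]
```

The `scatter` directive fans the sub-workflow out over the list of input items, and the final step gathers the detected water bodies into a single STAC catalog.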
+The sub-workflow applies the `crop`, `Normalized difference`, and `Otsu threshold` steps:
The development and test dataset is made of two Sentinel-2 acquisitions:
+| Acquisitions | | |
+|---|---|---|
+| Mission | Sentinel-2 | Sentinel-2 |
+| Date | 2021-07-13 | 2022-05-24 |
+| URL | S2B_10TFK_20210713_0_L2A | S2A_10TFK_20220524_0_L2A |
+| Quicklook | | |
And one Landsat-9 acquisition:
+| Acquisition | |
+|---|---|
+| Date | 2023-10-15 |
+| URL | LC09_L2SP_042033_20231015_02_T1 |
+| Quicklook | |
Each `Command Line Tool` step, such as `crop`, `Normalized difference`, `Otsu threshold`, and `Create STAC`, runs a simple Python script in a dedicated container.