Building a Search API query consists of 3 basic steps:
+The completed URL will be in this format: https://api.daac.asf.alaska.edu/services/search/param?keyword1=value1&keyword2=value2,value3&keyword3=value4-6
Once your query is built, you can execute it by copying and pasting it into a browser window or a command-line interface, or by using a program. More details on the various options, along with some syntax tips, can be found in the Search API Tools documentation.
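For example, a query can also be issued from Python with the requests library. This is a minimal sketch; the keyword values shown are purely illustrative.
import requests

params = {
    'platform': 'Sentinel-1A',
    'intersectsWith': 'POINT(-122.425 37.77)',
    'start': '2016-07-01T00:00:00Z',
    'output': 'csv',
    'maxResults': 10,
}
response = requests.get('https://api.daac.asf.alaska.edu/services/search/param', params=params)
response.raise_for_status()
print(response.text)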
+Downloading Data
+In order to download data, you will need a NASA EOSDIS Earthdata Login account. Earthdata accounts are free. Go to Earthdata Login — Create Profile to create an account.
+You will be prompted to accept the ASF End-User License Agreement and set a Study Area to complete your new user setup.
+Note: A research agreement is required for access to JERS-1 and RADARSAT-1 data. Please complete the required Research Agreement, or contact user support at the email or number below.
+Next Steps
+See Search API Keywords to get started on building a query, or see the Tools page for some examples.
+Alternatively, you may wish to use asf_search, a Python package for performing searches of the ASF catalog. It also offers baseline functionality and download support. Additionally, numerous constants are provided to ease the search process. It is available through PyPi and Conda. More information can be found here.
The new "dataset" keyword is the preferred alternative for platform searches. It allows searching for results from multiple platforms at once. More information can be found here.
+asf_search is a Python package for performing searches of the ASF catalog. In addition, it offers baseline functionality and download support. It is available through PyPi and Conda. More information can be found here.
+In addition to the Search endpoint, we have multiple endpoints available for all of your Search API needs. Below is a brief overview of what's available. More details on these endpoints and how to use them can be found on the Keywords page.
+Baseline Endpoint
+This endpoint can be used to search for baseline data using specific reference scenes.
+WKT Endpoints
+The WKT validation endpoint will validate and repair a WKT input. The GeoSpatial Files to WKT endpoint will accept a POST request with files attached. It will return the parsed WKT from the file, as well as the repaired wrapped and unwrapped WKT.
+Date Parser Endpoint
+This endpoint can be used to check how dates are parsed by the Search API.
+Mission List Endpoint
+This endpoint lists all missions (also known as campaigns or collections) for all datasets.
+Health Endpoint
+This endpoint is used to check the Search API health. It also provides information on CMR health.
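As a quick sketch, the Health endpoint can be checked from Python like so:
import requests

# The health endpoint takes no keywords and returns JSON describing Search API and CMR health
health = requests.get('https://api.daac.asf.alaska.edu/health')
print(health.json())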
+GeoJSON is the preferred Search API output format. You can specify the output format with keyword "output". If you find a required field that is not included in GeoJSON output, please contact ASF using the info below or reach the team directly at uaf-asf-discovery@alaska.edu.
This is a collection of some tips & tricks for the Search API!
+asf_search is a Python package for performing searches of the ASF catalog. In addition, it offers baseline functionality and download support. It is available through PyPi and Conda. More information can be found here.
A rate limit is enforced on the search endpoint. The limit is applied per IP address and is currently 250 queries per minute. Upon hitting the limit, further queries will yield an HTTP 429 response with an error message. Check whether your queries are returning a small number of results. If so, you can refine your parameters to combine result sets into larger groups and then post-process those results locally. For instance, instead of searching a small area of interest with an individual query for each day, select a larger date range to create a single query, then split the results apart after they have been retrieved.
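A rough sketch of that approach in Python is shown below; the startTime property used for grouping is an assumption about the geojson output and may need adjusting.
import requests
from collections import defaultdict

# One larger date-range query instead of a separate query per day (illustrative parameters)
params = {
    'platform': 'Sentinel-1A',
    'intersectsWith': 'POINT(-122.425 37.77)',
    'start': '2016-07-01T00:00:00Z',
    'end': '2016-07-31T23:59:59Z',
    'output': 'geojson',
}
features = requests.get('https://api.daac.asf.alaska.edu/services/search/param', params=params).json()['features']

# Post-process locally: split the single result set apart by acquisition day
by_day = defaultdict(list)
for feature in features:
    by_day[feature['properties']['startTime'][:10]].append(feature)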
Have you completed a geo search in Vertex that you'd like to replicate as a Search API query? Click the down arrow under Max Results and choose "API URL...".
+Here you can see the Search API URL you would use to replicate the search. You may change the maxResults and output format. Once you are satisfied, click the copy icon. Now you can paste the query into a browser or command line interface to execute it.
+The product/file name is listed in Vertex Search Results, under the Files detail column. You can click the Copy icon to copy the File ID. You can also copy all File IDs from your Download Queue in Vertex. Once you have your desired list of files, you can find them via the Search API using the product_list keyword.
+You can turn your search results into a search area. First, export your search results as GeoJSON or KML output format. Next, import your file into Vertex geo search. Vertex will extract the AOI from your file. If desired, you can add filters and can save your search filters or the search itself.
+Would you like to verify that your query has returned the correct number of results? Change your output to "output=count" to verify. If the count does not match, consider narrowing your search by using more keywords, or by using keyword “maxResults” to limit it. You may also try shortening the date range to split your search into a series of smaller searches.
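For example, a count check from Python might look like this (illustrative parameters):
import requests

params = {'platform': 'Sentinel-1A', 'asfframe': 300, 'output': 'count'}
count = requests.get('https://api.daac.asf.alaska.edu/services/search/param', params=params)
print(count.text)  # plain-text count of matching results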
Consider using our new Python package, asf_search. asf_search can be used to perform searches of the ASF catalog, and it offers baseline functionality and download support. Additionally, numerous constants are provided to ease the search process. Currently, we provide constants for platform, instrument, beam mode, flight direction, polarization, and processing level. More information can be found here.
+Keywords are used to find the desired data. Use as many or as few keywords as needed. Available keywords and descriptions are listed below for each Search API endpoint. Keywords are case sensitive.
+Note: Any errors will be returned in JSON format.
+https://api.daac.asf.alaska.edu/services/search/param
+dataset
+platform
+instrument
+absoluteOrbit
+asfframe
+maxBaselinePerp
+minBaselinePerp
+beamMode
+beamSwath
+collectionName
+maxDoppler
+minDoppler
+maxFaradayRotation
+minFaradayRotation
+flightDirection
+flightLine
+frame
+fullBurstID
+granule_list
+groupid
+lookDirection
+maxInsarStackSize
+minInsarStackSize
+offNadirAngle
+operaBurstID
+polarization
+processingLevel
+product_list
+relativeOrbit
+bbox
+intersectsWith
+polygon
If the AOI specified is its own Minimum Bounding Rectangle (MBR) in a Mercator projection, the search results returned will intersect with the AOI in a Mercator projection, regardless of width. This remains the case even if the international dateline is crossed within the AOI.
+In order for an AOI to be considered its own MBR, it must meet the following criteria:
+AOIs that do not fit this criteria will have their points connected along great circles.
+In addition, all AOIs are validated, and then simplified as needed. The process for this is:
+Each of these steps is performed only when necessary to get the AOI to a single outline with fewer than 400 points. Any unnecessary steps are skipped.
+Examples of validation and simplification:
+processingDate
+start
+end
+season
+output
+maxResults
+https://api.daac.asf.alaska.edu/services/search/baseline
+reference
+processingLevel
+output
+maxResults
+https://api.daac.asf.alaska.edu/services/utils/wkt
+This endpoint will validate and repair a WKT input. The repaired WKT output is how the Search API will interpret the provided WKT input. If a WKT cannot be repaired, it will return an error stating the reason. All validations and errors are returned in JSON format.
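A minimal sketch of calling this endpoint from Python is shown below; the wkt parameter name is assumed here, so confirm it before relying on it.
import requests

resp = requests.get(
    'https://api.daac.asf.alaska.edu/services/utils/wkt',
    params={'wkt': 'POLYGON((-91.97 28.78,-88.85 28.78,-88.85 30.31,-91.97 30.31,-91.97 28.78))'},
)
print(resp.json())  # repaired wrapped/unwrapped WKT, or an error explaining why repair failed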
+https://api.daac.asf.alaska.edu/services/utils/files_to_wkt
+This endpoint will accept a POST request with files attached. It will return the parsed WKT from the file, as well as the repaired wrapped and unwrapped WKT. All outputs are returned in JSON format. The preferred file format is geojson, but the Search API will also support other formats, such as shapefile or kml.
+See the Tools page for more details on POST requests.
+https://api.daac.asf.alaska.edu/services/utils/date
+This endpoint can be used to check how dates are parsed by the Search API. All parsed dates are returned in JSON format.
+https://api.daac.asf.alaska.edu/services/utils/mission_list
+This endpoint lists all missions (also known as campaigns or collections) for all datasets. Any of the missions returned in the list may be used as a value for the collectionName keyword in the Search endpoint. The mission list is returned in JSON format.
+https://api.daac.asf.alaska.edu/health
+This endpoint is used to check the Search API health. It is returned in JSON format. There are no keywords associated with the health check endpoint.
+In addition to Search API health, it also returns Search API configs and CMR health status.
Searches may be executed in a variety of ways, depending on your needs. On this page, you will find syntax & character encoding tips, and further information on some of the ways to run Search API queries.
+Syntax tips
+Character Encoding:
space: replace with '%20', or use '+' in keyword values
( : replace with '%28'
) : replace with '%29'
+
For a complete list of URL codes, please see URL Encoding Reference.
+Escaping Characters
+If you are running Search API queries via command line, you may need to escape characters. Escaping a character tells the command line interface to interpret the character literally. Some characters that need to be escaped include spaces and ampersands (&).
+For more information on escaping characters, please see the Bash Scripting Guide. For Windows users, more information can be found here.
+You may use a program to assist you with Search API queries. This section will provide some details on a few of the programs you can use to write & run Search API queries and some example commands for each.
+ +Both Wget and cURL are often installed on Linux systems. cURL is part of the Mac OS, and Wget can be installed. Microsoft Windows OS does not come with either installed, but both can be downloaded. cURL is easier to set up on a Windows machine. aria2 can be installed on Windows, Mac, or Linux systems.
+aria2c can be used to download results from the Search API with a single command. You will need to include your Earthdata username and password, all desired keywords & values, and ensure that output=metalink.
+Aria2 — Linux/Mac Example - Download Known Scene
+ aria2c --http-auth-challenge=true --http-user=CHANGE_ME --http-passwd='CHANGE_ME' "https://api.daac.asf.alaska.edu/services/search/param?granule_list=S1A_EW_GRDM_1SDH_20151003T040339_20151003T040443_007983_00B2A6_DDE4&output=metalink"
+
+Aria2 — Windows Example - Download Known Scene
+ aria2c --check-certificate=false --http-auth-challenge=true --http-user=CHANGE_ME --http-passwd="CHANGE_ME" "https://api.daac.asf.alaska.edu/services/search/param?granule_list=S1A_EW_GRDM_1SDH_20151003T040339_20151003T040443_007983_00B2A6_DDE4&output=metalink"
+
+Aria2 — Download Based on Platform and Time-Range Search
+ aria2c --http-auth-challenge=true --http-user=CHANGE_ME --http-passwd='CHANGE_ME' "https://api.daac.asf.alaska.edu/services/search/param?platform=Sentinel-1A&intersectsWith=point(-122.425 37.77)&start=2016-07-01T00:00:00&output=metalink"
+
+You can store your login credentials in a config file, instead of including them in every download command.
+aria2 - Linux/Mac Example — Create and use a configuration file
+ echo 'http-user=CHANGE_ME' >> aria2.conf
+ echo 'http-passwd=CHANGE_ME' >> aria2.conf
+ chmod 600 aria2.conf
+
+ aria2c --conf-path=aria2.conf --http-auth-challenge=true "https://api.daac.asf.alaska.edu/services/search/param?granule_list=S1A_EW_GRDM_1SDH_20151003T040339_20151003T040443_007983_00B2A6_DDE4&output=metalink"
+
+Additional aria2 options are available in the aria2 manual.
+Refer to the complete documentation on configuration files for aria2.
Once you have the download URL, you can download files individually using Wget. You can find the download URL for your desired results by first requesting one of the csv, json, metalink, or geojson output formats.
+Wget - Linux/Mac Example — Download a file
+ wget -c --http-user=CHANGE_ME --http-password='CHANGE_ME' "https://datapool.asf.alaska.edu/GRD_MD/SA/S1A_EW_GRDM_1SDH_20151003T040339_20151003T040443_007983_00B2A6_DDE4.zip"
+
+Wget - Windows Example — Download a file
+ wget --check-certificate=off -c --http-user=CHANGE_ME --http-password="CHANGE_ME" "https://datapool.asf.alaska.edu/GRD_MD/SA/S1A_EW_GRDM_1SDH_20151003T040339_20151003T040443_007983_00B2A6_DDE4.zip"
+
+ wget -c --http-user=CHANGE_ME --http-password="CHANGE_ME" "https://datapool.asf.alaska.edu/GRD_MD/SA/S1A_EW_GRDM_1SDH_20151003T040339_20151003T040443_007983_00B2A6_DDE4.zip"
+
+You can store your login credentials in a config file, instead of including them in every download command.
+Wget - Linux/Mac Example — Create and use a configuration file
+ echo 'http_user=CHANGE_ME' >> wget.conf
+ echo 'http_password=CHANGE_ME' >> wget.conf
+ chmod 600 wget.conf
+
+ export WGETRC="wget.conf"
+ wget -c "https://datapool.asf.alaska.edu/GRD_MD/SA/S1A_EW_GRDM_1SDH_20151003T040339_20151003T040443_007983_00B2A6_DDE4.zip"
+
+You can also send results to a file on your PC
+Example — query results sent to a metalink file
+ wget -O myfilename.metalink https://api.daac.asf.alaska.edu/services/search/param?intersectsWith=point%28-119.543+37.925%29\&platform=ALOS\&output=metalink
+
+Visualize Example - Mac/Linux
+ wget -O myfilename.kml https://api.daac.asf.alaska.edu/services/search/param?granule_list=ALPSRP074606580,ALPSRP077086550\&output=KML
+
+Download Example - Windows
+ wget -c -O myfilename.metalink https://api.daac.asf.alaska.edu/services/search/param?granule_list=ALPSRP074606580,ALPSRP077086550\&output=METALINK
+
+Additional Wget options are available in the GNU Wget Manual.
+Refer to the complete documentation on configuration files for Wget.
+cURL - Mac/Linux Example
+ curl https://api.daac.asf.alaska.edu/services/search/param?platform=R1\&absoluteOrbit=25234\&output=CSV
+
+cURL - Windows Example
+Note: Copy/pasting quotation marks sometimes causes errors. Delete and re-type the quotes after pasting.
+ curl "https://api.daac.asf.alaska.edu/services/search/param?platform=R1&absoluteOrbit=25234&output=CSV" > myfilename.csv
+
+You can also send results to a file on your PC
+Mac/Linux Example — query results sent to a metalink file
+ curl https://api.daac.asf.alaska.edu/services/search/param?granule_list=ALPSRP074606580,ALPSRP021910740,ALPSRP085800750 >myfilename.metalink
+
+Windows Example — query results sent to a metalink file
+ curl "https://api.daac.asf.alaska.edu/services/search/param?granule_list=ALPSRP074606580,ALPSRP021910740,ALPSRP085800750" > myfilename.metalink
+
+Search Example - Mac/Linux
+ curl https://api.daac.asf.alaska.edu/services/search/param?platform=r1\&asfframe=300\&output=CSV > myfilename.csv
+
+Search Example - Windows
+ curl "https://api.daac.asf.alaska.edu/services/search/param?platform=r1&asfframe=300&output=CSV" > myfilename.csv
+
+Visualize Example - Windows
+ curl "https://api.daac.asf.alaska.edu/services/search/param?granule_list=ALPSRP074606580,ALPSRP077086550&output=KML" >myfilename.kml
+
+Download Example - Windows
+ curl -L "https://api.daac.asf.alaska.edu/services/search/param?granule_list=ALPSRP074606580,ALPSRP077086550&output=METALINK" >myfilename.metalink
+
+Some keywords and endpoints will accept a POST request. The POST examples below are using cURL.
+POST Example - WKT output from file
+ curl -X POST -F 'files=@/path/to/file.geojson' 'https://api.daac.asf.alaska.edu/services/utils/files_to_wkt'
+
+POST Examples - intersectsWith Keyword
+ curl -X POST -F 'intersectsWith=LINESTRING(-97.1191 26.4312,-95.5371 29.1522,-83.7598 29.993,-81.5625 25.4036)' 'https://api.daac.asf.alaska.edu/services/search/param'
+
+You can add additional parameters to your POST request with the -F argument for each desired parameter.
+ curl -X POST -F 'platform=S1' -F 'output=geojson' -F 'maxresults=10' -F 'intersectsWith=POINT(-102.4805 38.7541)' 'https://api.daac.asf.alaska.edu/services/search/param'
+
+For further reading, see POST requests
+You may run the Search API queries directly in a web browser of your choice. Simply copy and paste the query into a web browser. Any errors will be returned in JSON format.
+You will need to use URL encoding for spaces and parentheses. Please refer to the Character Encoding section or see URL Encoding Reference for more details.
If you are troubleshooting Search API queries, consider using asf_search. asf_search is a Python package for performing searches of the ASF catalog. More information can be found here.
+Trouble Area: Query returns HTTP 429 with error message
+Trouble Area: Windows cURL “unrecognized protocol”
+Trouble Area: Download fails with “401 Unauthorized” or “Authorization failed”
+Trouble Area: Can’t authenticate
+Trouble Area: Search API request with ‘+’ in it fails
+Trouble Area: Search API request fails
+Trouble Area: Search API request hangs, fails, or returns an error
+Trouble Area: Search API returns Validation Error
+Trouble Area: Search API query does not return expected number of results
+Trouble Area: Search API query with "product_list" keyword returns no results
+Trouble Area: Selected output format does not include needed fields
+This class describes a single product from the ASF archive. The class provides metadata, as well as several helpful methods for interacting with the product.
properties (dict): Provides product metadata such as Beam Mode, Start Time, etc.
geometry (dict): Describes the product's physical extents as a geojson snippet.
baseline (dict): The product's baseline-related fields, if available in CMR.
umm (dict): The raw umm JSON response from CMR used to populate properties, geometry, baseline, and meta.
meta (dict): The metadata JSON returned from CMR.

ASFProduct.__str__() utilizes this method for serialization via json.dumps()
args: +None
+returns:
+dict
describing the product as a geojson snippet.Downloads this product to the specified path and optional filename.
+args:
+fileType=asf.FileDownloadType.ADDITIONAL_FILES
to download the XML metadata. To download both .tiff and .xml files for bursts, use asf.FileDownloadType.ALL_FILES
burst_results.download(session=session, path="./", fileType=asf.FileDownloadType.ADDITIONAL_FILES)
returns: +None
+Builds a baseline stack using this product as a reference
+args:
+returns:
+ASFSearchResults
representation of the stack, with the addition of baseline values (temporal, perpendicular) attached to each ASFProduct
Builds search options that describe an InSAR stack based on this product. Similar to stack()
but doesn't perform the search, simply returns ASFSearchOptions
which can be inspected or adjusted and then passed to various search functions.
args: +None
+returns:
+ASFSearchOptions
objectDetermines the centroid of a product.
+args: +None
+returns:
+shapely.geometry.point.Point
object describing the centroid of the productReturns a configured RemoteZip object, which allows downloading selected parts of a product's zip archive.
+For more information on how to use remotezip with asf-search, see the Downloading Single Products
section of the example jupyter notebook. For more information on the open-source remotezip package, check out the remotezip project repo.
args:
+session
ASFSession: An authenticated ASFSession object that will be used to download the productreturns:
+remotezip.RemoteZip
object authenticated with the passed ASFSession objectThis class describes a set of search parameters. While it is not required to use this class when constructing a search, it can be useful, as it provides some degree of immediate parameter validation, as well as a convenient way to manipulate and handle search options in general.
+Specific search parameters are handled as object attributes. Attempting to add an attribute that is not supported will raise a KeyError. Attempting to delete an attribute will result in it being set to None. Search parameters can be set via kwargs at object instantion, or directly on an existing object using the normal mechanisms.
+Converting to a dict
will only include search options which have actually been set to a usable value. That is, any options set to None
will be ignored.
ASFSearchOptions does not provide any methods intended for direct use, instead relying on a handful of dunders for the desired behavior. For clarity, these are included below.
+Establishes the various attributes described above and processes any kwargs into them.
+args:
+KeyError
returns: +None
+Sets the attribute named by key
to the specified value
after passing it through an appropriate validator function.
Values of None
are allowed as a way to un-set the attribute. Attempting to set a key
not listed in the above attribute list will raise a KeyError
args:
+returns: +None
Clears the attribute named by item
by way of setting it to None
args:
+returns: +None
+Used when converting the ASFSearchOptions object to more fundamental objects, such as dict
Only includes attributes that are not None
.
args: +None
+yields:
(key, value) pairs for each search option that is not None
+searchOptions
(ASFSearchOptions): The search options used to generate this set of results. May be None
in some cases.searchComplete
(bool): Flag signifying asf_search.search()
sucessfully completed gathering results from CMR. Iterates over each ASFProduct
and downloads them to the specified path.
args:
+fileType=asf.FileDownloadType.ADDITIONAL_FILES
to download the XML metadata. To download both .tiff and .xml files for bursts, use asf.FileDownloadType.ALL_FILES
burst_results.download(session=session, path="./", fileType=asf.FileDownloadType.ADDITIONAL_FILES)
returns: None
+ASFSearchResults.__str__()
utilizes this method for serialization via json.dumps()
args: None
+returns:
+dict
describing the search results as a geojson object.Creates a csv formatted string generator from the results
+args: None
+returns:
+Creates a kml formatted string generator from the results
+args: None
+returns:
+Creates a metalink formatted string generator from the results
+args: None
+returns:
+Use to check if results returned from asf_search.search()
are incomplete (this can happen
+if an error occurs while querying CMR)
args: None
+raises:
+asf_search.exceptions.ASFSearchError
if the results are incompleteThis class extends requests.session
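For example, assuming asf_search has been imported as asf as elsewhere in these docs:
results = asf.search(platform=asf.PLATFORM.SENTINEL1A, maxResults=100)

if not results.searchComplete:
    results.raise_if_incomplete()  # raises ASFSearchError describing the incomplete results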
to provide convenient ASF-specific authorization options. ASFSession
is a subclass of Session
. More information can be found here
Authenticates the session (self) using Earthdata Login username/password credentials.
+args:
+returns:
+Authenticates the session (self) using an Earthdata Login Authorization: Bearer
token.
args:
+returns:
+Authenticates the session (self) using a pre-existing cookiejar.
+args:
+http.cookiejar
compatible objectreturns:
+In addition to covering best practices, this page also contains advanced search techniques and serves as the "philosophy of asf_search".
+Topics covered include:
+This section contains information on result sets, general performance, the different search types available, common filter examples, and count.
+Search results are returned as an ASFSearchResults
object, a sublass of User List
, containing a list of ASFProduct
objects. Each of these classes provides some additional functionality to aid in working with the results and individual products.
+ASFProduct
provides a number of metadata fields, such as:
Geographic coordinates are stored in the geometry attribute:
+results[0].geometry
Other metadata is available through the properties attribute:
+results[0].properties
ASFProduct
objects provides geojson-based serialization, in the form of a geojson feature snippet:
print(results[0])
ASFSearchResults
also supports the following output formats:
When searching for multiple products, it's faster to search for all products at once in a single search rather than running a separate query for each product, which involves multiple HTTP requests.
+import asf_search as asf
+
+granules = ['S1B_IW_GRDH_1SDV_20161124T032008_20161124T032033_003095_005430_9906', 'S1-GUNW-D-R-087-tops-20190301_20190223-161540-20645N_18637N-PP-7a85-v2_0_1', 'ALPSRP111041130']
+
+# THIS IS SLOW AND MAKES MORE NETWORK REQUESTS THAN NECESSARY
batched_results = asf.ASFSearchResults([])
for granule in granules:
    unbatched_response = asf.granule_search(granule_list=[granule])
    batched_results.extend(unbatched_response)

# THIS WILL ALWAYS BE FASTER
fast_results = asf.granule_search(granule_list=granules)
+
+If you need to perform intermediate operations on large results (such as writing metadata to a file or calling some external process on results), use the search_generator()
method to operate on results as they're returned page-by-page (default page size is 250).
import asf_search as asf
+
opts = asf.ASFSearchOptions(dataset=asf.DATASET.SENTINEL1, maxResults=1000)
+
+for page in asf.search_generator(opts=opts):
+ foo(page)
+
+To see details on different search types, see the Searching section.
+Search options can be specified using kwargs, which also allows them to be handled using a dictionary:
+opts = {
+ 'platform': asf.PLATFORM.ALOS,
+ 'start': '2010-01-01T00:00:00Z',
+ 'end': '2010-02-01T23:59:59Z'
+}
+
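The dictionary can then be unpacked into a search call, for example:
results = asf.search(**opts)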
+Below are some common filter examples:
+results = asf.geo_search(
+ intersectsWith='POLYGON((-91.97 28.78,-88.85 28.78,-88.85 30.31,-91.97 30.31,-91.97 28.78))',
+ platform=asf.PLATFORM.UAVSAR,
+ processingLevel=asf.PRODUCT_TYPE.METADATA,
+ maxResults=250)
+
+You may use the search_count()
method to return the count of total results matching the passed search options.
This example returns the current size of the SENTINEL1 catalog:
opts = {'platform': asf.PLATFORM.SENTINEL1}
count = asf.search_count(**opts)
+
+Constants are provided for each dataset. The list of constants can be found here.
+Basic dataset search example:
+sentinel_results = asf.search(dataset=asf.DATASET.SENTINEL1, maxResults=250)
+
+You can view the metadata for your results via the properties dictionary:
+sentinel_results[0].properties
+
+Or you can view the metadata as a geojson formatted dictionary:
+sentinel_results.geojson()
+
+asf_search supports searching for lists of short names by the shortName
keyword.
+The currently available NISAR data that CMR provides lacks searchable additional attributes.
+Therefore, the best way to search for NISAR results is via combinations of shortName
, dataset
, platform
, and granule_list
/product_list
keywords.
NISAR example:
+nisar_gslc_gunw = asf.search(shortName=['NISAR_L2_GSLC_V1', 'NISAR_L2_GUNW_V1'], opts=search_opts, maxResults=250)
+print(nisar_gslc_gunw)
+
+The Opera dataset has both standard products and CalVal (calibration/validation) products available. +Please note that the CalVal products are treated as their own dataset in asf_search. +Both can be found in the constants list.
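A short sketch of searching each one (the OPERA_S1_CALVAL constant name should be confirmed against the constants list for your installed version):
opera_results = asf.search(dataset=asf.DATASET.OPERA_S1, maxResults=250)
opera_calval_results = asf.search(dataset=asf.DATASET.OPERA_S1_CALVAL, maxResults=250)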
+The SLC Burst dataset has both tiff and xml data associated with a single entry in CMR. To access the xml data, +see the section on downloading additional files.
+fullBurstID
, relativeBurstID
, and absoluteBurstID
are SLC Burst specific filters. To
+get a temporal stack of products over a single burst frame, use fullBurstID
, which is shared between
+all bursts over a single frame.
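For example, a temporal stack over a single burst frame might be gathered like this (the fullBurstID value shown is hypothetical):
burst_stack = asf.search(
    dataset=asf.DATASET.SLC_BURST,
    fullBurstID='012_023790_IW1',  # hypothetical frame ID; use one from your own results
    maxResults=250)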
For more information on the constants and keywords available, see the Keywords section.
+This section contains information on granule and product searches, secondary searches, + and other search details.
+granule_search()
and product_search()
are similar.
Granule (also called a scene) searches include all file types for the specified granule, whereas product searches specify one file type.
+Granule searches can be 1:many, whereas a product search will always be 1:1.
Granule search example:
+granule_list = [
+ 'S1B_IW_GRDH_1SDV_20190822T151551_20190822T151616_017700_0214D2_6084',
+ 'S1B_IW_GRDH_1SDV_20190810T151550_20190810T151615_017525_020F5A_2F74',
+ 'S1B_IW_GRDH_1SDV_20190729T151549_20190729T151614_017350_020A0A_C3E2',
+ 'S1B_IW_GRDH_1SDV_20190717T151548_20190717T151613_017175_0204EA_4181',
+ 'S1B_IW_GRDH_1SDV_20190705T151548_20190705T151613_017000_01FFC4_24EC',
+ 'S1B_IW_GRDH_1SDV_20190623T151547_20190623T151612_016825_01FA95_14B9',
+ 'S1B_IW_GRDH_1SDV_20190611T151546_20190611T151611_016650_01F566_D7CE',
+ 'S1B_IW_GRDH_1SDV_20190530T151546_20190530T151611_016475_01F02E_BF97',
+ 'S1B_IW_GRDH_1SDV_20190518T151545_20190518T151610_016300_01EAD8_9308',
+ 'S1B_IW_GRDH_1SDV_20190506T151544_20190506T151609_016125_01E56C_1D67'
+]
+results = asf.granule_search(granule_list)
+print(results)
+
+Product search example:
+product_list = [
+ 'S1A_IW_GRDH_1SDV_20190809T001336_20190809T001401_028485_033839_78A1-GRD_HD',
+ 'S1A_IW_GRDH_1SDV_20150322T000454_20150322T000524_005137_006794_56E3-GRD_HD',
+ 'S1A_IW_GRDH_1SDV_20160121T001256_20160121T001321_009585_00DF26_5B84-GRD_HD',
+ 'S1A_IW_GRDH_1SDV_20151117T000448_20151117T000513_008637_00C455_3DC2-GRD_HD'
+]
+results = asf.product_search(product_list)
+print(results)
+
+granule_search()
and product_search()
do not make use of any other search filters, but will accept kwargs for consistency with other search functions:
results = asf.granule_search(granule_list=granule_list)
+print(f'{len(results)} results found')
+
+It is generally preferred to "collapse" many small queries into fewer large queries. That is, it may be easy and logically reasonable to run a number of small granule_search()
queries via a foreach
loop over each of the items in the original granule list. Please do not do this. It consumes a lot of resources at both ASF and at CMR.
Instead, combine your small queries into a single large query where possible, as shown above, and then post-process the results locally. granule_search()
and product_search()
can support very large lists, and will break them up internally when needed.
When using the frame
keyword with certain platforms/datasets, asf_search will implicitly swap to using the asfframe
keyword instead at search time. The platforms/datasets this affects are:
SENTINEL-1A/B
ALOS
In the query to CMR, this means searching by the FRAME_NUMBER
instead of the CENTER_ESA_FRAME
additional attribute.
+A way to avoid this on searches and use CENTER_ESA_FRAME
with the above platforms/datasets is to use cmr_keywords
:
asf.search(platform=asf.PLATFORM.SENTINEL1, cmr_keywords=[('attribute[]', 'int,CENTER_ESA_FRAME,1001')], maxResults=250)
Once you have identified a result set or a product id, you may wish to build a baseline stack based on those results.
+You may use either the stack()
or stack_from_id()
methods to accomplish this.
stack_from_id()
is provided largely as a convenience: internally, it performs a product_search()
using the provided ID, and then returns the results of that product's stack()
method.
+For this reason, it is recommended that if you have an ASFProduct
object at hand, you use that to build your stack directly, as it removes the need for the additional search action.
+For other cases where you have parameters describing your reference scene but not an ASFProduct
object itself, it is appropriate to use one of the various search features available to obtain an ASFProduct
first.
A basic example using ASFProduct.stack()
:
import asf_search as asf
+
+reference = asf.product_search('S1A_IW_SLC__1SDV_20220215T225119_20220215T225146_041930_04FE2E_9252-SLC')[0]
+
+print(reference.stack())
+
+The results are a standard ASFSearchResults
object containing a list of ASFProduct
objects, each with all the usual functionality.
+There are 2 additional fields in the ASFProduct
objects: temporalBaseline
and perpendicularBaseline
.
+temporalBaseline
describes the temporal offset in days from the reference scene used to build the stack.
+perpendicularBaseline
describes the perpendicular offset in meters from the reference scene used the build the stack.
+The reference scene is included in the stack and will always have a temporal and perpendicular baseline of 0.
asf_search provides 2 major keywords with subtle differences:
+platform
dataset
platform
maps to the platform[]
CMR keyword; values like Sentinel-1A
, UAVSAR
, ALOS
. A limitation of searching by
+platform is that for platforms like Sentinel-1A
there are a lot of Sentinel-1 derived product types (OPERA-S1
, SLC-BURST
).
+For every SLC
product, there are 27 additional OPERA-S1
and SLC-BURST
products, which can lead to homogeneous results depending on your search filters.
The dataset
keyword serves as a solution for this. Each "dataset" is a collection of concept ids generally associated with commonly used datasets.
# At the time of writing will likely contain mostly `OPERA-S1` and/or `SLC-BURST` products
platform_results = asf.search(platform=asf.PLATFORM.SENTINEL1, maxResults=250)
+
+# Will contain everything but `OPERA-S1` and/or `SLC-BURST` products
+dataset_results = asf.search(dataset=asf.DATASET.SENTINEL1, maxResults=250)
+
+# Will contain OPERA-S1 Products
+opera_results = asf.search(dataset=asf.DATASET.OPERA_S1, maxResults=250)
+
+# Will contain SLC-BURST products
+slc_burst_results = asf.search(dataset=asf.DATASET.SLC_BURST, maxResults=250)
+
+asf_search defaults to querying against the production CMR API, cmr.earthdata.nasa.gov
.
+In order to use another CMR host, set the host
keyword with ASFSearchOptions
.
uat_opts = asf.ASFSearchOptions(host='cmr.uat.earthdata.nasa.gov', maxResults=250)
+uat_results = asf.search(opts=uat_opts)
+
+asf_search provides a built in method for searching for campaigns via platform.
+asf.campaigns(platform=asf.PLATFORM.SENTINEL1A)
asf_search aliases the following keywords behind the scenes with corresponding collection concept ids for improved search performance:
+platform
processingLevel
The Alias lists are updated as needed with each release, but if you're not finding expected results, then the alias list may be out of date.
+In order to skip the aliasing step, set the collectionAlias
keyword to false with ASFSearchOptions
opts = asf.ASFSearchOptions(collectionAlias=False, maxResults=250)
+unaliased_results = asf.search(opts=opts)
+
Please note, this will result in slower average search times. If there are any results missing from new datasets, please report it as an issue on GitHub with the concept id and name of the collection missing from the dataset.
+This Jupyter notebook covers the available authentication methods. +Once authenticated, it provides a workflow for downloading search results.
+Using .netrc credentials is the preferred method for authentication.
+This guide will show you how to set up a .netrc file.
+Requests will attempt to get the authentication credentials for the URL’s hostname from your .netrc file.
+The .netrc file overrides raw HTTP authentication headers set with headers=
.
+If credentials for the hostname are found, the request is sent with HTTP Basic Auth.
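A typical .netrc entry for Earthdata Login looks like the following sketch (stored in your home directory, with permissions restricted to your user):
machine urs.earthdata.nasa.gov
    login CHANGE_ME
    password CHANGE_ME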
Below you will find recommendations for advanced search techniques, such as subclassing, authentication, and the preferred method for large searches.
+Sentinel-1 products as well as most Sentinel-1 derived datasets (OPERA-S1, SLC-Burst) have a group id associated with them.
+This means that getting the original source scene, or any product associated with that scene, is as simple as using the groupID
+keyword in a search.
import asf_search as asf
+
+burst_name = 'S1_279916_IW1_20230418T162849_VV_A7E1-BURST'
burst_granule = asf.search(granule_list=[burst_name])[0]
+
+groupID = burst_granule.properties['groupID']
+
+# gets the parent SLC of the burst product
+parent_slc = asf.search(groupID=groupID, processingLevel=asf.PRODUCT_TYPE.SLC)[0]
+
+# gets all other SLC Bursts associated with the same parent SLC
+bursts_in_same_scene = asf.search(groupID=groupID, processingLevel=asf.PRODUCT_TYPE.BURST)
+
+# gets ALL Sentinel-1 products and derived products available for the parent scene
+all_products_for_scene = asf.search(groupID=groupID)
+
+ASFProduct
is the base class for all search result objects.
+There are several subclasses of ASFProduct
that are used for specific platforms and product types with unique properties/functionality.
Key Methods:
+geojson()
download()
stack()
get_stack_opts()
(returns None in ASFProduct
, implemented by ASFStackableProduct
subclass and its subclasses)centroid()
remotezip()
(requires optional dependency to be installed)get_property_paths()
(gets product's keywords and their paths in umm dictionary)translate_product()
(reads properties from umm, populates properties
with associated keyword)get_sort_keys()
umm_get()
Key Properties:
+properties
_base_properties
(what get_property_paths()
uses to find values in umm JSON properties
)umm
(the product's umm JSON from CMR)metadata
(the product's metadata JSON from CMR)ASFStackableProduct
is an important ASFProduct
subclass, from which stackable product types meant for time series analysis are derived.
+ASFStackableProduct
has a class enum, BaselineCalcType
, that determines how perpendicular stack calculations are handled.
+Each subclass keeps track of their baseline calculation type via the baseline_type
property.
Inherits: ASFProduct
Inherited By: ALOSProduct
; ERSProduct
; JERSProduct
; RADARSATProduct
; S1Product
;
+S1BurstProduct
; OPERAS1Product
, ARIAS1GUNWProduct
Key Methods:
+get_baseline_calc_properties()
get_stack_opts()
(overrides ASFproduct
)is_valid_reference()
get_default_baseline_product_type()
Key Definitions for class enum BaselineCalcType
:
PRE_CALCULATED
: has pre-calculated insarBaseline
value that will be used for perpendicular calculationsCALCULATED
: uses position/velocity state vectors and ascending node time for perpendicular calculationsKey Fields:
+baseline
baseline_type
(BaselineCalcType.PRE_CALCULATED
by default or BaselineCalcType.CALCULATED
)Because ASFProduct
is built for subclassing, that means you can provide your own custom subclasses derived directly from ASFProduct
or even from a pre-existing subclass like S1Product
or OperaS1Product
.
For more information on subclassing, see the Jupyter notebook.
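As a minimal sketch (not part of asf_search itself), a custom subclass can simply layer convenience behavior on top of an existing product class:
import asf_search as asf

class MyS1Product(asf.S1Product):
    """Hypothetical subclass adding a convenience helper on top of S1Product."""

    def scene_summary(self) -> str:
        # properties is populated by the parent class from the CMR umm record
        return f"{self.properties['sceneName']} (path {self.properties['pathNumber']})"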
+Downloading data, and accessing some data, requires an authenticated session with Earthdata Login.
+To simplify this workflow, the ASFSession
class is provided.
auth_with_creds()
+auth_with_token()
+auth_with_cookiejar()
+
+Creating an authenticated session example:
+from getpass import getpass
+session = asf.ASFSession()
+session.auth_with_creds(input('EDL Username'), getpass('EDL Password'))
+
+The ASFSearchOptions
class is provided for storing and validating search parameters.
+Creating an ASFSearchOptions
object is required to pass our authenticated session to search()
.
search_opts = asf.ASFSearchOptions(
+dataset=asf.DATASET.NISAR,
+session=session)
+
+nisar_response = asf.search(opts=search_opts, maxResults=250)
+
+The recommended way to perform large, long-running searches is to use search_generator()
to yield CMR results page by page.
+This allows you to stream results to a file in the event CMR times out.
+Different output formats can be used.
Note that asf_search queries CMR with page sizes of 250, so setting maxResults=500 means asf_search will have to query CMR twice, each time returning 250 products:
+large_results_generator = asf.search_generator(maxResults=500, platform=asf.PLATFORM.SENTINEL1A)
with open("search_results.metalink", "w") as f:
+ f.writelines(asf.export.results_to_metalink(large_results_generator))
+
+Another usage example:
+import asf_search as asf
+opts = asf.ASFSearchOptions(shortName='ARIA_S1_GUNW')
+urs = []
+for page in asf.search_generator(opts=opts):
+ urs.extend(product.properties['fileID'] for product in page)
+ print(len(urs))
+
+Some product types, such as SLC Bursts or Opera-S1 products, have several files that can be downloaded.
+We can specify which files to download by setting the fileType
and using the FileDownloadType
enum class.
Additional files are stored in this array:
+product.properties['additionalUrls']
+
+To download only the additional files:
+FileDownloadType.ADDITIONAL_FILES # everything in 'additionalUrls'
+
+To download the default file:
+FileDownloadType.DEFAULT_FILE # The default data file, 'url'
+
+To download both:
+FileDownloadType.ALL_FILES # all of the above
+
+This example will download all additional files under the additionalUrls
attribute:
cslc_results[0].download(session=session, path = './', fileType=asf.FileDownloadType.ADDITIONAL_FILES)
+
+To be more specific, we can use the download_urls()
or download_url()
methods
print(f"Additional urls: {opera_results[0].properties['additionalUrls']}")
+
+url = opera_results[0].properties['additionalUrls'][0]
+fileName = url.split('/')[-1]
+
+asf.download_url(url, session=session, path ='./', filename=fileName)
+
+Some product types (Sentinel-1, BURST, OPERA, NISAR) have s3 direct access URIs available. They are accessible under the s3Urls
properties key:
ASFProduct.properties['s3Urls']
.
You can also search for granules using readable_granule_name
via pattern matching.
To do this, you can pass the CMR search keyword config directly with the cmr_keywords
search parameter.
+This allows you to pass any valid CMR keyword-value pair that isn't covered by asf_search directly, as well as configure existing parameter behavior.
More info on pattern matching and parameter options can be found here.
+Example:
gslc_results = asf.search(granule_list=['*046_009_A_095*'], cmr_keywords=[('options[readable_granule_name][pattern]', 'true')], opts=search_opts)
+
+for product in gslc_results:
+ print(product.properties['fileID'])
+
+
+
+
+
+
+
+
+
+ asf_search is a Python package for performing searches of the ASF catalog. In addition, it offers baseline functionality and download support. It is available through PyPi and Conda.
+import asf_search as asf
+
+results = asf.granule_search(['ALPSRS279162400', 'ALPSRS279162200'])
+print(results)
+
+wkt = 'POLYGON((-135.7 58.2,-136.6 58.1,-135.8 56.9,-134.6 56.1,-134.9 58.0,-135.7 58.2))'
+results = asf.geo_search(platform=[asf.PLATFORM.SENTINEL1], intersectsWith=wkt, maxResults=10)
+print(results)
+
+For an introductory walkthrough of asf_search, see the Jupyter Notebooks.
+In order to easily manage dependencies, we recommend using dedicated project environments via Anaconda/Miniconda or Python virtual environments.
+asf_search can be installed into a conda environment with
+conda install -c conda-forge asf_search
+
+or into a virtual environment with
+python -m pip install asf_search
+
+Programmatically searching for ASF data is made simple with asf_search. Several search functions are provided. Each search function returns an ASFSearchResults
object:
geo_search()
Find product info over an area of interest using a WKT stringgranule_search()
Find product info using a list of scene namesproduct_search()
Find product info using a list of product IDsstack_from_id()
Find a baseline stack of products using a reference scenesearch()
supports all available keywords:search()
Find product info using any combination of search parametersAdditionally, asf_search supports downloading data, both from search results as provided by the above search functions, and directly on product URLs. An authenticated session is generally required. More information on available authentication methods can be found here. You may also authenticate using an ASFSession
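For example, search() accepts any combination of keywords:
import asf_search as asf

results = asf.search(
    platform=asf.PLATFORM.SENTINEL1A,
    processingLevel=asf.PRODUCT_TYPE.SLC,
    start='2022-01-01T00:00:00Z',
    end='2022-02-01T00:00:00Z',
    maxResults=50)
print(results)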
object and one of the following authentication methods. ASFSession
is a subclass of Session
.
auth_with_creds('user', 'pass)
auth_with_token('EDL token')
auth_with_cookiejar(http.cookiejar)
If not using .netrc credentials, that session should be passed to whichever download method is being called, can be re-used, and is thread safe.
+Example using .netrc:
+results = ....
+results.download(path='....')
+
+Example with manual authentication:
+results = asf_search.granule_search([...])
+session = asf_search.ASFSession().auth_with_creds('user', 'pass')
+results.download(path='/Users/SARGuru/data', session=session)
+
+Alternately, asf_search supports downloading an arbitrary list of URLs. All of the available authentication methods are supported:
+urls = [...]
+asf_search.download_urls(urls=urls, path='/Users/SARGuru/data', session=ASFSession().auth_with_token('EDL token'))
+
+Also note that ASFSearchResults.download()
and the generic download_urls()
function both accept a processes
parameter which allows for parallel downloads.
Further examples of all of the above can be found in this sample script.
asf_search supports downloading data, both from search results as provided by the search functions, and directly on product URLs. An authenticated session is generally required. asf_search uses Requests
. Using .netrc credentials is the preferred method for authentication. More information on .netrc authentication can be found here.
Example using .netrc:
+results = ....
+results.download(path='....')
+
+If not using .netrc credentials, you may authenticate using an ASFSession
object and one of the following authentication methods. ASFSession
is a subclass of Session
. The session should be passed to whichever download method is being called, can be re-used, and is thread safe.
auth_with_creds('user', 'pass)
auth_with_token('EDL token')
auth_with_cookiejar(http.cookiejar)
Example with manual authentication:
+results = asf_search.granule_search([...])
+session = asf_search.ASFSession().auth_with_creds('user', 'pass')
+results.download(path='/Users/SARGuru/data', session=session)
+
+asf_search also supports downloading an arbitrary list of URLs. All of the available authentication methods are supported:
+urls = [...]
+asf_search.download_urls(urls=urls, path='/Users/SARGuru/data', session=ASFSession().auth_with_token('EDL token'))
+
+Also note that ASFSearchResults.download()
and the generic download_urls()
function both accept a processes
parameter which allows for parallel downloads.
Downloads all products from the specified URLs to the specified location.
+args
+Downloads a product from the specified URL to the specified location and (optional) filename.
+args
+Configures and returns an authenticated remotezip.RemoteZip
object, allowing downloading of
+specific files from a given zip archive without downloading the entire archive.
args
+ASFSession
that RemoteZip will use to download from the zip productreturns:
+remotezip.RemoteZip
object authenticated with the passed ASFSession objectasf_search provides multiple export formats, in addition to the default asf_search format. Available formats are: geojson, csv, metalink, kml, jsonlite, jsonlite2.
+Examples:
+results = ....
+with open("search_results.csv", "w") as f:
+ f.writelines(results.csv())
+
+results = ....
+with open("search_results_jsonlite.json", "w") as f:
+ f.writelines(results.jsonlite())
+
+
+
+
+
+
+
+
+
+ ASFError(Exception):
+ASFSearchError(ASFError):
+ASFSearch4xxError(ASFSearchError):
+ASFSearch5xxError(ASFSearchError):
+ASFServerError(ASFSearchError):
+ASFBaselineError(ASFSearchError):
+ASFDownloadError(ASFError):
+ASFAuthenticationError(ASFError):
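A brief sketch of catching these in practice:
import asf_search as asf
from asf_search.exceptions import ASFSearchError

try:
    results = asf.search(platform=asf.PLATFORM.SENTINEL1A, maxResults=10)
    results.raise_if_incomplete()
except ASFSearchError as error:
    print(f'Search failed or returned incomplete results: {error}')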
+Each search function returns an ASFSearchResults
object:
geo_search()
Find product info over an area of interest using a WKT stringgranule_search()
Find product info using a list of scene namesproduct_search()
Find product info using a list of product IDsstack_from_id()
Find a baseline stack of products using a reference scene IDsearch()
supports all available keywords:search()
Find product info using any combination of search parameters. See the keywords list below.
+For more advanced usage, see sections ASFSearchResults class and ASFProduct class.
+Keywords are used to find the desired data. Use as many or as few keywords as needed. Available keywords and descriptions are listed below. Additionally, numerous constants are provided to ease the search process. Currently, we provide constants for beam mode, flight direction, instrument, platform, polarization, and product type. You can see the full list of constants here.
+dataset
+help(asf_search.constants.DATASET)
platform
+help(asf_search.constants.PLATFORM)
instrument
+help(asf_search.constants.INSTRUMENT)
absoluteBurstID
+absoluteOrbit
+asfFrame
+beamMode
+help(asf_search.constants.BEAMMODE)
beamSwath
+campaign
+asf_search.campaigns()
function. You must provide the desired platform.asf_search.campaigns(asf_search.PLATFORM.UAVSAR)
maxDoppler
+minDoppler
+maxFaradayRotation
+minFaradayRotation
+flightDirection
+help(asf_search.constants.FLIGHT_DIRECTION)
flightLine
+frame
+fullBurstID
+groupID
+lookDirection
+offNadirAngle
+operaBurstID
+polarization
+help(asf_search.constants.POLARIZATION)
processingLevel
+help(asf_search.constants.PRODUCT_TYPE)
relativeBurstID
+relativeOrbit
If the AOI specified is its own Minimum Bounding Rectangle (MBR) in a Mercator projection, the search results returned will intersect with the AOI in a Mercator projection, regardless of width. This remains the case even if the international dateline is crossed within the AOI.
+In order for an AOI to be considered its own MBR, it must meet the following criteria:
+AOIs that do not fit this criteria will have their points connected along great circles.
+In addition, all AOIs are validated, and then simplified as needed. The process for this is:
+Each of these steps is performed only when necessary to get the AOI to a single outline with fewer than 400 points. Any unnecessary steps are skipped.
+Examples of validation and simplification:
+processingDate
+start
+end
+season
+{"use strict";/*!
+ * escape-html
+ * Copyright(c) 2012-2013 TJ Holowaychuk
+ * Copyright(c) 2015 Andreas Lubbe
+ * Copyright(c) 2015 Tiancheng "Timothy" Gu
+ * MIT Licensed
+ */var _a=/["'&<>]/;Pn.exports=Aa;function Aa(e){var t=""+e,r=_a.exec(t);if(!r)return t;var o,n="",i=0,s=0;for(i=r.index;i