
Project description

asf_search

Join the chat at https://gitter.im/ASFDiscovery/asf_search

Python wrapper for the ASF SearchAPI

import asf_search as asf

results = asf.granule_search(['ALPSRS279162400', 'ALPSRS279162200'])
print(results)

wkt = 'POLYGON((-135.7 58.2,-136.6 58.1,-135.8 56.9,-134.6 56.1,-134.9 58.0,-135.7 58.2))'
results = asf.geo_search(platform=[asf.PLATFORM.SENTINEL1], intersectsWith=wkt, maxResults=10)
print(results)

Install

In order to easily manage dependencies, we recommend using dedicated project environments via Anaconda/Miniconda or Python virtual environments.

asf_search can be installed into a conda environment with

conda install -c conda-forge asf_search

or into a virtual environment with

python3 -m pip install asf_search

To install the pytest/coverage packages used for testing, along with the minimal packages:

python3 -m pip install asf_search[test]
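
As a quick sanity check after installing, you can confirm the package imports and print the installed version using only the standard library (no asf_search-specific attributes are assumed):

python3 -c "import asf_search, importlib.metadata; print(importlib.metadata.version('asf_search'))"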

Usage

Full documentation is available at https://docs.asf.alaska.edu/asf_search/basics/

Programmatically searching for ASF data is made simple with asf_search. Several search functions are provided:

  • geo_search() Find product info over an area of interest using a WKT string
  • granule_search() Find product info using a list of scenes
  • product_search() Find product info using a list of products
  • search() Find product info using any combination of search parameters (see the example below)
  • stack() Find a baseline stack of products using a reference scene
  • Additionally, numerous constants are provided to ease the search process
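
For example, search() accepts the same keyword arguments the dedicated functions use, so a combined query might look like the following sketch (the constant and keywords mirror the geo_search() example above; the values are purely illustrative):

import asf_search as asf

# Combine a platform constant, an area of interest, and a result cap in one search() call
wkt = 'POLYGON((-135.7 58.2,-136.6 58.1,-135.8 56.9,-134.6 56.1,-134.9 58.0,-135.7 58.2))'
results = asf.search(platform=[asf.PLATFORM.SENTINEL1], intersectsWith=wkt, maxResults=10)
print(results)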

Additionally, asf_search supports downloading data, both from search results provided by the above search functions and directly from product URLs. An authenticated session is generally required. This is provided by the ASFSession class and one of its three authentication methods:

  • auth_with_creds('user', 'pass')
  • auth_with_token('EDL token')
  • auth_with_cookiejar(http.cookiejar)

That session should be passed to whichever download method is being called, can be re-used, and is thread safe. Examples:

results = asf_search.granule_search([...])
session = asf_search.ASFSession()
session.auth_with_creds('user', 'pass')
results.download(path='/Users/SARGuru/data', session=session)

Alternatively, downloading a list of URLs contained in urls and creating the session inline:

urls = [...]
asf_search.download_urls(urls=urls, path='/Users/SARGuru/data', session=asf_search.ASFSession().auth_with_token('EDL token'))

Also note that ASFSearchResults.download() and the generic download_urls() function both accept a processes parameter which allows for parallel downloads.
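
For example, re-using the results, session, and urls variables from the examples above, a parallel download might look like this (the worker count of 4 is illustrative):

# Download search results with four parallel worker processes
results.download(path='/Users/SARGuru/data', session=session, processes=4)

# The same parameter works for the generic URL-based download
asf_search.download_urls(urls=urls, path='/Users/SARGuru/data', session=session, processes=4)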

Further examples of all of the above can be found in examples/

Development

Branching

Instance          Branch     Description, Instructions, Notes
Stable            stable     Accepts merges from Working and Hotfixes
Working           master     Accepts merges from Features/Issues and Hotfixes
Features/Issues   topic-*    Always branch off HEAD of Working
Hotfix            hotfix-*   Always branch off Stable

For an extended description of our workflow, see https://gist.github.com/digitaljhelms/4287848

Enable Logging

We use the standard logging module in our package for output.

Here's a basic example of hooking into it from your application:

import asf_search as asf
import logging
ASF_LOGGER = logging.getLogger("asf_search")
formatter = logging.Formatter('[ %(asctime)s (%(name)s) %(filename)s:%(lineno)d ] %(levelname)s - %(message)s')

# Get output to the console:
stream_handle = logging.StreamHandler()
stream_handle.setFormatter(formatter)
ASF_LOGGER.addHandler(stream_handle)
# If you also want it to write to a file:
file_handle = logging.FileHandler('MyCustomApp.log')
file_handle.setFormatter(formatter)
ASF_LOGGER.addHandler(file_handle)
# Only see messages that might affect you
ASF_LOGGER.setLevel(logging.WARNING)
# Test the logger: emit an error and confirm you see it as expected:
ASF_LOGGER.error("This is only a drill. Please do not panic.")
# Should output this:
# [ 2023-01-17 10:04:53,780 (asf_search) main.py:42 ] ERROR - This is only a drill. Please do not panic.

For more configuration options for logging, please visit the Python logging HOWTO page.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

asf_search-8.0.1.tar.gz (811.8 kB)

Uploaded Source

Built Distribution

asf_search-8.0.1-py3-none-any.whl (96.9 kB)

Uploaded Python 3

File details

Details for the file asf_search-8.0.1.tar.gz.

File metadata

  • Download URL: asf_search-8.0.1.tar.gz
  • Upload date:
  • Size: 811.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.19

File hashes

Hashes for asf_search-8.0.1.tar.gz
Algorithm Hash digest
SHA256 45128f8b12a7b2b67dee9dffcd043f9582e76462dce0f5f095156501fd568596
MD5 1a9b1b62c14183188e15ce93005fbec9
BLAKE2b-256 234a69ccc2188fc0a7915c609c871e1ccbd15702eb8b38f916dfe119cdf1f382

See more details on using hashes here.

File details

Details for the file asf_search-8.0.1-py3-none-any.whl.

File metadata

  • Download URL: asf_search-8.0.1-py3-none-any.whl
  • Upload date:
  • Size: 96.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.19

File hashes

Hashes for asf_search-8.0.1-py3-none-any.whl
Algorithm Hash digest
SHA256 18c6263498df292de5cb63be097f9b536912f32c7136b7d17fe48aae6064bc44
MD5 4c16ba8643ed491958c6237ee7287271
BLAKE2b-256 8fae1d372274476cc133128b9cf22f9e7fb545708798633912f9a13e215440d7

See more details on using hashes here.
