
Tools to derive HUC-cropped DEMs from common DEM datasets

Project description

DEM to basin preprocessing

This preprocessing script takes source 1-meter digital elevation model (DEM) data and splits, crops, buffers, and reprojects it to individual hydrologic basins, each identified by its 12-digit hydrologic unit code (the "HUC12" ID).
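
As a rough illustration of this step (not the script's actual implementation), the crop-buffer-reproject idea looks something like the following sketch using GeoPandas and rasterio; the file names, the HUC12 ID, and the WBD field name are placeholders.

import geopandas as gpd
import rasterio
from rasterio.mask import mask

# Placeholder inputs: the WBD HUC12 boundaries and one source 1-meter DEM tile
huc12s = gpd.read_file('WBD-HUC12s.shp')
dem_path = 'source_dem_tile.tif'

with rasterio.open(dem_path) as src:
    # Select one basin (field name assumed) and buffer its boundary by 500 m
    # in the DEM's projected CRS
    basin = huc12s.loc[huc12s['HUC12'] == '<huc12-id>'].to_crs(src.crs)
    buffered = basin.geometry.buffer(500)

    # Crop (mask) the DEM to the buffered basin boundary
    cropped, transform = mask(src, buffered, crop=True)
    profile = src.profile.copy()
    profile.update(height=cropped.shape[1], width=cropped.shape[2],
                   transform=transform)

# Write the cropped, buffered DEM for this HUC12; reprojection to the study
# area's CRS would follow, e.g. with rasterio.warp.reproject
with rasterio.open('<huc12-id>_dem.tif', 'w', **profile) as dst:
    dst.write(cropped)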

This preprocessing script also produces ancillary data products for each new HUC12 DEM raster describing its sub-basins (i.e. "catchments"), its streams (i.e. "flowlines"), and the roughness of each streambed.

Taken together, these are the major inputs needed to run GeoFlood, which creates short-term flood projections.

Main Python script

The recommended way to run geoflood-preprocessing-1m-mp.py:

python3 geoflood-preprocessing-1m-mp.py \
    --shapefile study_area_polygon.shp \
    --huc12 WBD-HUC12s.shp \
    --nhd NHD_catchments_and_flowlines.gdb/ \
    --raster TNRIS-LIDAR-Datasets/ \
    --availability TNRIS-LIDAR-Dataset_availability.shp \
    --directory HUC12-DEM_outputs/ \
    --restart geoflood-preprocessing-study_area.pickle

Required source data inputs

There are 5 required inputs, corresponding to the first five options in the command above:

  • --shapefile Study area polygon: a shapefile of the study area
  • --huc12 HUC12 boundaries: a shapefile of the WBD HUC12 boundaries
  • --nhd NHD data: a geodatabase containing the NHD MR catchments and flowlines
  • --raster Source DEM data: a directory of source 1-meter DEM datasets, such as the TNRIS lidar datasets
  • --availability DEM availability: a shapefile describing the availability of the source DEM datasets

Optional parameters

  • --directory Output directory: a directory in which to store the outputs, sorted by HUC12
  • --restart Restart file: a Python pickle file from which the preprocessing can be restarted if it is interrupted
  • --overwrite Overwrite flag: optional flag to overwrite all files found in output directory
  • --overwrite_rasters Overwrite rasters flag: optional flag to overwrite just the raster outputs
  • --overwrite_flowlines Overwrite flowlines flag: optional flag to overwrite just the flowline outputs
  • --overwrite_catchments Overwrite catchments flag: optional flag to overwrite just the catchment outputs
  • --overwrite_roughnesses Overwrite roughness table flag: optional flag to overwrite the roughness table
  • --log Log file: a file in which to store the runtime log
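
As a usage illustration, here is a minimal sketch of invoking the script from Python with several of these optional flags; all file and directory names are placeholders taken from the example command above, and preprocessing.log is just an example log file name.

# A minimal sketch of calling the preprocessing script from Python with some
# of the optional flags listed above -- all paths are placeholders.
import subprocess

subprocess.run([
    'python3', 'geoflood-preprocessing-1m-mp.py',
    '--shapefile', 'study_area_polygon.shp',
    '--huc12', 'WBD-HUC12s.shp',
    '--nhd', 'NHD_catchments_and_flowlines.gdb/',
    '--raster', 'TNRIS-LIDAR-Datasets/',
    '--availability', 'TNRIS-LIDAR-Dataset_availability.shp',
    '--directory', 'HUC12-DEM_outputs/',
    '--restart', 'geoflood-preprocessing-study_area.pickle',
    '--overwrite_rasters',
    '--log', 'preprocessing.log',
], check=True)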

Description of outputs

There are 4 outputs per HUC12.

  • Cropped & buffered DEM:
    • buffered 500m
    • cropped to each HUC12 intersecting the study area
    • at least 1m resolution
    • mosaicked with preference for lowest resolution tiles
    • reprojected to the study area's projection
  • Corresponding NHD MR flowlines:
    • subset of the NHD MR flowlines
    • each flowline's median point along the line lies within the HUC12 (see the sketch after this list)
    • reprojected to the study area's projection
  • Corresponding NHD MR catchments:
    • subset of the NHD MR catchments
    • corresponding to the NHD MR flowlines above
    • reprojected to the study area's projection
  • Manning's n roughness table:
    • organized by flowline, keyed by ComID
    • roughness values vary by stream order
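
Here is a rough sketch of the flowline subsetting and the roughness table described above (not the script's actual implementation); the NHD layer name, the ComID and stream-order field names, and the Manning's n values per stream order are all placeholder assumptions.

import geopandas as gpd
import pandas as pd

# Placeholder HUC12 selection (field name assumed)
huc12s = gpd.read_file('WBD-HUC12s.shp')
basin = huc12s.loc[huc12s['HUC12'] == '<huc12-id>']
basin_geometry = basin.geometry.iloc[0]

# NHD MR flowlines (layer and field names assumed)
flowlines = gpd.read_file('NHD_catchments_and_flowlines.gdb/',
                          layer='Flowline').to_crs(basin.crs)

# Keep flowlines whose median point along the line lies within the HUC12
midpoints = flowlines.geometry.interpolate(0.5, normalized=True)
subset = flowlines[midpoints.within(basin_geometry)]

# Manning's n roughness table keyed by ComID, varying by stream order;
# the n values below are illustrative placeholders, not the script's values
n_by_order = {1: 0.06, 2: 0.05, 3: 0.04, 4: 0.035, 5: 0.03, 6: 0.025}
roughness = pd.DataFrame({
    'ComID': subset['COMID'],
    'StreamOrder': subset['StreamOrde'],
    'Roughness': subset['StreamOrde'].map(n_by_order),
})
roughness.to_csv('<huc12-id>_roughness.csv', index=False)
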

An example of these outputs, originally visualized by Prof. David Maidment, is linked under "Example outputs".

Already preprocessed DEMs

Already preprocessed DEMs are now available for the vast majority of Texas's HUC12s if you are a TACC user. You can request a TACC account through the TACC User Portal.

Notes about preprocessed DEMs

  • The DEMs are not provided for any HUC12 that has a gap in 1-meter resolution data.
  • All of the DEMs are reprojected to WGS 84 / UTM zone 14N, even if the HUC12 lies outside of UTM zone 14.

Where to find them

The DEMs are located on Stampede2 at /scratch/projects/tnris/dhl-flood-modelling/TX-HUC12-DEM_outputs.

If you run into trouble

Please submit a ticket if you have trouble accessing this data. You may also contact me directly at @dhardestylewis or dhl@tacc.utexas.edu.

Available preprocessed HUC12s

These HUC12 DEMs, listed under "Available HUC12 DEMs", are available right now on Stampede2.

Confirmed successfully preprocessed HUC12s

These HUC12 DEMs, listed under "Confirmed HUC12 DEMs", have been successfully preprocessed in the past and will soon be available once again on Stampede2. If you need any of these right now, please contact me.

Preprocessing workflow

If you would like an understanding of the preprocessing workflow, I provide a simplified but representative example in this Jupyter notebook. The notebook was presented at the inaugural TACC Institute on Planet Texas 2050 Cyberecosystem Tools in August 2020. Please contact me if you would like a recording.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

dem2basin-0.2.4.tar.gz (38.1 kB, Source)

Built Distribution

dem2basin-0.2.4-py3-none-any.whl (36.8 kB, Python 3)

File details

Details for the file dem2basin-0.2.4.tar.gz.

File metadata

  • Download URL: dem2basin-0.2.4.tar.gz
  • Upload date:
  • Size: 38.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.4 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.1 CPython/3.7.10

File hashes

Hashes for dem2basin-0.2.4.tar.gz:

  • SHA256: 3803bb18aa0f0439f1cd65abfb0ead04a94c013cc181ef083c8bb449a3742fdf
  • MD5: 6e7d4c1067e1a22dd9418840b72dcee1
  • BLAKE2b-256: 8f489d4bf341cde4fc4481637c665fc1ef61d141c871c19d67741b4a05184f32

See more details on using hashes here.
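
For example, one way to verify the downloaded file against the SHA256 digest listed above, using only the Python standard library:

# A minimal sketch of verifying the downloaded sdist against the SHA256
# digest listed above.
import hashlib

expected = '3803bb18aa0f0439f1cd65abfb0ead04a94c013cc181ef083c8bb449a3742fdf'

with open('dem2basin-0.2.4.tar.gz', 'rb') as f:
    digest = hashlib.sha256(f.read()).hexdigest()

assert digest == expected, 'SHA256 mismatch -- do not install this file'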

File details

Details for the file dem2basin-0.2.4-py3-none-any.whl.

File metadata

  • Download URL: dem2basin-0.2.4-py3-none-any.whl
  • Upload date:
  • Size: 36.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.4 pkginfo/1.7.1 requests/2.26.0 requests-toolbelt/0.9.1 tqdm/4.62.1 CPython/3.7.10

File hashes

Hashes for dem2basin-0.2.4-py3-none-any.whl:

  • SHA256: 3e1d8a3cf0b0b774ce1358af14a1dbbabb08a5a9a5352007ebc49f2c0ea80878
  • MD5: c53745fc9886fe9e36b773cbc5781fa5
  • BLAKE2b-256: 9c4abcbcf89650c74c9c6d82de68903fb3e375b314b27aacf0f018b14c3dabca

See more details on using hashes here.
