
New Mexico Water Data Integration Engine

Project description

New Mexico Unified Water Data: Data Integration Engine


This package provides a command-line interface to the New Mexico Water Data Initiative's Data Integration Engine. The tool is used to integrate water data from multiple sources.

Installation

pip install nmuwd

Sources

Data comes from the sources listed below. We are continuously adding new sources as we learn of them and they become available. If you have data that you would like to be part of the Data Integration Engine, please get in touch at newmexicowaterdata@nmt.edu.

Source Inclusion & Exclusion

The Data Integration Engine enables the user to obtain groundwater level and groundwater quality data from a variety of sources. Data from each source is automatically included in the output, if available, unless that source is explicitly excluded. The following flags are available to exclude specific data sources:

  • --no-bernco to exclude Bernalillo County (BernCo) data
  • --no-bor to exclude Bureau of Reclamation (BOR) data
  • --no-nmbgmr-amp to exclude New Mexico Bureau of Geology and Mineral Resources (NMBGMR) Aquifer Mapping Program (AMP) data
  • --no-nmed-dwb to exclude New Mexico Environment Department (NMED) Drinking Water Bureau (DWB) data
  • --no-nmose-isc-seven-rivers to exclude New Mexico Office of State Engineer (NMOSE) Interstate Stream Commission (ISC) Seven Rivers data
  • --no-nmose-roswell to exclude New Mexico Office of State Engineer (NMOSE) Roswell data
  • --no-nwis to exclude USGS NWIS data
  • --no-pvacd to exclude Pecos Valley Artesian Conservancy District (PVACD) data
  • --no-wqp to exclude Water Quality Portal (WQP) data
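
As a sketch of how these exclusion flags compose (this is an illustration only, not the tool's actual implementation), each --no-* flag simply drops one source from the set that is queried:

```python
import argparse

# Illustrative sketch only -- not the actual nmuwd implementation.
# Each documented --no-* flag drops one source from the queried set.
SOURCES = [
    "bernco", "bor", "nmbgmr-amp", "nmed-dwb",
    "nmose-isc-seven-rivers", "nmose-roswell", "nwis", "pvacd", "wqp",
]

def active_sources(argv):
    parser = argparse.ArgumentParser()
    for src in SOURCES:
        # e.g. --no-bernco stores False under the attribute "bernco"
        parser.add_argument(f"--no-{src}", dest=src.replace("-", "_"),
                            action="store_false", default=True)
    args = parser.parse_args(argv)
    return [s for s in SOURCES if getattr(args, s.replace("-", "_"))]

print(active_sources(["--no-wqp", "--no-nwis"]))  # all sources except "nwis" and "wqp"
```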

Water Levels

To obtain groundwater levels, use

weave Waterlevels

followed by the desired output type, source filters, date filters, geographic filters, and excluded data sources.

Water Quality

To obtain groundwater quality, use

weave {analyte}

where {analyte} is the name of the analyte whose data is to be retrieved.

Available Analytes

The following analytes are currently available for retrieval:

  • Arsenic
  • Bicarbonate
  • Calcium
  • Carbonate
  • Chloride
  • Magnesium
  • Nitrate
  • pH
  • Potassium
  • Silica
  • Sodium
  • Sulfate
  • TDS
  • Uranium

Geographic Filters

The following flags can be used to geographically filter data:

--county {county name}
--bbox 'x1 y1, x2 y2'
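
To make the bbox format concrete, here is a small hypothetical helper (not part of the nmuwd package) that parses the documented 'x1 y1, x2 y2' string, assuming x is longitude and y is latitude in decimal degrees:

```python
# Hypothetical helpers illustrating the 'x1 y1, x2 y2' bbox format;
# not part of nmuwd. Assumes x = longitude, y = latitude.
def parse_bbox(bbox: str):
    """Parse 'x1 y1, x2 y2' into ((x1, y1), (x2, y2)) as floats."""
    corners = []
    for corner in bbox.split(","):
        x, y = corner.split()
        corners.append((float(x), float(y)))
    return tuple(corners)

def contains(bbox, lon, lat):
    """True if (lon, lat) falls inside the box (corners given in any order)."""
    (x1, y1), (x2, y2) = parse_bbox(bbox)
    return min(x1, x2) <= lon <= max(x1, x2) and min(y1, y2) <= lat <= max(y1, y2)

# A rough box around central New Mexico
box = "-107.5 34.0, -106.0 35.5"
print(contains(box, -106.65, 35.08))  # → True
```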

Date Filters

The following flags can be used to filter by dates:

--start-date YYYY-MM-DD 
--end-date YYYY-MM-DD

Output

The data is saved to the current working directory. A log of the inputs and processes, called die.log, is also saved to the current working directory. If a subsequent process is run and the log from the previous process has not been moved or stored elsewhere, the log for the subsequent process is appended to the existing log.
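
Because die.log is appended to across runs, one simple pattern (ordinary file handling, not a feature of the tool) is to rotate the previous log aside before starting a new run:

```python
import os
import time

def rotate_log(path="die.log"):
    """Move an existing log aside with a timestamp so the next run starts fresh."""
    if os.path.exists(path):
        stamp = time.strftime("%Y%m%d-%H%M%S")
        os.replace(path, f"{path}.{stamp}")

rotate_log()  # e.g. die.log -> die.log.20250101-120000 (no-op if die.log is absent)
```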

Timeseries Data

The flag --separated_timeseries exports the timeseries for each location to its own file in the directory output_series (e.g. AB-0002.csv, AB-0003.csv).

The flag --unified_timeseries exports all timeseries for all locations in one file titled output.timeseries.csv.

Both timeseries flags also export a file titled output.sites.csv that contains site information, such as latitude, longitude, and elevation.

Table Headers

The table headers for timeseries data are as follows:

output.sites.csv

  • source: the organization/source for the site
  • id: the id of the site. The id is used as the key to join the output.timeseries.csv table
  • name: the colloquial name for the site if it exists
  • latitude: latitude in decimal degrees
  • longitude: the longitude in decimal degrees
  • elevation: ground surface elevation of the site in feet
  • elevation_units: the units of the ground surface elevation. Defaults to ft
  • horizontal_datum: horizontal datum of the latitude and longitude. Defaults to WGS84
  • vertical_datum: the vertical datum of the elevation
  • usgs_site_id: USGS site id if it exists
  • alternate_site_id: alternate site id if it exists
  • formation: geologic formation in which the well terminates if it exists
  • aquifer: aquifer from which the well draws water if it exists
  • well_depth: depth of well if it exists

output.timeseries.csv - waterlevels

  • source: the organization/source for the site
  • id: the id of the site. The id is used as the key to join the output.sites.csv table
  • depth_to_water_ft_below_ground_surface: depth to water below ground surface in ft
  • date_measured: date of measurement in YYYY-MM-DD format
  • time_measured: time of measurement if it exists
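
Since id is the join key between the two files, the tables can be combined with nothing more than the standard library. The rows below are made-up sample data shaped like the documented headers (abridged), not real output:

```python
import csv
import io

# Made-up sample data shaped like the documented headers (abridged).
sites_csv = """source,id,name,latitude,longitude
NMBGMR,AB-0002,Example Well,35.1,-106.6
"""
timeseries_csv = """source,id,depth_to_water_ft_below_ground_surface,date_measured
NMBGMR,AB-0002,120.5,2021-06-01
NMBGMR,AB-0002,122.1,2022-06-01
"""

# Index site rows by id, then attach site info to each measurement.
sites = {row["id"]: row for row in csv.DictReader(io.StringIO(sites_csv))}
joined = [
    {**sites[row["id"]], **row}
    for row in csv.DictReader(io.StringIO(timeseries_csv))
    if row["id"] in sites
]
print(joined[0]["latitude"], joined[0]["depth_to_water_ft_below_ground_surface"])  # → 35.1 120.5
```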

output.timeseries.csv - analytes

  • source: the organization/source for the site
  • id: the id of the site. The id is used as the key to join the output.sites.csv table
  • parameter: the name of the analyte whose measurements are reported in the table. This corresponds to the requested analyte
  • parameter_value: value of the measurement
  • parameter_units: units of the measurement
  • date_measured: date of measurement in YYYY-MM-DD format
  • time_measured: time of measurement if it exists

Summary Data

If neither of the above flags is specified, a summary table called output.csv is exported. The summary table consists of location information as well as summary statistics for the parameter of interest for every location that has observations.

Table Headers: Summary

output.csv - waterlevels and analytes

  • source: the organization/source for the site
  • id: the id of the site. The id is used as the key to join the output.timeseries.csv table
  • location: the colloquial name for the site if it exists
  • usgs_site_id: USGS site id if it exists
  • alternate_site_id: alternate site id if it exists
  • latitude: latitude in decimal degrees
  • longitude: the longitude in decimal degrees
  • horizontal_datum: horizontal datum of the latitude and longitude. Defaults to WGS84
  • elevation: ground surface elevation of the site in feet
  • elevation_units: the units of the ground surface elevation. Defaults to ft
  • well_depth: depth of well if it exists
  • well_depth_units: units of well depth. Defaults to ft
  • parameter: the name of the analyte whose measurements are reported in the table. This corresponds to the requested analyte
  • parameter_value: value of the measurement
  • parameter_units: units of the measurement
  • nrecords: the number of records for the site
  • min: the minimum record for the site
  • max: the maximum record for the site
  • mean: the mean value for the records at the site
  • most_recent_date: date of most recent record
  • most_recent_time: time of most recent record if it exists
  • most_recent_value: the value of the most recent record
  • most_recent_units: the units of the most recent record
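
The summary statistics above can be reproduced from a timeseries by hand. This is a sketch of the idea, not the tool's own code, and the records below are invented sample values:

```python
from statistics import mean

# Invented sample records for one site: (date_measured, value)
records = [
    ("2021-06-01", 120.5),
    ("2022-06-01", 122.1),
    ("2020-06-01", 118.9),
]

values = [v for _, v in records]
most_recent = max(records)  # ISO dates sort lexicographically, so max() works

summary = {
    "nrecords": len(records),
    "min": min(values),
    "max": max(values),
    "mean": round(mean(values), 2),
    "most_recent_date": most_recent[0],
    "most_recent_value": most_recent[1],
}
print(summary)
```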

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

nmuwd-0.2.2.tar.gz (60.8 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

nmuwd-0.2.2-py3-none-any.whl (77.9 kB)

Uploaded Python 3

File details

Details for the file nmuwd-0.2.2.tar.gz.

File metadata

  • Download URL: nmuwd-0.2.2.tar.gz
  • Upload date:
  • Size: 60.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for nmuwd-0.2.2.tar.gz
Algorithm Hash digest
SHA256 bfb96bc7b8a2c316ce10cbd2115c58da0a0d5e4a0c6919c6e41f4e54903204ac
MD5 3900a163f9b28ab95bb081675fdd4bfb
BLAKE2b-256 ae73a9feb0256c8e90eab8622c659b0def8212b8e2f7364ba5dee6e80e170eb3

See more details on using hashes here.

File details

Details for the file nmuwd-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: nmuwd-0.2.2-py3-none-any.whl
  • Upload date:
  • Size: 77.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.8

File hashes

Hashes for nmuwd-0.2.2-py3-none-any.whl
Algorithm Hash digest
SHA256 cbd4cfe924ceb3ba5bccf243dc5cb7ee49699ea70c97d2cc5f81643d9ecbf694
MD5 da01aa0ebcd013516772f9c6ed143926
BLAKE2b-256 da697a14fd7a6f934db1c3b7ba07c52e1bbc3ecf98380bc79b6065523cf3243d

See more details on using hashes here.
