
High-level Python interface to Micaps MDFS data

Project description

A high-level, easy-to-use online reader package for Micaps MDFS data.

Its main features include:

  1. An online client to read data from a GDS server

  2. Reading Micaps diamond files (with write support) and Micaps 4 grid/station files

  3. Reading satellite product data files (AWX)

  4. Reading weather radar mosaic product files (.LATLON)

  5. Filtering station data and clipping grid data

  6. Major data structures are pandas.DataFrame and xarray.DataArray
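Because grid fields are returned as xarray.DataArray objects, clipping to an extent is ordinary xarray indexing. A minimal sketch on synthetic data (the grid, resolution, and names below are illustrative, not actual pymdfs output):

```python
import numpy as np
import xarray as xr

# Synthetic 0.25-degree grid standing in for a field fetched from GDS
lat = np.arange(0, 60.25, 0.25)
lon = np.arange(70, 140.25, 0.25)
field = xr.DataArray(
    np.random.rand(lat.size, lon.size),
    coords={"lat": lat, "lon": lon},
    dims=("lat", "lon"),
    name="RH",
)

# Clip to a lat/lon window, using the same slice semantics MdfsClient.sel takes
clipped = field.sel(lat=slice(20, 40), lon=slice(110, 130))
print(clipped.sizes)  # lat and lon reduced to the requested window
```

The same `slice(...)` objects passed here are what `MdfsClient.sel` accepts for its `lat` and `lon` arguments.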

README

Install

Install with pip:

pip install pymdfs

Quick Start

Read data from Micaps GDS server

The most useful class in pymdfs is MdfsClient; use it to fetch data from a GDS server and clip it to a longitude/latitude extent.

Key Points

  • The first argument of MdfsClient is the GDS server address and port.

  • MdfsClient.sel is the frontend interface for fetching data from GDS; it takes several arguments:

    • datasource, the top-level directory name on the GDS server

    • inittime, the model initial datetime or the observation datetime

    • fh, the model forecast hour; only valid for model data

    • varname, the variable name, with middle directories joined by /

    • level, the model pressure level; only valid for model data

    • lat, slice extent for latitude

    • lon, slice extent for longitude

    • wildcard, a file name wildcard; lookup is faster when it is given

The following example fetches the 0.125x0.125 ECMWF forecast relative humidity field at 850 hPa, initialized at 2023-02-20 20:00 (BT) with a lead time of 24 hours:

from datetime import datetime
from pymdfs import MdfsClient

gds = MdfsClient('xxx.xxx.xxx.xxx:xxxx')
dar = gds.sel('ECMWF_HR', datetime(2023, 2, 20, 20), fh=24, varname='RH',
              level=850, lat=slice(20, 40), lon=slice(110, 130))
print(dar)

The following example fetches 24-hour accumulated precipitation observations from all surface stations:

from datetime import datetime
from pymdfs import MdfsClient

gds = MdfsClient('xxx.xxx.xxx.xxx:xxxx')
df = gds.sel('SURFACE', datetime(2023, 2, 20, 20), varname='RAIN24_ALL_STATION',
             lat=slice(20, 40), lon=slice(110, 130))
print(df)
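Station queries come back as a pandas.DataFrame, so further filtering is standard pandas. A sketch on made-up records (the column names and station values here are illustrative; the real columns depend on the datasource):

```python
import pandas as pd

# Synthetic station records standing in for a SURFACE query result
df = pd.DataFrame({
    "station_id": [54511, 58362, 59287],
    "lat": [39.8, 31.4, 23.2],
    "lon": [116.5, 121.5, 113.3],
    "rain24": [0.0, 12.5, 3.1],
})

# Keep stations inside the extent that reported measurable rain
mask = df["lat"].between(20, 40) & df["lon"].between(110, 130) & (df["rain24"] > 0)
print(df[mask])
```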

Command-line tools

1. client_query

usage:

mdfs_query [-h] [-s SERVER] [-o LOGLEVEL] datasource

MDFS Data Query

positional arguments:

| argument | description |
| --- | --- |
| datasource | data source name |

optional arguments:

| argument | description |
| --- | --- |
| -h, --help | show this help message and exit |
| -s SERVER, --server SERVER | GDS server address |
| -o LOGLEVEL, --loglevel LOGLEVEL | log level: 10, 20, 30, 40, 50 |

Example:

client_query ECMWF_HR

2. client_dump

usage:

mdfs_dump [-h] [-f FH] [-e OUTFILE] [-c COMPLEVEL] [-v VARNAME] [-x LON] [-y LAT] [-p LEVEL] [-t OFFSET_INITTIME] [--name_map NAME_MAP] [-s SERVER] [-o LOGLEVEL] datasource inittime

MDFS Data Dumper

positional arguments:

| argument | description |
| --- | --- |
| datasource | data source name |
| inittime | model initial datetime or observation datetime |

optional arguments:

| argument | description |
| --- | --- |
| -h, --help | show this help message and exit |
| -f FH, --fh FH | model forecast hour |
| -e OUTFILE, --outfile OUTFILE | output netCDF file name |
| -c COMPLEVEL, --complevel COMPLEVEL | output netCDF4 compression level |
| -v VARNAME, --varname VARNAME | model variable names |
| -x LON, --lon LON | longitude point or range |
| -y LAT, --lat LAT | latitude point or range |
| -p LEVEL, --level LEVEL | pressure level point or range |
| -t OFFSET_INITTIME, --offset-inittime OFFSET_INITTIME | offset inittime (hours) applied to the variable |
| --name_map NAME_MAP | map variable names to new names |
| -s SERVER, --server SERVER | GDS server address |
| -o LOGLEVEL, --loglevel LOGLEVEL | logger level (number) |

Example:

client_dump ECMWF_HR 2023021920 -f 24 --level 500 -v RH,UGRD,VGRD,TMP,HGT -e ECMWF_HR.2023021920.nc
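The inittime positional argument uses the compact YYYYMMDDHH form (2023021920 above). In Python it round-trips with datetime.strptime/strftime, which is handy when scripting these tools:

```python
from datetime import datetime

# Parse the compact YYYYMMDDHH inittime used on the command line
inittime = datetime.strptime("2023021920", "%Y%m%d%H")

# Format it back, e.g. when building output names like ECMWF_HR.2023021920.nc
stamp = inittime.strftime("%Y%m%d%H")
print(inittime, stamp)
```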

For more details and features, see the docs hosted at Read the Docs.
