A Python reference implementation of the CF data model

Project description

The cfdm Python package is a complete reference implementation of the CF data model for CF-1.11, which identifies the fundamental elements of the CF conventions and shows how they relate to each other, independently of the netCDF encoding.

The central element defined by the CF data model is the field construct, which corresponds to a CF-netCDF data variable with all of its metadata.

A simple example of reading a field construct from a file and inspecting it:

>>> import cfdm
>>> f = cfdm.read('file.nc')
>>> f
[<Field: air_temperature(time(12), latitude(64), longitude(128)) K>]
>>> print(f[0])
Field: air_temperature (ncvar%tas)
----------------------------------
Data            : air_temperature(time(12), latitude(64), longitude(128)) K
Cell methods    : time(12): mean (interval: 1.0 month)
Dimension coords: time(12) = [0450-11-16 00:00:00, ..., 0451-10-16 12:00:00] noleap
                : latitude(64) = [-87.8638, ..., 87.8638] degrees_north
                : longitude(128) = [0.0, ..., 357.1875] degrees_east
                : height(1) = [2.0] m

The cfdm package can

  • read field and domain constructs from netCDF, CDL, and Zarr datasets with a choice of netCDF backends,

  • be fully flexible with respect to dataset storage chunking,

  • create new field and domain constructs in memory (see the sketch after this list),

  • write and append field and domain constructs to netCDF and Zarr v3 datasets on disk,

  • read, write, and manipulate UGRID mesh topologies,

  • read, write, and create coordinates defined by geometry cells,

  • read and write netCDF4 string data-type variables,

  • read, write, and create netCDF and CDL datasets containing hierarchical groups,

  • inspect field and domain constructs,

  • test whether two constructs are the same,

  • modify field and domain construct metadata and data,

  • create subspaces of field and domain constructs,

  • incorporate, and create, metadata stored in external files,

  • read, write, and create data that have been compressed by convention (i.e. ragged or gathered arrays, or coordinate arrays compressed by subsampling), whilst presenting a view of the data in its uncompressed form, and

  • read and write data that have been quantized to eliminate false precision.
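
As an illustration of several of these capabilities, the following minimal sketch, which follows the construct-creation pattern described in the cfdm tutorial, builds a small field construct in memory, writes it to disk, subspaces it by indexing, and tests two constructs for equality. The axis sizes, coordinate values, and the file name 'tas.nc' are purely illustrative:

>>> import numpy as np
>>> import cfdm
>>> # Create an empty field construct and set CF properties on it
>>> tas = cfdm.Field()
>>> tas.set_properties({'standard_name': 'air_temperature', 'units': 'K'})
>>> # Define the domain axes and attach a data array that spans them
>>> axis_lat = tas.set_construct(cfdm.DomainAxis(3))
>>> axis_lon = tas.set_construct(cfdm.DomainAxis(4))
>>> tas.set_data(cfdm.Data(np.arange(12.0).reshape(3, 4)),
...              axes=[axis_lat, axis_lon])
>>> # Add a dimension coordinate construct for the latitude axis
>>> lat = cfdm.DimensionCoordinate(
...     properties={'standard_name': 'latitude', 'units': 'degrees_north'},
...     data=cfdm.Data([-45.0, 0.0, 45.0]))
>>> lat_key = tas.set_construct(lat, axes=axis_lat)
>>> # Write the new field construct to a netCDF dataset on disk
>>> cfdm.write(tas, 'tas.nc')
>>> # Indexing creates a subspace; equals() tests whether two constructs are the same
>>> tas[0:2, :].data.shape
(2, 4)
>>> tas.equals(tas.copy())
True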

Documentation

https://ncas-cms.github.io/cfdm

Dask

From version 1.11.2.0 onwards, the cfdm package uses Dask for all of its data manipulations.
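
In practice, this means that data values are not read into memory when a dataset is opened; they are computed only when actually needed, for example when converted to an in-memory numpy array. A minimal sketch, reusing the 'file.nc' example from above:

>>> import cfdm
>>> f = cfdm.read('file.nc')[0]
>>> # Values are computed on demand, e.g. when a numpy array is requested
>>> f.data.array.shape
(12, 64, 128)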

Tutorial

https://ncas-cms.github.io/cfdm/tutorial

Installation

https://ncas-cms.github.io/cfdm/installation
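
The package is distributed on PyPI, so a typical installation is simply the following (see the link above for requirements and alternative methods):

$ pip install cfdm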

Command line utility

The cfdump command line tool is also installed alongside the package. It generates text descriptions of the field constructs contained in a netCDF dataset.
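
A typical invocation, using the illustrative file name from the examples above:

$ cfdump file.nc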

Source code

This project is hosted in a GitHub repository where you can access the most up-to-date source.

Download files

Source Distribution

cfdm-1.13.0.0.tar.gz (666.9 kB)

File details

Details for the file cfdm-1.13.0.0.tar.gz.

File metadata

  • Download URL: cfdm-1.13.0.0.tar.gz
  • Upload date:
  • Size: 666.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.11

File hashes

Hashes for cfdm-1.13.0.0.tar.gz

  • SHA256: ee33be390e12340f35292ec1c860a74ea19de8bc974eab91046c0c06275ed78d
  • MD5: 5ce0836681d5e04832e352664afb8410
  • BLAKE2b-256: 338d11778f383892d5fd7a4283f2dbd5df49ef30e20ecff35b32f1c412f6859e
