
TokaMap Validator

A Python validator tool for TokaMap experimental data mapping configurations. This package validates TokaMap mapping files against JSON schemas to ensure data consistency and correctness.

Overview

TokaMap is a JSON schema-based framework for mapping and structuring experimental data. The TokaMap Validator ensures that your mapping configurations conform to the TokaMap schema specifications.

Features

  • ✅ Validates TokaMap configuration files (mappings.cfg.json)
  • ✅ Validates global settings files (globals.json)
  • ✅ Validates mapping files (mappings.json)
  • ✅ Supports partitioned mapping structures
  • ✅ Provides detailed error messages for schema violations
  • ✅ Command-line interface with verbose output option

Installation

From PyPI

pip install tokamap-validator

From Source

git clone https://github.com/ukaea/tokamap.git
cd tokamap/tokamap_validator
pip install .

Requirements

  • Python >= 3.13
  • jsonschema >= 4.25.0

Usage

Command Line

Validate a TokaMap mapping directory:

tokamap-validator /path/to/mapping/directory

For verbose output with detailed validation progress:

tokamap-validator -v /path/to/mapping/directory

Check the installed version:

tokamap-validator --version

Python API

You can also use the validator programmatically:

from tokamap_validator.validate import Validator

# Create a validator with a schema file
validator = Validator('/path/to/schema.json')

# Validate a JSON file
validator.validate('/path/to/file.json')

TokaMap Directory Structure

A typical TokaMap project follows this structure:

mapping_root/
├── mappings.cfg.json          # Main configuration file
├── globals.json               # Global settings
├── mapping_group_1/
│   ├── globals.json          # Group-specific globals
│   └── mappings.json         # Group mappings
└── mapping_group_2/
    ├── globals.json
    └── mappings.json
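For testing, the layout above can be scaffolded with the standard library. A minimal sketch (the file contents here are empty placeholders, not valid TokaMap mappings):

```python
import json
import tempfile
from pathlib import Path

# Scaffold the expected TokaMap layout in a temporary directory
root = Path(tempfile.mkdtemp()) / "mapping_root"
group = root / "mapping_group_1"
group.mkdir(parents=True)

(root / "mappings.cfg.json").write_text(json.dumps({"groups": ["mapping_group_1"]}))
(root / "globals.json").write_text("{}")
(group / "globals.json").write_text("{}")
(group / "mappings.json").write_text("{}")

# Confirm every file the validator expects is present
expected = ["mappings.cfg.json", "globals.json",
            "mapping_group_1/globals.json", "mapping_group_1/mappings.json"]
missing = [p for p in expected if not (root / p).exists()]
print(missing)  # []
```

Pointing tokamap-validator at this root then exercises the full directory walk, although the placeholder files would still need schema-valid content to pass validation.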

Validation Process

The validator performs the following checks:

  1. Configuration Validation: Verifies that mappings.cfg.json exists and conforms to the configuration schema
  2. Top-level Globals Validation: Validates the root-level globals.json file
  3. Group Validation: For each mapping group defined in the configuration:
    • Validates the group's globals.json file
    • Validates the group's mappings.json file
  4. Partition Support: Handles partitioned mapping structures recursively
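The steps above amount to a directory walk. This is a hypothetical illustration using only the standard library (file-existence checks only; the real validator also applies the JSON schemas, and `check_mapping_dir` is not part of the package API):

```python
import json
from pathlib import Path

def check_mapping_dir(root: Path) -> list[str]:
    """Hypothetical sketch of the traversal: configuration first, then
    top-level globals, then each group's files (partition recursion omitted)."""
    cfg_path = root / "mappings.cfg.json"
    if not cfg_path.exists():                       # step 1: configuration
        return [f"missing {cfg_path.name}"]
    cfg = json.loads(cfg_path.read_text())
    errors = []
    if not (root / "globals.json").exists():        # step 2: top-level globals
        errors.append("missing globals.json")
    for group in cfg.get("groups", []):             # step 3: per-group files
        for name in ("globals.json", "mappings.json"):
            if not (root / group / name).exists():
                errors.append(f"missing {group}/{name}")
    return errors
```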

Mapping Types

TokaMap supports five mapping types:

  • DIMENSION: Maps dimensional probe data
  • VALUE: Maps static values (numbers, strings, arrays, objects)
  • DATA_SOURCE: Maps external data sources with configurable parameters
  • EXPR: Maps mathematical expressions with parameters
  • CUSTOM: Maps custom functions from external libraries

Example Configuration

mappings.cfg.json

{
  "metadata": {
    "experiment": "my_experiment",
    "author": "Your Name",
    "version": "1.0.0"
  },
  "partitions": [
    {
      "attribute": "time",
      "selector": "closest"
    }
  ],
  "groups": ["diagnostics", "analysis"]
}

globals.json

{
  "DATA_SOURCE_CONFIG": {
    "my_data_source": {
      "ARGS": {
        "connection_string": "tcp://localhost:8080"
      }
    }
  }
}

mappings.json

{
  "temperature": {
    "MAP_TYPE": "DATA_SOURCE",
    "DATA_SOURCE": "my_data_source",
    "ARGS": {
      "signal": "TEMP_01"
    },
    "SCALE": 1.0,
    "COMMENT": "Temperature measurement from sensor 01"
  },
  "calculated_value": {
    "MAP_TYPE": "EXPR",
    "EXPR": "a * b + c",
    "PARAMETERS": {
      "a": 2.5,
      "b": "temperature",
      "c": 10
    }
  }
}
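As an illustration of the kind of constraint the schema enforces, a simplified stand-in check (not the real TokaMap schema) can confirm that every entry in the example above carries a valid MAP_TYPE:

```python
import json

# The five mapping types listed above
MAP_TYPES = {"DIMENSION", "VALUE", "DATA_SOURCE", "EXPR", "CUSTOM"}

mappings = json.loads("""
{
  "temperature": {"MAP_TYPE": "DATA_SOURCE", "DATA_SOURCE": "my_data_source",
                  "ARGS": {"signal": "TEMP_01"}, "SCALE": 1.0},
  "calculated_value": {"MAP_TYPE": "EXPR", "EXPR": "a * b + c",
                       "PARAMETERS": {"a": 2.5, "b": "temperature", "c": 10}}
}
""")

# Simplified stand-in for the schema check: every entry needs a known MAP_TYPE
problems = [name for name, entry in mappings.items()
            if entry.get("MAP_TYPE") not in MAP_TYPES]
print(problems)  # []
```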

Error Messages

The validator provides clear error messages when validation fails:

Validation error in file: /path/to/mappings.json
'MAP_TYPE' is a required property
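This message format matches what the jsonschema dependency raises. A minimal sketch reproducing one such error directly (the toy schema here is an assumption, not the real TokaMap schema):

```python
from jsonschema import ValidationError, validate

# Toy schema standing in for a TokaMap schema (assumption, not the real one)
schema = {
    "type": "object",
    "properties": {"MAP_TYPE": {"type": "string"}},
    "required": ["MAP_TYPE"],
}

try:
    validate(instance={"SCALE": 1.0}, schema=schema)  # entry missing MAP_TYPE
    message = None
except ValidationError as err:
    message = err.message

print(message)  # 'MAP_TYPE' is a required property
```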

Testing

Verify Dynamic Versioning

Run the test script to verify that dynamic versioning works correctly:

python test_dynamic_versioning.py

This tests that:

  • Version is read from pyproject.toml
  • Package metadata is correct
  • __version__ variable is set correctly
  • CLI --version displays the correct version
  • Development mode fallback works

All tests should pass if the package is installed correctly.

Contributing

Contributions are welcome! Please visit the GitHub repository to:

  • Report bugs
  • Request features
  • Submit pull requests

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

Developed by UKAEA to support experimental data mapping workflows.

