Python wrapper for the Helios Software SOF (SQL on FHIR) toolkit.

pysof - SQL on FHIR for Python

High-performance FHIR data transformation for Python. Transform FHIR resources into tabular formats (CSV, JSON, Parquet) using declarative ViewDefinitions from the SQL on FHIR specification.

Built in Rust for speed, exposed to Python with a simple, Pythonic API. Part of the Helios FHIR Server project.

✨ Key Features

  • 🚀 High Performance: Native Rust implementation with minimal Python overhead
  • 📊 Multiple Output Formats: CSV, JSON, NDJSON, and Parquet
  • 🔄 Parallel Processing: Automatic multithreading with 5-7x speedup on multi-core systems
  • 📦 Streaming Support: Memory-efficient chunked processing for large NDJSON files
  • 🌐 Multi-Version FHIR: Supports R4, R4B, R5, and R6 (based on build features)
  • 🎯 Type-Safe: Leverages Rust's type safety with a Pythonic interface
  • ⚡ GIL-Free: Python GIL released during processing for true parallelism

🎯 Why pysof?

Working with FHIR data in Python just got faster. pysof lets you:

  • Transform complex FHIR resources into clean, analyzable tables without writing custom parsers
  • Process large datasets efficiently with automatic parallel processing and Rust-level performance
  • Use standard SQL on FHIR ViewDefinitions for portable, maintainable data transformations
  • Export to multiple formats (CSV, JSON, NDJSON, Parquet) for analytics, ML, or reporting workflows

Perfect for healthcare data engineers, researchers, and developers building FHIR-based analytics pipelines.

📥 Installation

From PyPI (Recommended)

pip install pysof

Supported Platforms:

  • Linux: x86_64 (glibc and musl) and AArch64 (glibc)
  • Windows: x86_64 (MSVC)
  • macOS: AArch64 (Apple Silicon)
  • Python: 3.10, 3.11, 3.12, 3.13, 3.14

From GitHub Releases

Download pre-built wheels from the releases page:

pip install pysof-*.whl

🚀 Quick Start

Transform FHIR patient data to CSV in just a few lines:

import pysof

# Define what data to extract
view_definition = {
    "resourceType": "ViewDefinition",
    "id": "patient-demographics",
    "name": "PatientDemographics",
    "status": "active",
    "resource": "Patient",
    "select": [{
        "column": [
            {"name": "id", "path": "id"},
            {"name": "family_name", "path": "name.family"},
            {"name": "given_name", "path": "name.given.first()"},
            {"name": "gender", "path": "gender"},
            {"name": "birth_date", "path": "birthDate"}
        ]
    }]
}

# Sample FHIR Bundle
bundle = {
    "resourceType": "Bundle",
    "type": "collection",
    "entry": [{
        "resource": {
            "resourceType": "Patient",
            "id": "patient-1",
            "name": [{"family": "Doe", "given": ["John"]}],
            "gender": "male",
            "birthDate": "1990-01-01"
        }
    }]
}

# Transform to CSV
csv_output = pysof.run_view_definition(view_definition, bundle, "csv")
print(csv_output.decode('utf-8'))
# Output:
# id,family_name,given_name,gender,birth_date
# patient-1,Doe,John,male,1990-01-01
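To build intuition for what the column paths above select, here is a toy, pure-Python sketch of a dotted-path lookup with a trailing `first()`. It is an illustration only, not pysof's actual engine, which evaluates full FHIRPath in Rust:

```python
# Toy illustration: how a column path such as "name.given.first()"
# conceptually maps onto the resource JSON. Not pysof's implementation.

patient = {
    "resourceType": "Patient",
    "id": "patient-1",
    "name": [{"family": "Doe", "given": ["John"]}],
    "gender": "male",
}

def toy_path(resource, path):
    """Resolve a dotted path, flattening lists; honor a trailing first()."""
    take_first = path.endswith(".first()")
    parts = path.removesuffix(".first()").split(".")
    values = [resource]
    for part in parts:
        collected = []
        for value in values:
            items = value if isinstance(value, list) else [value]
            for item in items:
                if isinstance(item, dict) and part in item:
                    child = item[part]
                    collected.extend(child if isinstance(child, list) else [child])
        values = collected
    return (values[0] if values else None) if take_first else values
```

Against the sample patient, `toy_path(patient, "name.given.first()")` resolves to `"John"`, matching the `given_name` column in the CSV output above.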

📖 Usage

Multiple Output Formats

import pysof
import json

# Transform to different formats
csv_result = pysof.run_view_definition(view_definition, bundle, "csv")
json_result = pysof.run_view_definition(view_definition, bundle, "json")
ndjson_result = pysof.run_view_definition(view_definition, bundle, "ndjson")
parquet_result = pysof.run_view_definition(view_definition, bundle, "parquet")

print("CSV Output:")
print(csv_result.decode('utf-8'))

print("\nJSON Output:")
data = json.loads(json_result.decode('utf-8'))
print(json.dumps(data, indent=2))

Advanced Options

import pysof

# Transform with pagination and filtering
result = pysof.run_view_definition_with_options(
    view_definition,
    bundle,
    "json",
    limit=10,                          # Limit results
    page=1,                            # Page number
    since="2023-01-01T00:00:00Z",     # Filter by modification date
    fhir_version="R4"                  # Specify FHIR version
)
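The `limit` and `page` parameters imply simple slicing arithmetic. Assuming (as the parameter names suggest) 1-based pages of `limit` rows each, the selection works like this sketch; it illustrates the arithmetic, not pysof's internal code:

```python
# Pagination sketch: 1-based pages of `limit` rows each (assumed semantics).

def paginate(rows, limit, page):
    """Return the 1-based `page` of `rows`, with `limit` rows per page."""
    start = (page - 1) * limit
    return rows[start:start + limit]

rows = [{"id": f"patient-{i}"} for i in range(25)]
assert len(paginate(rows, limit=10, page=1)) == 10
assert paginate(rows, limit=10, page=3) == rows[20:]   # last, partial page
assert paginate(rows, limit=10, page=4) == []          # past the end
```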

Utility Functions

import pysof

# Validate structures
is_valid_view = pysof.validate_view_definition(view_definition)
is_valid_bundle = pysof.validate_bundle(bundle)

# Parse content types
format_str = pysof.parse_content_type("text/csv")  # Returns "csv_with_header"

# Check supported FHIR versions
versions = pysof.get_supported_fhir_versions()  # Returns ["R4"] or more
print(f"Supported FHIR versions: {versions}")

# Package info
print(f"Version: {pysof.get_version()}")
print(pysof.get_status())
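`parse_content_type` translates HTTP-style MIME types into pysof format strings. A plausible sketch of that mapping follows; only the `text/csv` → `csv_with_header` pair is confirmed by the example above, the other entries are assumptions for illustration:

```python
# Hypothetical mapping illustrating parse_content_type's job. Only the
# text/csv entry is confirmed by the docs; the rest are assumptions.

ASSUMED_CONTENT_TYPES = {
    "text/csv": "csv_with_header",
    "application/json": "json",
    "application/x-ndjson": "ndjson",
    "application/parquet": "parquet",
}

def parse_content_type_sketch(content_type):
    # Ignore MIME parameters such as "; charset=utf-8"
    mime = content_type.split(";")[0].strip().lower()
    return ASSUMED_CONTENT_TYPES.get(mime)
```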

Streaming Large NDJSON Files

For memory-efficient processing of large NDJSON files, use the ChunkedProcessor iterator or process_ndjson_to_file function:

import pysof

view_definition = {
    "resourceType": "ViewDefinition",
    "status": "active",
    "resource": "Patient",
    "select": [{"column": [
        {"name": "id", "path": "id"},
        {"name": "gender", "path": "gender"}
    ]}]
}

# Iterator approach - process chunks incrementally
for chunk in pysof.ChunkedProcessor(view_definition, "patients.ndjson", chunk_size=500):
    print(f"Chunk {chunk['chunk_index']}: {len(chunk['rows'])} rows")
    for row in chunk["rows"]:
        process_row(row)
    if chunk["is_last"]:
        print("Processing complete!")

# Access column names before iterating
processor = pysof.ChunkedProcessor(view_definition, "patients.ndjson")
print(f"Columns: {processor.columns}")
for chunk in processor:
    # Process chunks...
    pass

# File-to-file approach - most memory efficient
stats = pysof.process_ndjson_to_file(
    view_definition,
    "input.ndjson",
    "output.csv",
    "csv",  # or "csv_with_header", "ndjson"
    chunk_size=1000,
    skip_invalid=True,  # Continue past invalid JSON lines
    fhir_version="R4"
)
print(f"Processed {stats['resources_processed']} resources")
print(f"Output {stats['output_rows']} rows in {stats['chunks_processed']} chunks")
print(f"Skipped {stats['skipped_lines']} invalid lines")

When to use streaming:

  • Processing NDJSON files larger than available memory
  • Working with datasets of 100K+ resources
  • Building ETL pipelines that process data incrementally
  • When you need fault-tolerant processing (skip invalid lines)
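Conceptually, the streaming path reads the NDJSON file a bounded number of lines at a time, so peak memory scales with `chunk_size` rather than file size. A stdlib-only sketch of that loop (not pysof's Rust implementation), including the `skip_invalid` behavior:

```python
# Stdlib sketch of chunked NDJSON reading, illustrating what the
# streaming API does conceptually. Not pysof's implementation.
import json
import tempfile
from itertools import islice

def iter_chunks(path, chunk_size, skip_invalid=True):
    """Yield lists of parsed resources, at most chunk_size per chunk."""
    with open(path, encoding="utf-8") as f:
        while True:
            lines = list(islice(f, chunk_size))
            if not lines:
                break
            chunk = []
            for line in lines:
                try:
                    chunk.append(json.loads(line))
                except json.JSONDecodeError:
                    if not skip_invalid:
                        raise
            if chunk:
                yield chunk

# Tiny demo: two valid Patient lines with one invalid line in between.
with tempfile.NamedTemporaryFile(
    "w", suffix=".ndjson", delete=False, encoding="utf-8"
) as f:
    f.write('{"resourceType": "Patient", "id": "p1"}\n')
    f.write("not json\n")
    f.write('{"resourceType": "Patient", "id": "p2"}\n')
    demo_path = f.name

chunks = list(iter_chunks(demo_path, chunk_size=2))
```

With `chunk_size=2` the invalid line is skipped, so the demo yields two chunks containing the resources `p1` and `p2`.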

Error Handling

import pysof

try:
    result = pysof.run_view_definition(view_definition, bundle, "json")
except pysof.InvalidViewDefinitionError as e:
    print(f"ViewDefinition validation error: {e}")
except pysof.SerializationError as e:
    print(f"JSON parsing error: {e}")
except pysof.UnsupportedContentTypeError as e:
    print(f"Unsupported format: {e}")
except pysof.SofError as e:
    print(f"General SOF error: {e}")

⚡ Performance

Automatic Parallel Processing

pysof automatically processes FHIR resources in parallel using rayon:

  • 5-7x speedup on typical batch workloads with multi-core CPUs
  • Streaming benefits: ChunkedProcessor and process_ndjson_to_file also use parallel processing
  • Zero configuration - parallelization is always enabled
  • Python GIL released during processing for true parallel execution

Performance Benchmarks

| Mode | Dataset | Time | Memory | Notes |
| --- | --- | --- | --- | --- |
| Batch | 10k Patients | ~2.7s | 1.6 GB | All resources in memory |
| Streaming | 10k Patients | ~0.9s | 45 MB | 35x less memory, 2.9x faster |
| Batch | 93k Encounters | ~4s | 3.9 GB | All resources in memory |
| Streaming | 93k Encounters | ~2.8s | 25 MB | 155x less memory, 1.4x faster |

Streaming mode (ChunkedProcessor, process_ndjson_to_file) is recommended for large NDJSON files.

Controlling Thread Count (RAYON_NUM_THREADS)

Set the RAYON_NUM_THREADS environment variable to control parallel processing:

import os
os.environ['RAYON_NUM_THREADS'] = '4'  # Must be set before first import

import pysof
result = pysof.run_view_definition(view_definition, bundle, "json")

Or from the command line:

# Linux/Mac
RAYON_NUM_THREADS=4 python my_script.py

# Windows PowerShell
$env:RAYON_NUM_THREADS=4
python my_script.py

When to adjust thread count:

  • Reduce threads (RAYON_NUM_THREADS=2-4): On shared systems, containers with CPU limits, or when running multiple instances
  • Increase threads: Rarely needed; rayon auto-detects available cores
  • Single thread (RAYON_NUM_THREADS=1): For debugging or deterministic output ordering

Performance Tips:

  • Use all available cores for large datasets (default behavior)
  • Limit threads on shared systems to avoid resource contention
  • Prefer streaming mode (ChunkedProcessor) for NDJSON files > 100MB

📋 Supported Features

Output Formats

| Format | Description | Output |
| --- | --- | --- |
| csv | CSV with headers | Comma-separated values with header row |
| json | JSON array | Array of objects, one per result row |
| ndjson | Newline-delimited JSON | One JSON object per line |
| parquet | Parquet format | Columnar binary format for analytics |
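The json and ndjson outputs carry the same rows in different framings: one JSON array versus one object per line. A stdlib-only sketch of consuming each (the sample bytes are illustrative, not real pysof output):

```python
import json

# Illustrative byte strings in the two JSON framings pysof produces.
json_bytes = b'[{"id": "patient-1", "gender": "male"}]'
ndjson_bytes = b'{"id": "patient-1", "gender": "male"}\n'

# "json": decode once, yielding a list of row objects.
rows_from_json = json.loads(json_bytes.decode("utf-8"))

# "ndjson": decode line by line; suits streaming and appending.
rows_from_ndjson = [
    json.loads(line)
    for line in ndjson_bytes.decode("utf-8").splitlines()
    if line.strip()
]

assert rows_from_json == rows_from_ndjson
```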

FHIR Versions

  • R4 (default, always available)
  • R4B (if compiled with R4B feature)
  • R5 (if compiled with R5 feature)
  • R6 (if compiled with R6 feature)

Use pysof.get_supported_fhir_versions() to check available versions in your build.


🔧 Development

Requirements

  • Python 3.10 or later (3.10, 3.11, 3.12, 3.13, 3.14 supported)
  • uv (package and environment manager)
  • Rust toolchain (for building from source)

Note: This crate is excluded from the default workspace build. When running cargo build from the repository root, pysof will not be built automatically.

Building from Source

Building with Cargo

This crate is excluded from the default workspace build to allow building the core Rust components without Python. To build it explicitly:

# From the repository root, change into the pysof crate
cd crates/pysof

# Build with default features
cargo build

# Or build with specific FHIR version features
cargo build --features R4,R5

Building with Maturin (Recommended)

For Python development, it's recommended to use maturin via uv:

# From repo root
cd crates/pysof

# Create a venv with your preferred Python version (3.10+)
uv venv --python 3.11  # or 3.10, 3.12, 3.13, 3.14

# Install the project dev dependencies
uv sync --group dev

# Build and install the Rust extension into the venv
uv run maturin develop --release

# Build distributable artifacts
uv run maturin build --release -o dist     # wheels
uv run maturin sdist -o dist               # source distribution

# Sanity checks
uv run python -c "import pysof; print(pysof.__version__); print(pysof.get_status()); print(pysof.get_supported_fhir_versions())"

Installing from Source

Requires Rust toolchain:

# Install directly
pip install -e .

# Or build wheel locally
maturin build --release --out dist
pip install dist/*.whl

Testing

The project has separate test suites for Python and Rust components:

Python Tests

Run the comprehensive Python test suite:

# Run all Python tests
uv run pytest python-tests/

# Run specific test files
uv run pytest python-tests/test_core_functions.py -v
uv run pytest python-tests/test_content_types.py -v
uv run pytest python-tests/test_import.py -v

# Run with coverage
uv run pytest python-tests/ --cov=pysof --cov-report=html

# Run tests with detailed output
uv run pytest python-tests/ -v --tb=short

Rust Tests

Run the Rust unit and integration tests:

# Run all Rust tests
cargo test

# Run unit tests only
cargo test --test lib_tests

# Run integration tests only
cargo test --test integration

# Run with verbose output
cargo test -- --nocapture

Configuring FHIR Version Support

By default, pysof is compiled with R4 support only. You can configure which FHIR versions are available by modifying the feature compilation settings.

Change Default FHIR Version

To change from R4 to another version (e.g., R5):

  1. Edit crates/pysof/Cargo.toml:

    [features]
    default = ["R5"]  # Changed from ["R4"]
    R4 = ["helios-sof/R4", "helios-fhir/R4"]
    R4B = ["helios-sof/R4B", "helios-fhir/R4B"]
    R5 = ["helios-sof/R5", "helios-fhir/R5"]
    R6 = ["helios-sof/R6", "helios-fhir/R6"]
    
  2. Rebuild the extension:

    cd crates/pysof
    uv run maturin develop --release
    
  3. Verify the change:

    uv run python -c "
    import pysof
    versions = pysof.get_supported_fhir_versions()
    print('Supported FHIR versions:', versions)
    "
    

    This should now show ['R5'] instead of ['R4'].

Enable Multiple FHIR Versions

To support multiple FHIR versions simultaneously:

  1. Edit crates/pysof/Cargo.toml:

    [features]
    default = ["R4", "R5"]  # Enable both R4 and R5
    # Or enable all versions:
    # default = ["R4", "R4B", "R5", "R6"]
    
  2. Rebuild and verify:

    uv run maturin develop --release
    uv run python -c "import pysof; print(pysof.get_supported_fhir_versions())"
    

    This should show ['R4', 'R5'] (or all enabled versions).

  3. Use specific versions in code:

    import pysof
    
    # Use R4 explicitly
    result_r4 = pysof.run_view_definition(view, bundle, "json", fhir_version="R4")
    
    # Use R5 explicitly  
    result_r5 = pysof.run_view_definition(view, bundle, "json", fhir_version="R5")
    

Build with Specific Features (Without Changing Default)

To temporarily build with different features without modifying Cargo.toml:

# Build with only R5
cargo build --features R5 --no-default-features

# Build with R4 and R6
cargo build --features R4,R6 --no-default-features

# With maturin
uv run maturin develop --release --features R5 --no-default-features

Testing After Version Changes

After changing FHIR version support, run the test suite to ensure compatibility:

# Run all tests
uv run pytest

# Run FHIR version-specific tests
uv run pytest python-tests/test_fhir_versions.py -v

# Test with your new default version
uv run python -c "
import pysof

# Test with default version (should be your new default)
view = {'resourceType': 'ViewDefinition', 'id': 'test', 'name': 'Test', 'status': 'active', 'resource': 'Patient', 'select': [{'column': [{'name': 'id', 'path': 'id'}]}]}
bundle = {'resourceType': 'Bundle', 'type': 'collection', 'entry': [{'resource': {'resourceType': 'Patient', 'id': 'test'}}]}

result = pysof.run_view_definition(view, bundle, 'json')
print('Default version test successful:', len(result), 'bytes')
"

Project layout

crates/pysof/
├─ pyproject.toml          # PEP 621 metadata, Python >=3.10, uv-compatible
├─ README.md
├─ src/
│  ├─ pysof/
│  │  └─ __init__.py       # Python package root
│  └─ lib.rs               # Rust PyO3 bindings
├─ tests/                  # Rust tests (17 tests)
│  ├─ lib_tests.rs         # Unit tests for core library functions
│  ├─ integration.rs       # Integration tests for component interactions
│  └─ integration/         # Organized integration test modules
│     ├─ mod.rs
│     ├─ content_types.rs
│     ├─ error_handling.rs
│     └─ fhir_versions.rs
├─ python-tests/           # Python test suite (58 tests)
│  ├─ __init__.py
│  ├─ test_core_functions.py
│  ├─ test_content_types.py
│  ├─ test_fhir_versions.py
│  ├─ test_import.py
│  └─ test_package_metadata.py
└─ Cargo.toml              # Rust crate metadata

📄 License

MIT License - See LICENSE.md for details.

Copyright (c) 2025 Helios Software

๐Ÿค Contributing

Contributions are welcome! Please see our Contributing Guidelines for details.

Development Setup

See the Development section above for instructions on setting up your development environment.

๐Ÿ™ Acknowledgments

Built with:

  • PyO3 - Rust bindings for Python
  • maturin - Build system for Rust Python extensions
  • helios-sof - Core SQL-on-FHIR implementation in Rust

Part of the Helios FHIR Server project.


Made with ❤️ by Helios Software
