graph-sp

A pure Rust graph executor supporting implicit node connections, branching, and config sweeps.
graph-sp is a pure Rust grid/node graph executor and optimizer. The project focuses on representing directed dataflow graphs, computing port mappings by graph inspection, and executing nodes efficiently in-process with parallel CPU execution.
Core Features
- Implicit Node Connections: Nodes automatically connect based on execution order
- Parallel Branching: Create fan-out execution paths with `.branch()`
- Configuration Variants: Use `.variant()` to create parameter sweeps
- DAG Analysis: Automatic inspection and optimization of execution paths
- Mermaid Visualization: Generate diagrams with `.to_mermaid()`
- In-process Execution: Parallel execution using rayon
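The implicit-connection model can be illustrated with a small standalone sketch (the helper below is illustrative, not graph-sp's internals): each node publishes outputs under broadcast names, and a later node's inputs are wired to whichever earlier node most recently published that name.

```python
def resolve_edges(nodes):
    """Sketch of implicit port mapping (hypothetical helper, not graph-sp's API).

    nodes: list of (label, inputs, outputs), where inputs are
    (broadcast_name, local_name) pairs and outputs are
    (local_name, broadcast_name) pairs, matching graph.add()'s conventions.
    Returns edges as (src_label, broadcast_name, dst_label) tuples.
    """
    last_producer = {}  # broadcast name -> label of the node that last published it
    edges = []
    for label, inputs, outputs in nodes:
        # Wire each input to the most recent producer of its broadcast name
        for broadcast_name, _local in (inputs or []):
            if broadcast_name in last_producer:
                edges.append((last_producer[broadcast_name], broadcast_name, label))
        # Publish this node's outputs for downstream nodes
        for _local, broadcast_name in (outputs or []):
            last_producer[broadcast_name] = label
    return edges

edges = resolve_edges([
    ("DataSource", None, [("value", "data")]),
    ("Multiply", [("data", "x")], [("doubled", "result")]),
])
print(edges)  # [('DataSource', 'data', 'Multiply')]
```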
Installation
Rust
Add to your Cargo.toml:
[dependencies]
graph-sp = "0.1.0"
# Optional: forward the radar_examples feature to enable the radar signal
# processing examples (ndarray and FFT support)
[features]
radar_examples = ["graph-sp/radar_examples"]
For radar signal processing with ndarray and complex number support, enable the radar_examples feature.
Python
The library can also be used from Python via PyO3 bindings:
pip install pygraph-sp
Or build from source:
pip install maturin
maturin build --release --features python
pip install target/wheels/pygraph_sp-*.whl
Quick Start
Rust
Basic Sequential Pipeline
use graph_sp::{Graph, GraphData};
use std::collections::HashMap;
fn data_source(_: &HashMap<String, GraphData>, _: &HashMap<String, GraphData>) -> HashMap<String, GraphData> {
    let mut result = HashMap::new();
    result.insert("value".to_string(), GraphData::int(42));
    result
}

fn multiply(inputs: &HashMap<String, GraphData>, _: &HashMap<String, GraphData>) -> HashMap<String, GraphData> {
    let mut result = HashMap::new();
    if let Some(val) = inputs.get("x").and_then(|d| d.as_int()) {
        result.insert("doubled".to_string(), GraphData::int(val * 2));
    }
    result
}

fn main() {
    let mut graph = Graph::new();

    // Add source node
    graph.add(data_source, Some("DataSource"), None, Some(vec![("value", "data")]));

    // Add processing node
    graph.add(multiply, Some("Multiply"), Some(vec![("data", "x")]), Some(vec![("doubled", "result")]));

    let dag = graph.build();
    let context = dag.execute(false, None);
    println!("Result: {}", context.get("result").unwrap().to_string_repr());
}
Python
Basic Sequential Pipeline
import graph_sp
def data_source(inputs, variant_params):
    return {"value": "42"}

def multiply(inputs, variant_params):
    val = int(inputs.get("x", "0"))
    return {"doubled": str(val * 2)}

# Create graph
graph = graph_sp.PyGraph()

# Add source node
graph.add(
    function=data_source,
    label="DataSource",
    inputs=None,
    outputs=[("value", "data")]
)

# Add processing node
graph.add(
    function=multiply,
    label="Multiply",
    inputs=[("data", "x")],
    outputs=[("doubled", "result")]
)

# Build and execute
dag = graph.build()
context = dag.execute()
print(f"Result: {context['result']}")
Mermaid visualization output:
graph TD
0["DataSource"]
1["Multiply"]
0 -->|data → x| 1
Parallel Branching (Fan-Out)
let mut graph = Graph::new();
// Source node
graph.add(source_fn, Some("Source"), None, Some(vec![("data", "data")]));
// Create parallel branches
graph.branch();
graph.add(stats_fn, Some("Statistics"), Some(vec![("data", "input")]), Some(vec![("mean", "stats")]));
graph.branch();
graph.add(model_fn, Some("MLModel"), Some(vec![("data", "input")]), Some(vec![("prediction", "model")]));
graph.branch();
graph.add(viz_fn, Some("Visualization"), Some(vec![("data", "input")]), Some(vec![("plot", "viz")]));
let dag = graph.build();
Mermaid visualization output:
graph TD
0["Source"]
1["Statistics"]
2["MLModel"]
3["Visualization"]
0 -->|data → input| 1
0 -->|data → input| 2
0 -->|data → input| 3
style 1 fill:#e1f5ff
style 2 fill:#e1f5ff
style 3 fill:#e1f5ff
DAG Statistics:
- Nodes: 4
- Depth: 2 levels
- Max Parallelism: 3 nodes (all branches execute in parallel)
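The depth and max-parallelism figures follow from assigning each node a topological level: depth is the number of levels, and max parallelism is the widest level. A minimal sketch (illustrative only, not graph-sp's implementation):

```python
def level_stats(edges, num_nodes):
    """Compute (depth, max_parallelism) for a DAG given as (src, dst) index pairs.

    Assumes nodes are added in execution order, so src < dst for every edge;
    this is an illustrative sketch, not graph-sp's actual analysis code.
    """
    level = [0] * num_nodes
    # Processing edges in order of destination guarantees each source's
    # level is final before it is read.
    for src, dst in sorted(edges, key=lambda e: e[1]):
        level[dst] = max(level[dst], level[src] + 1)
    depth = max(level) + 1
    # Width of each level = number of nodes that can run concurrently
    widths = {}
    for lv in level:
        widths[lv] = widths.get(lv, 0) + 1
    return depth, max(widths.values())

# The fan-out graph above: Source (node 0) feeds nodes 1, 2, 3.
print(level_stats([(0, 1), (0, 2), (0, 3)], 4))  # (2, 3)
```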
Parameter Sweep with Variants
use graph_sp::{Graph, Linspace};
let mut graph = Graph::new();
// Source node
graph.add(source_fn, Some("DataSource"), None, Some(vec![("value", "data")]));
// Create variants for different learning rates
let learning_rates = vec![0.001, 0.01, 0.1, 1.0];
graph.variant("learning_rate", learning_rates);
graph.add(scale_fn, Some("ScaleLR"), Some(vec![("data", "input")]), Some(vec![("scaled", "output")]));
let dag = graph.build();
Mermaid visualization output:
graph TD
0["DataSource"]
1["ScaleLR (v0)"]
2["ScaleLR (v1)"]
3["ScaleLR (v2)"]
4["ScaleLR (v3)"]
0 -->|data → input| 1
0 -->|data → input| 2
0 -->|data → input| 3
0 -->|data → input| 4
style 1 fill:#e1f5ff
style 2 fill:#e1f5ff
style 3 fill:#e1f5ff
style 4 fill:#e1f5ff
style 1 fill:#ffe1e1
style 2 fill:#e1ffe1
style 3 fill:#ffe1ff
style 4 fill:#ffffe1
DAG Statistics:
- Nodes: 5
- Depth: 2 levels
- Max Parallelism: 4 nodes
- Variants: 4 (all execute in parallel)
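Conceptually, a `.variant()` call duplicates the following node once per parameter value, each copy receiving its own variant parameters, which is why the diagram shows `ScaleLR (v0)` through `ScaleLR (v3)`. A standalone sketch of that expansion (hypothetical helper name, not the library's internals):

```python
def expand_variants(label, param_name, values):
    """Return one (node_label, variant_params) entry per parameter value.

    Illustrative sketch of what .variant() does conceptually; the helper
    name and return shape are assumptions, not graph-sp's API.
    """
    return [(f"{label} (v{i})", {param_name: v}) for i, v in enumerate(values)]

copies = expand_variants("ScaleLR", "learning_rate", [0.001, 0.01, 0.1, 1.0])
for node_label, params in copies:
    print(node_label, params)
# ScaleLR (v0) {'learning_rate': 0.001}
# ...
# ScaleLR (v3) {'learning_rate': 1.0}
```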
Radar Signal Processing Example
This example demonstrates a complete radar signal processing pipeline using GraphData with ndarray arrays and complex numbers. The pipeline implements:
- LFM Pulse Generation - Creates a Linear Frequency Modulation chirp signal
- Pulse Stacking - Accumulates multiple pulses with Doppler shifts
- Range Compression - FFT-based matched filtering
- Doppler Compression - Creates Range-Doppler map
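The LFM chirp underlying step 1 has quadratic phase, so its instantaneous frequency sweeps linearly across the bandwidth. A standalone sketch of the signal model (stdlib only; parameter values are examples, and this is not the example's actual generator):

```python
import cmath

def lfm_chirp(num_samples=256, bandwidth=100e6, pulse_width=1e-6):
    """Generate a unit-amplitude LFM chirp: exp(j * pi * k * t^2), k = B/T.

    Illustrative sketch of the signal model only; the pipeline's real
    generator lives in the node functions shown below.
    """
    sample_rate = num_samples / pulse_width  # critically sample over the pulse
    chirp_rate = bandwidth / pulse_width     # frequency sweep rate k (Hz/s)
    signal = []
    for n in range(num_samples):
        t = n / sample_rate - pulse_width / 2   # time axis centered on the pulse
        phase = cmath.pi * chirp_rate * t * t   # quadratic phase
        signal.append(cmath.exp(1j * phase))    # unit magnitude, swept frequency
    return signal

pulse = lfm_chirp()
print(len(pulse), abs(pulse[0]))  # 256 samples, unit magnitude
```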
Rust Implementation
use graph_sp::{Graph, GraphData};
use ndarray::Array1;
use num_complex::Complex;
use std::collections::HashMap;
// LFM pulse generator node
fn lfm_generator(_inputs: &HashMap<String, GraphData>, params: &HashMap<String, GraphData>)
    -> HashMap<String, GraphData> {
    let num_samples = params.get("num_samples")
        .and_then(|d| d.as_int())
        .unwrap_or(256) as usize;
    let bandwidth = params.get("bandwidth")
        .and_then(|d| d.as_float())
        .unwrap_or(100e6); // 100 MHz
    let pulse_width = params.get("pulse_width")
        .and_then(|d| d.as_float())
        .unwrap_or(1e-6); // 1 microsecond

    // Generate LFM chirp signal
    let sample_rate = 100e6;
    let chirp_rate = bandwidth / pulse_width;
    let mut signal = Array1::<Complex<f64>>::zeros(num_samples);
    // ... signal generation code ...

    let mut output = HashMap::new();
    output.insert("pulse".to_string(), GraphData::complex_array(signal));
    output.insert("num_samples".to_string(), GraphData::int(num_samples as i64));
    output
}

// Stack pulses node
fn stack_pulses(inputs: &HashMap<String, GraphData>, params: &HashMap<String, GraphData>)
    -> HashMap<String, GraphData> {
    let num_pulses = params.get("num_pulses")
        .and_then(|d| d.as_int())
        .unwrap_or(128) as usize;

    // Get input pulse as ComplexArray
    let pulse = inputs.get("pulse")
        .and_then(|d| d.as_complex_array())
        .unwrap().clone();

    // Stack with Doppler shifts
    // ... stacking logic ...

    let mut output = HashMap::new();
    output.insert("stacked".to_string(), GraphData::complex_array(stacked_data));
    output.insert("num_pulses".to_string(), GraphData::int(num_pulses as i64));
    output
}

fn main() {
    let mut graph = Graph::new();

    // Add LFM generator
    graph.add(
        lfm_generator,
        Some("LFMGenerator"),
        None,
        Some(vec![("pulse", "lfm_pulse"), ("num_samples", "num_samples")])
    );

    // Add pulse stacking
    graph.add(
        stack_pulses,
        Some("StackPulses"),
        Some(vec![("lfm_pulse", "pulse")]),
        Some(vec![("stacked", "stacked_data"), ("num_pulses", "num_pulses")])
    );

    // Add range compression
    graph.add(
        range_compress,
        Some("RangeCompress"),
        Some(vec![("stacked_data", "data"), ("lfm_pulse", "reference")]),
        Some(vec![("compressed", "compressed_data")])
    );

    // Add Doppler compression
    graph.add(
        doppler_compress,
        Some("DopplerCompress"),
        Some(vec![
            ("compressed_data", "data"),
            ("num_pulses", "num_pulses"),
            ("num_samples", "num_samples")
        ]),
        Some(vec![
            ("range_doppler", "range_doppler_map"),
            ("peak_value", "peak"),
            ("peak_doppler_bin", "peak_doppler"),
            ("peak_range_bin", "peak_range")
        ])
    );

    let dag = graph.build();
    let context = dag.execute(false, None);

    // Display results
    if let Some(peak) = context.get("peak").and_then(|d| d.as_float()) {
        println!("Peak magnitude: {:.2}", peak);
    }
    if let Some(doppler) = context.get("peak_doppler").and_then(|d| d.as_int()) {
        println!("Peak Doppler bin: {}", doppler);
    }
    if let Some(range) = context.get("peak_range").and_then(|d| d.as_int()) {
        println!("Peak Range bin: {}", range);
    }
}
Run the example:
cargo run --example radar_demo --features radar_examples
Mermaid visualization output:
graph TD
0["LFMGenerator"]
1["StackPulses"]
2["RangeCompress"]
3["DopplerCompress"]
0 -->|lfm_pulse → pulse| 1
1 -->|stacked_data → data| 2
2 -->|compressed_data → data| 3
DAG Statistics:
- Nodes: 4
- Depth: 4 levels
- Max Parallelism: 1 node
Execution Output:
LFMGenerator: Generated 256 sample LFM pulse
StackPulses: Stacked 128 pulses with Doppler shifts
RangeCompress: Performed matched filtering on 32768 samples
DopplerCompress: Created Range-Doppler map of shape (128, 256)
Peak at Doppler bin 13, Range bin 255
Magnitude: 11974.31
Peak magnitude: 11974.31
Peak Doppler bin: 13
Peak Range bin: 255
Python Implementation
import graph_sp
import numpy as np
def lfm_generator(inputs, variant_params):
    """Generate LFM pulse with rectangular envelope."""
    num_samples = 256
    bandwidth = 100e6   # 100 MHz
    pulse_width = 1e-6  # 1 microsecond
    sample_rate = 100e6

    # Generate LFM chirp
    chirp_rate = bandwidth / pulse_width
    signal = np.zeros(num_samples, dtype=complex)
    # ... signal generation code ...

    # Return numpy array directly (no conversion needed)
    return {
        "pulse": signal,
        "num_samples": num_samples
    }

def stack_pulses(inputs, variant_params):
    """Stack multiple pulses with Doppler shifts."""
    num_pulses = 128

    # Get pulse data directly as complex array (implicit handling)
    pulse_data = inputs.get("pulse", [])
    pulse = np.array(pulse_data, dtype=complex)

    # Stack with Doppler shifts
    # ... stacking logic ...

    # Return numpy array directly (no conversion needed)
    return {
        "stacked": stacked,
        "num_pulses": num_pulses
    }

# Create graph
graph = graph_sp.PyGraph()

# Add nodes
graph.add(
    function=lfm_generator,
    label="LFMGenerator",
    inputs=None,
    outputs=[("pulse", "lfm_pulse"), ("num_samples", "num_samples")]
)
graph.add(
    function=stack_pulses,
    label="StackPulses",
    inputs=[("lfm_pulse", "pulse")],
    outputs=[("stacked", "stacked_data"), ("num_pulses", "num_pulses")]
)
graph.add(
    function=range_compress,
    label="RangeCompress",
    inputs=[("stacked_data", "data"), ("lfm_pulse", "reference")],
    outputs=[("compressed", "compressed_data")]
)
graph.add(
    function=doppler_compress,
    label="DopplerCompress",
    inputs=[
        ("compressed_data", "data"),
        ("num_pulses", "num_pulses"),
        ("num_samples", "num_samples")
    ],
    outputs=[
        ("range_doppler", "range_doppler_map"),
        ("peak_value", "peak"),
        ("peak_doppler_bin", "peak_doppler"),
        ("peak_range_bin", "peak_range")
    ]
)

# Build and execute
dag = graph.build()
context = dag.execute()
print(f"Peak magnitude: {context['peak']}")
print(f"Peak Doppler bin: {context['peak_doppler']}")
print(f"Peak Range bin: {context['peak_range']}")
Run the example:
python examples/python_radar_demo.py
Key Features Demonstrated
- Native Type Support: Uses `GraphData::complex_array()` for signal data, `GraphData::int()` for metadata
- No String Conversions: Numeric data stays in native format (i64, f64, Complex)
- Implicit Complex Number Handling: Python complex numbers (numpy.complex128, built-in complex) are automatically converted to/from GraphData::Complex without manual real/imag splitting
- Direct Numpy Array Support: Pass numpy ndarrays directly without `.tolist()` conversion; automatic detection and conversion
- Type Safety: Accessor methods (`.as_complex_array()`, `.as_int()`, `.as_float()`) provide safe type extraction
- Complex Signal Processing: Full FFT-based radar processing with ndarray integration
Adding Plotting Nodes
Plotting and visualization functions can be added as terminal nodes that take input but produce no output:
fn plot_range_doppler(inputs: &HashMap<String, GraphData>, _params: &HashMap<String, GraphData>)
    -> HashMap<String, GraphData> {
    // Extract data for plotting
    if let Some(map) = inputs.get("range_doppler").and_then(|d| d.as_complex_array()) {
        // Generate plot (save to file, display, etc.)
        println!("Generating Range-Doppler map plot...");
        // ... plotting code using matplotlib, plotters, etc. ...
    }
    // No outputs - this is a terminal/visualization node
    HashMap::new()
}

// Add to graph
graph.add(
    plot_range_doppler,
    Some("PlotRangeDoppler"),
    Some(vec![("range_doppler_map", "range_doppler")]),
    None // No outputs for visualization nodes
);
This pattern allows visualization and logging nodes to be integrated into the pipeline without affecting data flow.
API Overview
Rust API
Graph Construction
- `Graph::new()` - Create a new graph
- `graph.add(fn, name, inputs, outputs)` - Add a node
  - `fn`: Node function with signature `fn(&HashMap<String, GraphData>, &HashMap<String, GraphData>) -> HashMap<String, GraphData>`
  - `name`: Optional node name
  - `inputs`: Optional vector of `(broadcast_var, impl_var)` tuples for input mappings
  - `outputs`: Optional vector of `(impl_var, broadcast_var)` tuples for output mappings
- `graph.branch()` - Create a new parallel branch
- `graph.variant(param_name, values)` - Create parameter sweep variants
- `graph.build()` - Build the DAG
DAG Operations
- `dag.execute(parallel, params)` - Execute the graph and return the execution context
- `dag.stats()` - Get DAG statistics (nodes, depth, parallelism, branches, variants)
- `dag.to_mermaid()` - Generate Mermaid diagram representation
Python API
The Python bindings provide a similar API with proper GIL handling:
Graph Construction
- `PyGraph()` - Create a new graph
- `graph.add(function, label, inputs, outputs)` - Add a node
  - `function`: Python callable with signature `fn(inputs: dict, variant_params: dict) -> dict`
  - `label`: Optional node name (str)
  - `inputs`: Optional list of `(broadcast_var, impl_var)` tuples or dict
  - `outputs`: Optional list of `(impl_var, broadcast_var)` tuples or dict
- `graph.branch(subgraph)` - Create a new parallel branch with a subgraph
- `graph.build()` - Build the DAG and return a PyDag
DAG Operations
- `dag.execute()` - Execute the graph and return execution context (dict)
- `dag.execute_parallel()` - Execute with parallel execution where possible (dict)
- `dag.to_mermaid()` - Generate Mermaid diagram representation (str)
GIL Handling
The Python bindings are designed with proper GIL handling:
- GIL Release: The Rust executor runs without holding the GIL, allowing true parallelism
- GIL Acquisition: Python callables used as node functions acquire the GIL only during their execution
- Thread Safety: The bindings use `pyo3::prepare_freethreaded_python()` (via auto-initialize) for multi-threaded safety
This means that while Python functions execute sequentially (due to the GIL), the Rust graph traversal and coordination happens in parallel without GIL contention.
Development
Rust Development
Prerequisites:
- Rust (stable toolchain) installed: https://www.rust-lang.org/tools/install
Build and run tests:
cargo build --release
cargo test
Run examples:
cargo run --example comprehensive_demo
cargo run --example parallel_execution_demo
cargo run --example variant_demo_full
cargo run --example radar_demo --features radar_examples
Python Development
Prerequisites:
- Python 3.8+ installed
- Rust toolchain installed
Build Python bindings:
# Create virtual environment
python -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install maturin
pip install maturin==1.2.0
# Build and install in development mode
maturin develop --release --features python
# Run Python example
python examples/python_demo.py
Build wheel for distribution:
maturin build --release --features python
# Wheel will be in target/wheels/
Publishing
This repository is configured with GitHub Actions workflows to automatically publish to crates.io and PyPI when a release tag is pushed.
Required Repository Secrets
To enable automatic publishing, the repository owner must configure the following secrets in GitHub Settings → Secrets and variables → Actions:
- `CRATES_IO_TOKEN`: Your crates.io API token (obtain from https://crates.io/me)
- `PYPI_API_TOKEN`: Your PyPI API token (obtain from https://pypi.org/manage/account/token/)
Publishing Process
The publish workflow (.github/workflows/publish.yml) will automatically run when:
- A tag matching `v*` is pushed (e.g., `v0.1.0`, `v1.0.0`)
- The workflow is manually triggered via workflow_dispatch
Creating a release:
# Ensure version numbers in Cargo.toml and pyproject.toml are correct
git tag -a v0.1.0 -m "Release v0.1.0"
git push origin v0.1.0
The workflow will:
- Build Python wheels for Python 3.8-3.11 on Linux, macOS, and Windows
- Upload wheel artifacts to the GitHub Actions run (always, even without secrets)
- Publish to PyPI (only if `PYPI_API_TOKEN` is set); prebuilt wheels mean end users do not need Rust
- Publish to crates.io (only if `CRATES_IO_TOKEN` is set)
Important notes:
- Installing from PyPI with `pip install pygraph-sp` will not require Rust on the target machine because prebuilt platform-specific wheels are published
- Both crates.io and PyPI will reject duplicate version numbers; update versions before tagging
- The workflow will continue even if tokens are not set, allowing you to download artifacts for manual publishing
- For local testing, you can build wheels with `maturin build --release --features python`
Manual Publishing
If you prefer to publish manually or need to publish from a local machine:
To crates.io:
cargo publish --token YOUR_CRATES_IO_TOKEN
To PyPI:
# Install maturin
pip install maturin==1.2.0
# Build and publish wheels
maturin publish --username __token__ --password YOUR_PYPI_API_TOKEN --features python