Electrophysiology Visualization Suite
Synaptipy
Open-source electrophysiology analysis for wet-lab neuroscientists - no coding required.
Full documentation: synaptipy.readthedocs.io
Synaptipy is a cross-platform desktop application that turns raw patch-clamp recordings into publication-ready measurements. Load any .abf, .wcp, .nwb, or other supported file, and within seconds you can extract resting membrane potential, input resistance, action-potential features, synaptic event kinetics, and more - all from a point-and-click GUI with no Python knowledge required.
When you are ready to scale up, the same analysis pipeline runs automatically across hundreds of files in batch mode, and all results export to CSV or NWB for downstream use in Excel, R, or Python.
Quick Start - from install to first result in 3 steps
For Experimentalists
No Python required. Point-and-click GUI for wet-lab patch-clamp analysis.
Step 1: Download and install
No Python needed. Download the pre-compiled application for your operating system from the Releases page:
- Windows - run Synaptipy_Setup_v0.1.1b6.exe
- macOS - open Synaptipy_v0.1.1b6.dmg and drag to Applications
- Linux - chmod +x Synaptipy-v0.1.1b6-x86_64.AppImage, then run it
Step 2: Load your recording
Launch Synaptipy and drag-and-drop your recording file (.abf, .wcp, .nwb, or any supported format) into the Explorer tab. Traces render immediately.
Step 3: Analyse
Click the Analyser tab. Select a channel, pick an analysis (e.g. Input Resistance or Spike Detection), and click Run. Results appear in the table below the plot and can be exported to CSV with one click.
What can Synaptipy measure?
Intrinsic membrane properties (passive tab)
- Resting membrane potential (RMP) - mean or median over a quiescent window
- Input resistance (Rin) - automatically detects the current-step edges; falls back gracefully if auto-detection fails
- Membrane time constant (Tau) - single-exponential fit to the voltage decay after a current step
- Sag ratio (Ih) - peak-to-steady-state hyperpolarisation ratio; includes rebound depolarisation
- I-V curve - current-voltage relationship across a multi-trial step protocol
- Membrane capacitance - from Tau/Rin in current-clamp or from capacitive-transient integration in voltage-clamp
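The relationships behind these passive-property numbers can be sketched numerically. This is a minimal illustration with synthetic values, not Synaptipy's actual implementation: Rin = ΔV/ΔI, tau from a single-exponential fit, and Cm = tau/Rin.

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_tau(t, v, v_ss):
    """Fit V(t) = v_ss + a * exp(-t / tau); return tau in seconds."""
    def model(t, a, tau):
        return v_ss + a * np.exp(-t / tau)
    (a, tau), _ = curve_fit(model, t, v, p0=(v[0] - v_ss, 0.02))
    return tau

# Synthetic response to a -50 pA step: -70 mV decaying to -80 mV, tau = 20 ms
t = np.linspace(0, 0.2, 2000)                    # s
v = -70e-3 - 10e-3 * (1 - np.exp(-t / 0.02))     # V

delta_v = -10e-3        # steady-state voltage deflection (V)
delta_i = -50e-12       # current-step amplitude (A)
rin = delta_v / delta_i             # 200 MOhm
tau = fit_tau(t, v, v_ss=-80e-3)    # ~20 ms
cm = tau / rin                      # ~100 pF
```
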
Action potential features (spike analysis tab)
- Spike detection - threshold-crossing detection with refractory-period filtering
- Per-spike features - amplitude, half-width, rise time, decay time, threshold voltage, fAHP, mAHP
- Phase-plane analysis - dV/dt vs. voltage trajectory; threshold voltage via kink-slope criterion
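The phase-plane idea can be sketched in a few lines. This is an illustration with a toy waveform; the 20 V/s slope criterion and the function names are assumptions, not Synaptipy's code:

```python
import numpy as np

def phase_plane(v, fs, dvdt_criterion=20.0):
    """Return the dV/dt trajectory (V/s) and the threshold voltage where
    dV/dt first exceeds the slope criterion."""
    dvdt = np.gradient(v) * fs
    above = np.flatnonzero(dvdt >= dvdt_criterion)
    threshold = v[above[0]] if above.size else np.nan
    return dvdt, threshold

fs = 50_000.0                                               # Hz
t = np.arange(0, 0.01, 1 / fs)
v = -0.065 + 0.08 * np.exp(-((t - 0.005) / 0.0005) ** 2)    # toy spike waveform
dvdt, threshold = phase_plane(v, fs)
```

Plotting `dvdt` against `v` gives the familiar phase-plane loop; the kink in that loop corresponds to the slope-criterion crossing.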
Excitability (excitability tab)
- F-I curve - rheobase, slope, maximum firing frequency, and spike-frequency adaptation ratio
- Burst analysis - burst count, spikes per burst, burst duration, intra-burst frequency
- Spike-train statistics - mean ISI, CV, local variation (LV), CV2
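The ISI statistics follow directly from the inter-spike intervals. A minimal sketch (illustrative only; LV follows the same pattern and is omitted for brevity):

```python
import numpy as np

def isi_stats(spike_times):
    """Mean ISI, CV (std/mean), and CV2, where
    CV2_i = 2|ISI_{i+1} - ISI_i| / (ISI_{i+1} + ISI_i), averaged over pairs."""
    isi = np.diff(np.asarray(spike_times, dtype=float))
    cv = isi.std() / isi.mean()
    cv2 = np.mean(2 * np.abs(np.diff(isi)) / (isi[1:] + isi[:-1]))
    return isi.mean(), cv, cv2

# A perfectly regular train has CV and CV2 of (numerically) zero
mean_isi, cv, cv2 = isi_stats([0.0, 0.1, 0.2, 0.3, 0.4])
```
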
Synaptic events (synaptic events tab)
- Threshold detection - prominence-based, baseline-drift-tolerant; click to accept/reject events
- Template matching - matched-filter cross-correlation using a bi-exponential kernel (rise/decay tau configurable); a bank of three kernels scaled at 1x, 2x, and 3x the decay constant provides dendritic-filtering tolerance
- Baseline-to-peak - amplitude and kinetics for evoked or spontaneous events
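The template-matching approach - a bi-exponential kernel bank cross-correlated against the trace - can be sketched as follows. Data and names are illustrative, not Synaptipy's implementation:

```python
import numpy as np

def biexp_kernel(fs, tau_rise, tau_decay):
    """Peak-normalised bi-exponential kernel exp(-t/tau_decay) - exp(-t/tau_rise)."""
    t = np.arange(0, 5 * tau_decay, 1 / fs)
    k = np.exp(-t / tau_decay) - np.exp(-t / tau_rise)
    return k / np.abs(k).max()

fs = 10_000.0
# Bank of three kernels with the decay constant scaled 1x, 2x, 3x
bank = [biexp_kernel(fs, 0.0005, s * 0.005) for s in (1, 2, 3)]

trace = np.zeros(int(fs))                     # 1 s of flat baseline
onset = 2000
trace[onset:onset + bank[0].size] -= bank[0]  # one inward synthetic event

# Cross-correlate against each (sign-flipped) kernel; score peaks mark events
scores = [np.correlate(trace - trace.mean(), -k, mode="valid") for k in bank]
best = int(np.argmax(scores[0]))              # best match for the 1x kernel
```
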
Optogenetics (opto tab)
- TTL correlation - latency, response probability, and jitter between optical stimulus and response
Batch processing
Repeat any analysis across all files in a folder automatically:
- Open the Batch tab, add your files, configure the pipeline
- Click Run - the GUI stays responsive while analysis runs in the background
- Export the complete results table to CSV
# Or run headlessly from a script:
from Synaptipy.core.analysis.batch_engine import BatchAnalysisEngine
from pathlib import Path

engine = BatchAnalysisEngine()
results = engine.run_batch(
    [Path("recording.abf")],
    [{"analysis": "spike_detection", "scope": "all_trials",
      "params": {"threshold": -20.0, "refractory_ms": 2.0}}],
)
print(results)
FAIR data compliance - NWB export
Synaptipy exports raw traces, stimulus waveforms, and analysis results to Neurodata Without Borders (NWB) 2.x, ensuring your data meets FAIR (Findable, Accessible, Interoperable, Reusable) requirements for journal submission and data sharing.
What is exported:
- Raw electrophysiology traces as CurrentClampSeries / VoltageClampSeries
- Automated protocol metadata synthesis for missing stimulus arrays - when the command channel is absent, stimulus waveforms are reconstructed from ABF epoch metadata; a 3-step fallback (raw → synthetic → stimulus=None with a warning) ensures NWB conformance for every recording
- Full embedded discrete-event analysis via NWB 2.x Processing Modules - spike times, synaptic event times, and amplitudes are written as DynamicTable objects inside a ProcessingModule when the batch engine produces _raw_arrays output
- Sweep-level organisation via IntracellularRecordingsTable, SimultaneousRecordingsTable, and SequentialRecordingsTable (the NWB 2.x icephys best-practice hierarchy)
- Electrode metadata and session provenance fields
Visual validation
Every analysis result can be inspected visually before export:
- OpenGL-accelerated trace rendering (handles multi-million-sample recordings at interactive frame rates)
- Interactive zooming, panning, and per-channel amplitude scaling
- Grand-average overlay across any combination of files and trials
- Popup plots for I-V curves, F-I curves, and phase planes
Supported file formats
File I/O is handled through the Neo library:
| Format | Extension(s) | Acquisition system |
|---|---|---|
| Axon Binary Format | .abf | Axon / Molecular Devices |
| WinWCP | .wcp | Strathclyde Electrophysiology Software |
| CED / Spike2 | .smr, .smrx | Cambridge Electronic Design |
| Igor Pro | .ibw, .pxp | WaveMetrics |
| Intan | .rhd, .rhs | Intan Technologies |
| Neurodata Without Borders | .nwb | NWB standard |
| BrainVision | .vhdr | Brain Products |
| European Data Format | .edf | EDF/EDF+ |
| Plexon | .plx, .pl2 | Plexon |
| Open Ephys | .continuous, .oebin | Open Ephys |
| Tucker Davis Technologies | .tev, .tbk | TDT |
| Neuralynx | .ncs, .nse, .nev | Neuralynx |
| NeuroExplorer | .nex | NeuroExplorer |
| MATLAB | .mat | - |
| ASCII / CSV | .txt, .csv, .tsv | - |
Any format supported by Neo but not listed above can be added via the IODict in the infrastructure layer.
Installing from source (developers and power users)
For Computational Modelers
Python API, headless batch processing, plugin development, and technical architecture.
If you want to use Synaptipy programmatically, write custom plugins, or contribute to development, install from source:
git clone https://github.com/anzalks/synaptipy.git
cd synaptipy
conda env create -f environment.yml
conda activate synaptipy
pip install -e ".[dev]"
python -m pytest # verify installation
synaptipy # launch the GUI
Documentation
Full documentation, tutorials, and the developer guide are hosted at synaptipy.readthedocs.io.
Contributing
Contributions are welcome - whether adding a new analysis module, supporting an additional file format, or improving documentation. See CONTRIBUTING.md and the developer guide for project conventions and the contribution workflow.
For developers - architecture and plugin system
Architecture overview
Synaptipy follows a strict separation-of-concerns design:
- Core layer - pure Python analysis logic, fully decoupled from the GUI and independently testable
- Application layer - PySide6 (Qt6) user interface and plugin manager
- Infrastructure layer - file I/O via Neo and PyNWB; NWB export
| Component | Technology | Version |
|---|---|---|
| Language | Python | 3.10 - 3.12 |
| GUI Framework | PySide6 | 6.7.3 (pinned) |
| Plotting Engine | PyQtGraph | 0.13.0+ |
| Electrophysiology I/O | Neo | 0.14.0+ |
| NWB Export | PyNWB | 3.1.0+ |
| Numerical Computation | SciPy / NumPy | 1.13.0+ / 2.0.0+ |
Analysis registry pattern
New analysis functions are registered with the @AnalysisRegistry.register decorator. The ui_params list drives the GUI parameter panel automatically, and the same parameters serialise directly to the batch engine - there is no separate configuration step.
@AnalysisRegistry.register(
name="my_analysis",
ui_params=[{"name": "threshold", "type": "float", "default": -20.0}],
plots=["overlay"],
)
def my_analysis_wrapper(data, time, fs, params):
...
return {"module_used": "my_analysis", "metrics": {"threshold_mv": threshold}}
Plugin interface
Any Python script placed in ~/.synaptipy/plugins/ that uses @AnalysisRegistry.register is automatically discovered at startup and available in both the interactive analyser and batch pipeline.
A fully documented template lives at src/Synaptipy/templates/analysis_template.py.
Return schema - every wrapper must return:
return {
"module_used": "my_plugin",
"metrics": {"Val1": 1.0, "Val2": 2.0},
}
Private keys (prefixed with _) pass data to plot overlays without appearing in the results table.
Hot-reload - toggling "Enable Custom Plugins" in Edit > Preferences reloads all plugins and regenerates the UI without restarting the application.
Cross-file trial averaging
While in "Cycle Single Trial" mode, click "Add Current Trial to Avg Set" to capture a trial. Navigate to other files and continue adding trials. Enable "Plot Selected Avg" to overlay the grand average.
Shape mismatch is handled by NaN-padding shorter arrays and computing the column-wise nanmean, so recordings of different durations average correctly without truncation.
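A minimal numpy sketch of the NaN-padding scheme described above (illustrative only, not Synaptipy's actual code):

```python
import numpy as np

def nan_padded_mean(trials):
    """Average 1-D arrays of unequal length: pad each with NaN to the longest
    length, then take the column-wise nanmean so no trial is truncated."""
    n = max(len(tr) for tr in trials)
    stack = np.full((len(trials), n), np.nan)
    for i, tr in enumerate(trials):
        stack[i, :len(tr)] = tr
    return np.nanmean(stack, axis=0)

# Columns covered by both trials average normally; the tail of the longer
# trial survives as its own mean
avg = nan_padded_mean([np.ones(3), 2 * np.ones(5)])
```
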
Dependencies and citations
Synaptipy builds on the following open-source libraries. If you use Synaptipy in published research, please consider citing the relevant upstream packages.
| Library | Role | Citation |
|---|---|---|
| Neo | Electrophysiology file I/O | Garcia S et al. (2014). Front. Neuroinformatics 8:10. doi:10.3389/fninf.2014.00010 |
| PyNWB | NWB data export | Rubel O et al. (2022). eLife 11:e78362. doi:10.7554/eLife.78362 |
| PySide6 | Qt6 GUI framework | Qt for Python, The Qt Company |
| PyQtGraph | Signal plotting | Campagnola L et al. https://www.pyqtgraph.org |
| SciPy | Signal processing and curve fitting | Virtanen P et al. (2020). Nature Methods 17:261-272. doi:10.1038/s41592-019-0686-2 |
| NumPy | Array computation | Harris CR et al. (2020). Nature 585:357-362. doi:10.1038/s41586-020-2649-2 |
License
Synaptipy is free and open-source software licensed under the GNU Affero General Public License v3 (AGPLv3). See the LICENSE file for full terms.
Open-Source Electrophysiology Visualization and Analysis Suite
Synaptipy is a cross-platform, open-source application for the visualization and analysis of electrophysiological recordings. It is designed around a modular, extensible architecture that supports interactive single-recording analysis, large-scale batch processing, and integration of custom user-written analysis routines via a plugin interface. The primary focus is whole-cell patch-clamp and intracellular recordings; however, any electrophysiology signal whose file format is supported by the Neo I/O library can be loaded, visualised, and processed - including extracellular, sharp-electrode, and multi-channel recordings. File-format support is therefore not a limitation of Synaptipy itself but of the underlying Neo reader for a given format.
Analysis Capabilities
Synaptipy provides 15 built-in analysis routines organised into five core module tabs, each available interactively in the GUI and as a composable unit in the batch processing pipeline.
Tab 1: Intrinsic Properties
- Baseline (RMP) - mean or median membrane potential measured over a user-defined quiescent window
- Input Resistance - delta-V / delta-I from a voltage response to a hyperpolarising current step; auto-detects step edges from the stimulus derivative and falls back gracefully when auto-detection fails
- Tau (Time Constant) - single-exponential fit to the voltage decay after a current step; returns NaN with a clear error flag when the fit fails
- Sag Ratio (Ih) - quantifies hyperpolarisation-activated sag as the peak-to-steady-state voltage ratio; includes rebound depolarisation measurement after stimulus offset
- I-V Curve - current-voltage relationship across a multi-trial step protocol; fits aggregate Rin from the slope
- Capacitance - membrane capacitance derived from Tau/Rin in current-clamp, or from capacitive-transient integration in voltage-clamp
Tab 2: Spike Analysis
- Spike Detection - threshold-crossing AP detection with refractory period filtering; extracts per-spike amplitude, half-width, rise time, decay time, threshold voltage, and after-hyperpolarisation (AHP)
- Phase Plane - dV/dt vs. voltage trajectory for AP initiation dynamics; detects threshold via a kink-slope criterion and reports mean threshold voltage and maximum dV/dt
Tab 3: Excitability
- Excitability (F-I Curve) - multi-trial rheobase, F-I slope, maximum firing frequency, and spike-frequency adaptation ratio; generates a popup F-I scatter plot
- Burst Analysis - max-ISI burst detection; reports burst count, mean spikes per burst, mean burst duration, and intra-burst frequency
- Spike Train Dynamics - ISI statistics including mean ISI, coefficient of variation (CV), local variation (LV), and CV2; generates a popup ISI plot
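An F-I summary of the kind described above might be computed from step currents and spike counts as follows (toy data; the function names and the linear-fit choice for the slope are assumptions):

```python
import numpy as np

def fi_summary(currents_pa, spike_counts, duration_s=1.0):
    """Rheobase (smallest current evoking at least one spike) and F-I slope
    (Hz/pA) from a linear fit over the suprathreshold points."""
    currents = np.asarray(currents_pa, dtype=float)
    rates = np.asarray(spike_counts, dtype=float) / duration_s   # Hz
    firing = rates > 0
    rheobase = currents[firing].min()
    slope = np.polyfit(currents[firing], rates[firing], 1)[0]
    return rheobase, slope

# 1 s current steps from 0 to 200 pA in 50 pA increments
rheobase, slope = fi_summary([0, 50, 100, 150, 200], [0, 0, 5, 10, 15])
```
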
Tab 4: Synaptic Events
- Event Detection (Threshold) - prominence-based threshold detection that accommodates shifting baselines and overlapping events; interactive event markers with click-to-remove and Ctrl+click-to-add
- Event (Template Match) - matched-filter cross-correlation using a bi-exponential kernel (user-defined rise/decay tau); a fixed bank of three kernels scaled at 1x, 2x, and 3x the decay constant provides tolerance for dendritic filtering
- Event (Baseline Peak) - direct baseline-to-peak amplitude detection with kinetics estimation for evoked or spontaneous events
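Prominence-based detection of this kind can be sketched with scipy.signal.find_peaks, whose prominence measure is computed relative to the local baseline and is therefore tolerant of slow drift (illustrative only, not Synaptipy's detector):

```python
import numpy as np
from scipy.signal import find_peaks

def detect_events(trace, fs, prominence, polarity=-1):
    """Detect events by peak prominence; polarity=-1 flips inward
    (negative-going) events upward before detection."""
    peaks, props = find_peaks(polarity * trace, prominence=prominence)
    return peaks / fs, props["prominences"]

fs = 10_000.0
t = np.arange(0, 1, 1 / fs)
trace = 0.02 * t                      # slow baseline drift
for onset in (0.2, 0.6):              # two inward events, amplitude 0.5
    i = int(onset * fs)
    trace[i:i + 100] -= 0.5 * np.exp(-np.arange(100) / 30)

times, proms = detect_events(trace, fs, prominence=0.2)
```

Despite the drifting baseline, only the two true events exceed the prominence criterion.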
Tab 5: Optogenetics
- Optogenetic Synchronisation - extracts TTL/digital stimulus pulses from a secondary channel and correlates them with spikes or synaptic events to compute optical latency, response probability, and jitter
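Latency, response probability, and jitter from paired stimulus and response times can be sketched as below (toy values; the 50 ms response window and function names are illustrative assumptions):

```python
import numpy as np

def opto_stats(stim_times, spike_times, window=0.05):
    """First-spike latency (mean), response probability, and jitter
    (SD of latency) within `window` seconds after each stimulus."""
    spikes = np.asarray(spike_times, dtype=float)
    latencies = []
    for s in stim_times:
        after = spikes[(spikes > s) & (spikes <= s + window)]
        if after.size:
            latencies.append(after[0] - s)
    latencies = np.asarray(latencies)
    prob = len(latencies) / len(stim_times)
    return latencies.mean(), prob, latencies.std()

# Three light pulses; the cell responds to the first two with 5-7 ms latency
latency, prob, jitter = opto_stats([1.0, 2.0, 3.0], [1.005, 2.007, 3.2])
```
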
Extensibility and Plugin Interface
Synaptipy is built around a central AnalysisRegistry that maps named analysis functions to the GUI and batch engine via a decorator. Any Python script placed in ~/.synaptipy/plugins/ that uses the @AnalysisRegistry.register decorator is automatically discovered at startup and made available in both the interactive analyser and the batch processing pipeline - no modification to the core package is required.
A fully documented template (src/Synaptipy/templates/analysis_template.py) defines the required function signature and return types, enabling researchers to integrate custom algorithms without any knowledge of the GUI internals.
Hot-Reloadable Plugin Ecosystem
Plugins are first-class citizens, not an afterthought. The plugin system is designed for zero-friction iteration:
- No restart required for toggling: when the user checks or unchecks "Enable Custom Plugins" in Edit > Preferences, Synaptipy calls PluginManager.reload_plugins() to purge all plugin-contributed entries from the AnalysisRegistry and re-execute every plugin file discovered on disk. It then calls AnalyserTab.rebuild_analysis_tabs() to regenerate the entire Analyser tab UI from the updated registry - all within the running process, with no application restart needed.
- Scoped unregistration: only plugin-sourced analyses (those flagged as source="plugin" in registry metadata) are removed during a reload. Built-in analyses are untouched.
- Two discovery paths: examples/plugins/ is scanned first (bundled examples), then ~/.synaptipy/plugins/ (user additions). A user copy with the same filename always takes precedence, enabling personalised variants without modifying the Synaptipy installation.
- Crash isolation: a syntax error or import failure in one plugin is caught and logged; remaining plugins still load and appear in the UI.
Plug & Play Data Export: The batch engine processes custom plugin outputs dynamically. Any key in the metrics dict returned by a plugin wrapper automatically generates its own CSV column during batch export.
Plugin Return Schema: Every wrapper function must return a nested dict:
return {
"module_used": "my_plugin",
"metrics": {"Val1": 1.0, "Val2": 2.0},
}
Private keys (prefixed with _) pass data to plot overlays without appearing in the results table. See docs/extending_synaptipy.md for the full specification.
Visualization
- OpenGL-accelerated trace rendering via PyQtGraph capable of displaying multi-million sample recordings at interactive frame rates
- Tree-based multi-file explorer with synchronised analysis view
- Interactive zooming, panning, and per-channel scaling
- Batch result overlays and popup plots (I-V curves, F-I curves, phase planes) generated directly within the GUI
Cross-File Trial Averaging
Synaptipy's Explorer tab supports manual grand-average construction across any combination of files and trials:
- While in "Cycle Single Trial" mode, click "Add Current Trial to Avg Set" to capture the currently displayed trial (including any active preprocessing pipeline transforms).
- Navigate freely to other files - even files of different durations or recording protocols - and continue adding trials. The global selection accumulates across the entire session.
- Enable "Plot Selected Avg" to overlay the grand average on the current recording view.
- Shape-mismatch safety: when computing the average, each trial is accumulated with sum[:min_len] += trial[:min_len], where min_len = min(len(accumulator), len(trial)). This dynamically truncates to the shortest array in the selection, preventing NumPy broadcast errors when recordings have different durations. The resulting average is plotted against the time vector of the first trial added.
- The selection persists until the user clears it, enabling iterative comparison as new files are loaded.
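A minimal numpy sketch of the truncating accumulation described above (illustrative only; each addition is restricted to the common prefix, so the result has the length of the shortest trial in the selection):

```python
import numpy as np

def truncating_average(trials):
    """Average trials of unequal length by truncating the accumulator to
    the shorter of itself and each incoming trial."""
    acc = np.array(trials[0], dtype=float)
    for trial in trials[1:]:
        m = min(len(acc), len(trial))
        acc = acc[:m] + np.asarray(trial[:m], dtype=float)
    return acc / len(trials)

avg = truncating_average([np.ones(5), 3 * np.ones(3)])
```
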
Batch Processing
- Composable pipeline architecture: chain any registered analysis steps in sequence
- Background execution in worker threads - the GUI remains responsive during batch runs
- Automatic metadata extraction (sampling rate, gain, recording datetime)
- Results exported to CSV (wide-format scalars + long-format event arrays), compatible with Python/Pandas, R, and MATLAB
- NWB 2.x export with icephys sweep tables, a 3-step stimulus fallback, and a discrete-event ProcessingModule for FAIR data archival
Installation
Synaptipy is available both as a standalone application and as a Python package.
Download and Installation
You do not need to install Python or any dependencies to run Synaptipy. We provide pre-compiled, standalone applications for all major operating systems.
Download the beta release from the GitHub Releases page.
Choose the correct file for your operating system from the release assets:
- Windows: Download Synaptipy_Setup_v0.1.1b6.exe and run the installer.
- macOS: Download Synaptipy_v0.1.1b6.dmg, open the disk image, and drag Synaptipy to your Applications folder.
- Linux: Download Synaptipy-v0.1.1b6-x86_64.AppImage, make the file executable (chmod +x Synaptipy-v0.1.1b6-x86_64.AppImage), and run it directly.
Python Package Installation
For researchers who wish to use Synaptipy programmatically or develop custom plugins, you can install it via conda / pip:
Install from TestPyPI (pre-release beta)
Note: TestPyPI is used for pre-release testing. Most dependencies (scipy, numpy, etc.) are not available there, so you must include --extra-index-url to pull them from real PyPI:
pip install \
-i https://test.pypi.org/simple/ \
--extra-index-url https://pypi.org/simple/ \
synaptipy
Without --extra-index-url, pip will fail trying to build scipy from source using TestPyPI-only packages.
Prerequisites
A working git installation and a conda distribution (Miniconda or Anaconda) are required.
Setup Instructions
- Clone the Repository
git clone https://github.com/anzalks/synaptipy.git
cd synaptipy
- Create the Environment
This step installs Python and all required system dependencies defined in environment.yml.
conda env create -f environment.yml
- Activate the Environment
conda activate synaptipy
- Install the Application
Install the package in editable mode to allow for local development.
pip install -e ".[dev]"
Verification
To verify the installation, execute the comprehensive test suite:
python -m pytest
Quick Start
Get from installation to your first analysis in under 60 seconds:
- Launch the application from your terminal: synaptipy
- Load a recording - drag and drop an .abf, .nwb, .wcp, or any supported file into the Explorer tab. The traces render immediately with OpenGL-accelerated plotting.
- Analyse - click the Analyser tab. Select a channel, choose an analysis (e.g., Spike Detection or Input Resistance), adjust parameters if needed, and click Run. Results appear in the table below the plot and can be exported to CSV.
For headless / scripted use, the batch engine works without the GUI:
from Synaptipy.core.analysis.batch_engine import BatchAnalysisEngine
from pathlib import Path

engine = BatchAnalysisEngine()
results = engine.run_batch(
    [Path("recording.abf")],
    [{"analysis": "spike_detection", "scope": "all_trials",
      "params": {"threshold": -20.0, "refractory_ms": 2.0}}],
)
print(results)
Usage
Graphical Interface
Launch the main application window:
synaptipy
Alternatively, run the module directly:
python -m Synaptipy.application
Programmatic Analysis
The core analysis engine can be utilized in scripts for headless processing:
from Synaptipy.core.analysis.batch_engine import BatchAnalysisEngine
from pathlib import Path
# Initialize the Analysis Engine
engine = BatchAnalysisEngine()
# Define an Analysis Pipeline
pipeline = [
    {
        'analysis': 'spike_detection',
        'scope': 'all_trials',
        'params': {'threshold': -20.0, 'refractory_ms': 2.0},
    }
]
# Execute on Data Files
file_path = Path("data/example_recording.abf")
results = engine.run_batch([file_path], pipeline)
print(results)
Documentation
Full API reference, tutorials, and the developer guide are hosted on ReadTheDocs: synaptipy.readthedocs.io
Contributing
Collaborations and contributions are welcome. Whether you are adding a new analysis module, supporting an additional file format, or improving the documentation, please refer to the developer guide for project structure, coding standards, and the contribution workflow. The plugin interface provides the lowest-friction path to integrating custom analysis routines.