photon-tools
Lightweight Python tools for loading, inspecting, and screening single-molecule photon counting data.
photon-tools is designed for interactive, notebook-based workflows commonly used in single-molecule fluorescence experiments.
The focus is on data loading, standardization, and visual inspection, not on enforcing a specific analysis pipeline.
photon-tools provides:
- a clean and extensible data loading layer
- a standardized in-memory data model
- fast, interactive Plotly-based previews
- a Jupyter-based browser for screening and annotating many files
Features
✔ Data loading
- Built-in loader for Photon-HDF5
- Unified data representation via `PhotonDataset` / `PhotonData`
- Optional runtime registration of custom loaders (no forking required)
✔ Clean data model
- Integer timestamps (ticks)
- Optional detector/channel information
- Explicit `timing_resolution` (seconds per tick)
- Safe conversion to physical time
- Easy splitting by detector channel
✔ Interactive preview (Plotly)
- Fast binning of large photon streams
- Multiple detector channels in one plot
- Clickable legend to enable/disable channels
- Scroll-wheel zoom + mouse pan
- Physically meaningful defaults (axes clamped to zero)
- Fully customizable via the returned Plotly `Figure`
✔ Screening workflow (Jupyter)
- Browse many files interactively
- Next / Previous navigation
- Visual evaluation instead of purely numeric filtering
- Mark files as keep / reject
- Store annotations and notes in a CSV file
Installation
Create a virtual environment (recommended):
python -m venv .venv
source .venv/bin/activate
Install from PyPI:
pip install photon-tools
Required dependencies:
- numpy
- h5py
- plotly
- nbformat
- ipywidgets
- pandas
Basic Usage
See notebooks/01_quickstart.ipynb for a runnable version of this section.
Load a Photon-HDF5 file
import photon_tools as pt
ds = pt.load(
    "measurement.hdf5",
    timing_resolution=5e-9,  # seconds per tick
)
Access data
ds.photons.timestamps # raw integer timestamps (ticks)
ds.photons.times_s # physical time in seconds
ds.photons.detectors # detector/channel IDs
Split by detector channel:
by_ch = ds.photons.by_detector()
t_ch0 = by_ch[0]
t_ch1 = by_ch[1]
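Conceptually, the per-channel split is just a boolean mask over the detector array. A minimal numpy sketch of this grouping (illustrative only, not the package's internal implementation; the example data is made up):

```python
import numpy as np

# Hypothetical example data: interleaved photons from two detectors
timestamps = np.array([10, 25, 40, 55, 70, 95], dtype=np.int64)  # ticks
detectors = np.array([0, 1, 0, 0, 1, 1], dtype=np.int8)

# Group timestamps by detector ID, as by_detector() does conceptually
by_ch = {ch: timestamps[detectors == ch] for ch in np.unique(detectors)}
```

Each value in `by_ch` is a view-like array of the timestamps recorded on that channel, preserving arrival order.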
Interactive Preview
Quick visual inspection of a time trace:
pt.preview(ds, bin_width_ms=10)
Customize appearance and detector labels:
pt.preview(
    ds,
    bin_width_ms=5,
    detector_labels={0: "donor", 1: "acceptor"},
    colors={0: "royalblue", 1: "firebrick"},
    width=1000,
    height=400,
)
Further customization via Plotly:
fig = pt.preview(ds, show=False)
fig.update_yaxes(type="log")
fig.show(config={"scrollZoom": True})
Because preview() returns a Plotly Figure, all Plotly features remain available.
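Before plotting, preview() has to bin the photon stream into counts per time window. A rough numpy sketch of that binning step, assuming sorted integer timestamps and a known timing resolution (this is an illustration, not the package's actual code):

```python
import numpy as np

def bin_trace(timestamps, timing_resolution, bin_width_ms):
    """Bin sorted photon timestamps (ticks) into counts per time bin."""
    times_s = timestamps * timing_resolution         # ticks -> seconds
    bin_width_s = bin_width_ms * 1e-3
    n_bins = int(np.ceil(times_s[-1] / bin_width_s)) or 1
    edges = np.arange(n_bins + 1) * bin_width_s
    counts, _ = np.histogram(times_s, bins=edges)
    return edges[:-1], counts                        # left bin edges, counts

# Hypothetical data: 6 photons over ~30 ms at 5 ns per tick
ts = np.array([1e6, 1.5e6, 2.5e6, 4.2e6, 5e6, 5.9e6], dtype=np.int64)
edges, counts = bin_trace(ts, 5e-9, bin_width_ms=10)
```

The counts-per-bin array is what ends up on the y-axis of the preview trace.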
Screening Many Files (Notebook Browser)
See notebooks/02_screening_browser.ipynb for a runnable version of this workflow.
The browser allows you to:
- step through many measurement files
- inspect traces interactively (zoom, pan, toggle channels)
- mark files as keep or reject
- add free-text notes
- store all annotations in a CSV file
This workflow is intended for expert-driven screening, where visual judgment is essential and cannot be replaced by scalar metrics alone.
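The annotation store itself is plain tabular data. A minimal sketch of a keep/reject record with pandas (the column names and file name here are illustrative, not necessarily those used by the browser):

```python
import pandas as pd

# Illustrative annotation table: one row per screened file
annotations = pd.DataFrame(
    [
        {"file": "measurement_001.hdf5", "decision": "keep", "note": "clean trace"},
        {"file": "measurement_002.hdf5", "decision": "reject", "note": "bleaching"},
    ]
)
annotations.to_csv("screening.csv", index=False)

# Later: reload the CSV and select only the kept files
kept = pd.read_csv("screening.csv").query("decision == 'keep'")["file"].tolist()
```

Storing decisions as a CSV keeps the screening result inspectable and editable outside the notebook.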
Custom Loaders
See notebooks/03_custom_loaders.ipynb for a runnable version of this section.
Custom file formats can be supported without modifying or forking the package.
Define a loader function and register it at runtime:
def my_loader(path):
    ...
    return PhotonDataset(...)

pt.register_loader(".dat", loader=my_loader)
ds = pt.load("custom_format.dat")
This allows extending photon-tools in notebooks or scripts in a lightweight and flexible way.
Files without extensions
Some binary formats (e.g. custom NI time-tagged data) do not use file extensions. In this case, the loader must be specified explicitly:
ds = pt.load(
    "measurement_001",
    loader=pt.load_ni_binary,
    timing_resolution=10e-9,
)
Data Model
photon-tools uses a small, explicit, immutable data model to represent
photon-counting data and scan images in memory.
The goal of this model is not to mirror file formats, but to provide a stable and analysis-friendly abstraction layer that decouples:
- file I/O
- experimental setup specifics
- downstream analysis and visualization
All loaders (built-in or custom) convert raw data into this common model.
PhotonDataset
PhotonDataset is the top-level container returned by all loaders.
PhotonDataset(
    photons: PhotonData | None,
    images: dict[str, ImageData],
    meta: dict[str, Any],
    raw: dict[str, Any],
    source: str | None,
)
Fields:
- `photons`: time-tagged photon data (`PhotonData`). May be `None` for pure image data.
- `images`: mapping of image identifiers to `ImageData` objects (e.g. `"scan"`, `"preview"`, `"apd_sum"`).
- `meta`: high-level, standardized metadata (sample name, setup, excitation power, etc.).
- `raw`: loader-specific metadata and diagnostics (file paths, header dumps, original parameters).
- `source`: optional string identifying the data source (file path, measurement ID, etc.).
PhotonDataset is intentionally lightweight and does not enforce
any analysis workflow.
PhotonData
PhotonData represents a single stream of photon arrival times.
PhotonData(
    timestamps: np.ndarray,
    detectors: np.ndarray | None = None,
    nanotimes: np.ndarray | None = None,
    timing_resolution: float | None = None,
    unit: str = "ticks",
)
Concepts:
- `timestamps`: integer macro-times (usually hardware clock ticks).
- `timing_resolution`: seconds per tick (e.g. `5e-9`). Required to convert timestamps into physical time.
- `detectors`: optional detector/channel assignment per photon (0, 1, 2, ...). If absent, the data is treated as a single channel.
- `nanotimes`: optional microtime / TCSPC information (same length as `timestamps`).
Key properties & helpers:
ds.photons.times_s # timestamps converted to seconds
ds.photons.by_detector() # split timestamps by detector ID
Design notes:
- `PhotonData` is immutable.
- No implicit unit conversions.
- Missing information (e.g. timing resolution) raises explicit errors.
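The "explicit errors" policy can be pictured as a simple guard on conversion. An illustrative stand-in class (this is a sketch of the behavior described above, not the package's source):

```python
import numpy as np

class PhotonDataSketch:
    """Illustrative stand-in for PhotonData's conversion behavior."""

    def __init__(self, timestamps, timing_resolution=None):
        self.timestamps = np.asarray(timestamps)
        self.timing_resolution = timing_resolution

    @property
    def times_s(self):
        # No silent fallback: converting ticks without a timing
        # resolution is an explicit error, not a guess.
        if self.timing_resolution is None:
            raise ValueError(
                "timing_resolution is required to convert ticks to seconds"
            )
        return self.timestamps * self.timing_resolution
```

Failing loudly here is what lets downstream code trust that `times_s` is always in real seconds.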
ImageData
ImageData represents multi-channel 2D scan images.
ImageData(
    channels: Mapping[str, np.ndarray],
    meta: ImageMeta,
    raw: dict[str, Any] = {},
)
Fields:
- `channels`: dictionary mapping channel names to 2D arrays (e.g. `"detector0"`, `"detector1"`, `"sum"`).
- `meta`: physical scan metadata (`ImageMeta`).
- `raw`: loader-specific information (binary headers, parsing details).
All image channels must:
- be 2D
- share the same shape
ImageMeta
ImageMeta stores physical scan parameters in explicit units.
ImageMeta(
    n_pixels_x: int,
    n_pixels_y: int,
    range_x_um: float,
    range_y_um: float,
    offset_x_um: float = 0.0,
    offset_y_um: float = 0.0,
    pixel_dwell_time_s: float | None = None,
)
This allows downstream code to:
- display axes in µm
- compute pixel sizes
- remain independent of scan file formats
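For example, the physical pixel size follows directly from these fields. A minimal sketch using a stand-in dataclass that mirrors the documented fields (the `pixel_size_um` helper is an illustration, not a documented API):

```python
from dataclasses import dataclass

@dataclass
class ImageMetaSketch:
    """Illustrative stand-in mirroring ImageMeta's documented fields."""
    n_pixels_x: int
    n_pixels_y: int
    range_x_um: float
    range_y_um: float

    @property
    def pixel_size_um(self) -> tuple[float, float]:
        # Physical size of one pixel in µm, from scan range and pixel count
        return (self.range_x_um / self.n_pixels_x,
                self.range_y_um / self.n_pixels_y)

meta = ImageMetaSketch(n_pixels_x=256, n_pixels_y=256,
                       range_x_um=12.8, range_y_um=12.8)
# 12.8 µm over 256 pixels -> 0.05 µm per pixel
```

Because the scan range and pixel counts live in the metadata, plotting code never needs to know which file format the image came from.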
Writing Custom Loaders
Custom loaders should:
- Parse the raw file format
- Convert data into `PhotonData` and/or `ImageData`
- Populate the `meta` and `raw` dictionaries as needed
- Return a `PhotonDataset`
Minimal example:
def my_loader(path, **kwargs):
    # ... parse the file into integer timestamps `ts` and detector IDs `det` ...
    photons = PhotonData(
        timestamps=ts,
        detectors=det,
        timing_resolution=5e-9,
    )
    return PhotonDataset(
        photons=photons,
        meta={"format": "custom"},
        raw={"path": str(path)},
        source=str(path),
    )
The loader does not need to:
- perform binning
- normalize data
- apply analysis logic
Those steps are intentionally left to the user.
Data Model Diagram
PhotonDataset
├─ photons: PhotonData | None
│ ├─ timestamps: int ticks (N)
│ ├─ detectors: int ids (N) | None
│ ├─ nanotimes: int microtimes (N) | None
│ └─ timing_resolution: seconds per tick | None
│
├─ images: dict[str, ImageData]
│ └─ ImageData
│ ├─ channels: {name -> 2D array (H,W)} (all same shape)
│ ├─ meta: ImageMeta (µm + seconds)
│ └─ raw: loader-specific dict
│
├─ meta: dict (standardized, analysis-friendly)
├─ raw: dict (loader-specific diagnostics)
└─ source: str | None
Design Philosophy
- Notebook-first
- Explicit over implicit
- No silent assumptions
- Visual inspection before automation
- Keep the core lightweight; downstream analysis is user-specific
photon-tools is not an analysis framework — it is a foundation for interactive and exploratory workflows.
Status
This project is under active development and tailored to real experimental workflows.
APIs may evolve, but changes are made conservatively and with practical use cases in mind.
Todos
- add support for loading pixel images from binary files
- add ttl data
- add support for setup3
- naming convention: timestamps -> macro times vs micro times vs nano times -> tt vs mt vs ttl