
A Python toolkit to analyze photon timetrace data from qubit sensors

Project description

Badges: DOI · PyPI version · Python 3.10 · Python 3.11

Qudi Hira Analysis

This toolkit automates a large portion of the work involved in analyzing data from quantum sensing experiments where the primary raw data is photon counts.

The high-level interface is abstracted and provides a set of functions to automate data import, handling, and analysis. It is designed to be used from Jupyter Notebooks, although the abstract interface also allows it to be integrated into larger, more general frameworks (with only some pain). Using the toolkit itself requires only a beginner-level understanding of Python.

It also aims to improve transparency and reproducibility in experimental data analysis. In an ideal scenario, two lines of code are sufficient to recreate all output data.
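
For instance, re-creating the processed data from an existing measurement folder boils down to two core lines, plus imports (an illustrative sketch using the DataHandler interface described below; the paths and measurement string are placeholders):

from pathlib import Path
from qudi_hira_analysis import DataHandler

# Illustrative: point the handler at the raw data store, then reload the measurements
nv1 = DataHandler(data_folder=Path("C:\\", "Data"), figure_folder=Path("C:\\", "QudiHiraAnalysis"), measurement_folder=Path("20230101_NV1"))
rabi_measurements = nv1.load_measurements(measurement_str="rabi", pulsed=True)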

Python offers some very handy features, such as dataclasses, which this toolkit uses heavily. Dataclasses provide a full object-oriented programming (OOP) experience when analyzing complex data sets: they give the data a solid and transparent structure that reduces errors arising from data fragmentation. This would normally come at a large performance cost, but that cost is (largely) sidestepped by lazy-loading data and storing only metadata wherever possible.
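
As a rough illustration of the pattern (a minimal sketch of the idea, not the toolkit's actual internals): the cheap metadata is stored eagerly, while the expensive file read is deferred until the data attribute is first accessed.

from dataclasses import dataclass, field
from pathlib import Path

import pandas as pd

@dataclass
class LazyMeasurement:
    # Cheap metadata is stored up front
    filepath: Path
    params: dict
    # The expensive payload is only read on first access
    _data: pd.DataFrame | None = field(default=None, repr=False)

    @property
    def data(self) -> pd.DataFrame:
        if self._data is None:
            self._data = pd.read_csv(self.filepath)
        return self._data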

Installation

pip install qudi-hira-analysis

Citation

If you are publishing scientific results, you can cite this work as: https://doi.org/10.5281/zenodo.7604670

Schema

The visual structure of the toolkit is shown in the schema below. It consists largely of three components:

  • IOHandler assumes a central store of raw data, which is never modified (read-only)
  • DataHandler automates the extraction of large amounts of data from the IOHandler interface
  • AnalysisLogic contains a set of automated fitting routines using lmfit internally (built on top of fitting routines from the qudi project)

The license of this project is located in the top-level folder under LICENSE. Some files contain their individual licenses in the file header docstring.

flowchart TD;
    IOHandler<-- Handle all IO operations -->DataLoader;
    DataLoader<-- Map IO callables to data -->DataHandler;
    DataHandler-- Structure extracted data -->MeasurementDataclass;
    MeasurementDataclass-- Plot fitted data --> Plot[Visualize data and add context in JupyterLab];
    Plot-- Save plotted data --> DataHandler;
    style MeasurementDataclass fill:#bbf,stroke:#f66,stroke-width:2px,color:#fff,stroke-dasharray: 5 5

Measurement Dataclass

flowchart LR;
    subgraph Standard Data
        MeasurementDataclass-->filepath1[filepath: Path];
        MeasurementDataclass-->data1[data: DataFrame];
        MeasurementDataclass-->params1[params: dict];
        MeasurementDataclass-->timestamp1[timestamp: datetime];
        MeasurementDataclass-- analysis --oAnalysisLogic;
    end
    subgraph Pulsed Data
        MeasurementDataclass-- pulsed --oPulsedMeasurementDataclass;
        PulsedMeasurementDataclass-- measurement --oPulsedMeasurement;
        PulsedMeasurement--> filepath2[filepath: Path];
        PulsedMeasurement--> data2[data: DataFrame];
        PulsedMeasurement--> params2[params: dict];
        PulsedMeasurementDataclass-- laser_pulses --oLaserPulses; 
        LaserPulses--> filepath3[filepath: Path];
        LaserPulses--> data3[data: DataFrame];
        LaserPulses--> params3[params: dict];
        PulsedMeasurementDataclass-- timetrace --oRawTimetrace;
        RawTimetrace--> filepath4[filepath: Path];
        RawTimetrace--> data4[data: DataFrame];
        RawTimetrace--> params4[params: dict];
    end
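
In practice, each loaded measurement can be explored attribute by attribute following the structure above; a sketch, assuming rabi is one of the pulsed measurements returned by load_measurements in the examples further below:

# Standard data
print(rabi.filepath)     # Path of the raw measurement file
print(rabi.timestamp)    # datetime of the measurement
print(rabi.params)       # dict of measurement parameters
print(rabi.data.head())  # DataFrame with the measurement data

# Pulsed data
print(rabi.pulsed.measurement.data.head())  # analyzed pulsed measurement
print(rabi.pulsed.laser_pulses.data.shape)  # extracted laser pulses
print(rabi.pulsed.timetrace.data.shape)     # raw photon timetrace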

Supports common fitting routines

Fit routines included in AnalysisLogic

  • 1d: decayexponential, biexponential, decayexponentialstretched, gaussian, gaussiandouble, gaussianlinearoffset, hyperbolicsaturation, linear, lorentzian, lorentziandouble, lorentziantriple, sine, sinedouble, sinedoublewithexpdecay, sinedoublewithtwoexpdecay, sineexponentialdecay, sinestretchedexponentialdecay, sinetriple, sinetriplewithexpdecay, sinetriplewiththreeexpdecay
  • 2d: twoDgaussian
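
Any of these routines can be passed to the fit method of AnalysisLogic. For example, an ODMR spectrum could be fitted with the double Lorentzian in the same way as the Rabi example further below (a sketch, assuming an odmr measurement loaded via load_measurements; the column names follow the Rabi example):

# Sketch: "odmr" is one value from nv1.load_measurements("odmr", pulsed=True)
fit_x, fit_y, result = odmr.analysis.fit(
  x="Controlled variable(s)", y="Signal",
  data=odmr.data,
  fit_function=odmr.analysis.lorentziandouble
)
print(result.best_values)  # fitted centers, widths and amplitudes (lmfit)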

Inbuilt measurement tree visualizer

>>> tip_2S6 = DataHandler(data_folder="C:\\Data", figure_folder="C:\\QudiHiraAnalysis",
...                       measurement_folder="20220621_FR0612-F2-2S6_uhv")
>>> tip_2S6.data_folder_tree()

# Output
├── 20211116_NetworkAnalysis_SampleIn_UpperPin.csv
├── 20211116_NetworkAnalysis_SampleOut_UpperPin.csv
├── 20211116_NetworkAnalysis_TipIn_LowerPin.csv
├── 20211116_NetworkAnalysis_TipIn_UpperPin.csv
├── 20211116_NetworkAnalysis_TipOut_LowerPin.csv
├── 20211116_NetworkAnalysis_TipOut_UpperPin.csv
├── ContactTestingMeasurementHead
│   ├── C2_Reference.txt
│   ├── C2_SampleLowerPin.txt
│   ├── C2_SampleUpperPin.txt
│   ├── C2_TipLowerPin.txt
│   └── C2_TipUpperPin.txt
├── Sample_MW_Pin_comparision.png
├── Tip_MW_Pin_comparision.png
└── Tip_Sample_MW_Pin_comparision.png

Automated data extraction

Example 1: Extract, fit and plot all Rabi measurements

from pathlib import Path
import matplotlib.pyplot as plt
import seaborn as sns

from qudi_hira_analysis import DataHandler

nv1 = DataHandler(
  data_folder=Path("C:\\", "Data"),
  figure_folder=Path("C:\\", "QudiHiraAnalysis"),
  measurement_folder=Path("20230101_NV1")
)

rabi_measurements = nv1.load_measurements(measurement_str="rabi", qudi=True, pulsed=True)

fig, ax = plt.subplots()

for rabi in rabi_measurements.values():
  sns.lineplot(data=rabi.data, x="Controlled variable(s)", y="Signal", ax=ax)
  fit_x, fit_y, result = rabi.analysis.fit(
    x="Controlled variable(s)", y="Signal",
    data=rabi.data,
    fit_function=rabi.analysis.sineexponentialdecay
  )
  sns.lineplot(x=fit_x, y=fit_y, ax=ax)

nv1.save_figures(filepath="rabi_variation", fig=fig)

Example 2: Combine all temperature data, plot and save

from pathlib import Path

import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns

from qudi_hira_analysis import DataHandler

nv1 = DataHandler(
  data_folder=Path("C:\\", "Data"),
  figure_folder=Path("C:\\", "QudiHiraAnalysis"),
  measurement_folder=Path("20230101_NV1"),
  copy_measurement_folder_structure=False
)

temperature_measurements = nv1.load_measurements(measurement_str="temperature-monitoring")

dft = pd.concat([t.data for t in temperature_measurements.values()])

fig, ax = plt.subplots()
sns.lineplot(data=dft, x="Time", y="Temperature", ax=ax)
nv1.save_figures(filepath="temperature-monitoring", fig=fig)

Build

Prerequisites

Latest version of:

  • Poetry (recommended) or conda package manager
  • git version control system

Clone the repository

git clone https://github.com/dineshpinto/qudi-hira-analysis.git

Installing dependencies with Poetry

poetry install

Add Poetry environment to Jupyter kernel

poetry run python -m ipykernel install --user --name=qudi-hira-analysis

OR installing dependencies with conda

Creating the conda environment

conda env create -f tools/conda-env-xx.yml

where xx is either win10, osx-intel or osx-apple-silicon.

Activate conda environment

conda activate qudi-hira-analysis

Add conda environment to Jupyter kernel

python -m ipykernel install --user --name=qudi-hira-analysis

Start the analysis

If installed with Poetry

poetry run jupyter lab

OR with conda

jupyter lab

Don't forget to switch to the qudi-hira-analysis kernel in JupyterLab.

Makefile

The Makefile located in notebooks/ is configured to generate a variety of outputs:

  • make pdf : Converts all notebooks to PDF (requires LaTeX backend)
  • make html: Converts all notebooks to HTML
  • make py : Converts all notebooks to Python (can be useful for VCS)
  • make all : Sequentially runs all the notebooks in the folder

To use the make command on Windows, you can install Chocolatey and then install make with choco install make.


File details

Details for the file qudi_hira_analysis-1.3.3.tar.gz.

File metadata

  • Download URL: qudi_hira_analysis-1.3.3.tar.gz
  • Upload date:
  • Size: 57.0 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.2 CPython/3.10.8 Windows/10

File hashes

Hashes for qudi_hira_analysis-1.3.3.tar.gz
Algorithm Hash digest
SHA256 035d59ec323e587cbe285ed4539c49b81b0c5b06e9aed6511f327bca2325f55a
MD5 a2479d9d0e11cd9a500e91fddae229c3
BLAKE2b-256 0cce5796b3b54d0c4fd6465310171e6855cce205aba7e53380c4a224da88d4a8

See more details on using hashes here.

File details

Details for the file qudi_hira_analysis-1.3.3-py3-none-any.whl.

File metadata

File hashes

Hashes for qudi_hira_analysis-1.3.3-py3-none-any.whl
Algorithm Hash digest
SHA256 1252e4176f185b9602200670c54e6cbb9cd72b627754262175437fe5ab821473
MD5 6e1b439beebd1ac70feb5eafc0b048e7
BLAKE2b-256 e0e2644583d6fd86e7d601b0e9d3c3daf825bebec98b4acf5d4914891ef13bf1

See more details on using hashes here.
