A Python toolkit to analyze photon timetrace data from qubit sensors
Project description
Qudi Hira Analysis
This toolkit automates a large portion of the data analysis in quantum sensing experiments where the primary raw data extracted is photon counts.
The high-level interface provides a set of functions to automate data import, handling and analysis. It is designed to be used from Jupyter notebooks, although the abstract interface also allows it to be integrated into larger, more general frameworks (with only some pain). Using the toolkit itself should only require a beginner-level understanding of Python.
It also aims to improve transparency and reproducibility in experimental data analysis. In an ideal scenario, two lines of code are sufficient to recreate all output data.
Python offers some very handy features like dataclasses, which are heavily used by this toolkit. Dataclasses offer a full OOP (object-oriented programming) experience while analyzing complex data sets. They give the data a solid and transparent structure, which reduces errors arising from data fragmentation. This generally comes at a large performance cost, but that cost is (largely) sidestepped by lazily loading data and storing only metadata wherever possible.
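As a minimal sketch of this pattern (a hypothetical class, not part of this package's API), a dataclass can hold the file path and metadata cheaply and defer reading the full dataset until it is first accessed:

from dataclasses import dataclass, field
from pathlib import Path

import pandas as pd


@dataclass
class LazyMeasurement:
    # Hypothetical illustration: only the path and metadata are held up front
    filepath: Path
    params: dict = field(default_factory=dict)
    _data: pd.DataFrame | None = field(default=None, repr=False)

    @property
    def data(self) -> pd.DataFrame:
        # The (potentially large) file is only read on first access
        if self._data is None:
            self._data = pd.read_csv(self.filepath)
        return self._data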
The visual structure of the toolkit is shown in the schema below. It largely consists of three portions:
- IOHandler: assumes a central store of raw data, which is never modified (read-only)
- DataHandler: automates the extraction of large amounts of data from the IOHandler interface
- AnalysisLogic: contains a set of automated fitting routines using lmfit internally (built on top of fitting routines from the qudi project)
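In practice this wiring boils down to two calls plus the per-measurement analysis attribute; a condensed sketch (paths are placeholders, and the calls mirror the examples further below):

from pathlib import Path

from qudi_hira_analysis import DataHandler

# DataHandler maps the central, read-only data store into measurement dataclasses
dh = DataHandler(data_folder=Path("C:\\Data"), figure_folder=Path("C:\\QudiHiraAnalysis"),
                 measurement_folder=Path("20230101_NV1"))
measurements = dh.load_measurements(measurement_str="rabi")

# Each MeasurementDataclass carries its data plus the AnalysisLogic fit routines
for m in measurements.values():
    print(m.filepath)   # raw file backing this measurement
    df = m.data         # extracted data as a DataFrame
    fits = m.analysis   # AnalysisLogic with the fitting routines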
The license of this project is located in the top-level folder under LICENSE. Some specific files contain their individual licenses in the file header docstring.
Schema
Overall
flowchart TD;
IOHandler<-- Handle all IO operations -->DataLoader;
DataLoader<-- Map IO callables to data -->DataHandler;
DataHandler-- Structure extracted data -->MeasurementDataclass;
MeasurementDataclass-- Plot fitted data --> Plot[Visualize data and add context in JupyterLab];
Plot-- Save plotted data --> DataHandler;
style MeasurementDataclass fill:#bbf,stroke:#f66,stroke-width:2px,color:#fff,stroke-dasharray: 5 5
Measurement Dataclass
flowchart LR;
subgraph Standard Data
MeasurementDataclass-->filepath1[filepath: Path];
MeasurementDataclass-->data1[data: DataFrame];
MeasurementDataclass-->params1[params: dict];
MeasurementDataclass-->timestamp1[timestamp: datetime];
MeasurementDataclass-- analysis --oAnalysisLogic;
end
subgraph Pulsed Data
MeasurementDataclass-- pulsed --oPulsedMeasurementDataclass;
PulsedMeasurementDataclass-- measurement --oPulsedMeasurement;
PulsedMeasurement--> filepath2[filepath: Path];
PulsedMeasurement--> data2[data: DataFrame];
PulsedMeasurement--> params2[params: dict];
PulsedMeasurementDataclass-- laser_pulses --oLaserPulses;
LaserPulses--> filepath3[filepath: Path];
LaserPulses--> data3[data: DataFrame];
LaserPulses--> params3[params: dict];
PulsedMeasurementDataclass-- timetrace --oRawTimetrace;
RawTimetrace--> filepath4[filepath: Path];
RawTimetrace--> data4[data: DataFrame];
RawTimetrace--> params4[params: dict];
end
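Assuming the attribute names follow the schema above, the nested parts of a pulsed measurement (loaded with load_measurements(..., pulsed=True), as in Example 1 below) can be reached like this (illustrative sketch):

# rabi: a pulsed MeasurementDataclass, e.g. from load_measurements(..., pulsed=True)
signal = rabi.pulsed.measurement.data       # PulsedMeasurement -> DataFrame
laser = rabi.pulsed.laser_pulses.data       # LaserPulses -> DataFrame
timetrace = rabi.pulsed.timetrace.data      # RawTimetrace -> DataFrame
metadata = rabi.pulsed.measurement.params   # parameter dict of the measurement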
Supports common fitting routines
Fit routines included in AnalysisLogic
| Dimension | Fit |
| --- | --- |
| 1d | decayexponential |
| | biexponential |
| | decayexponentialstretched |
| | gaussian |
| | gaussiandouble |
| | gaussianlinearoffset |
| | hyperbolicsaturation |
| | linear |
| | lorentzian |
| | lorentziandouble |
| | lorentziantriple |
| | sine |
| | sinedouble |
| | sinedoublewithexpdecay |
| | sinedoublewithtwoexpdecay |
| | sineexponentialdecay |
| | sinestretchedexponentialdecay |
| | sinetriple |
| | sinetriplewithexpdecay |
| | sinetriplewiththreeexpdecay |
| 2d | twoDgaussian |
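These routines wrap lmfit models internally. The sketch below shows the kind of model a name like sineexponentialdecay corresponds to, fitted directly with lmfit on synthetic data (illustrative only, not the toolkit's internal code):

import numpy as np
from lmfit import Model


def sine_exponential_decay(x, amplitude, frequency, phase, decay, offset):
    # Decaying sine of the kind used for Rabi-type data
    return amplitude * np.sin(2 * np.pi * frequency * x + phase) * np.exp(-x / decay) + offset


# Synthetic data purely for illustration
x = np.linspace(0, 5e-6, 200)
y = sine_exponential_decay(x, 0.3, 2e6, 0.0, 2e-6, 1.0) + np.random.normal(0, 0.02, x.size)

model = Model(sine_exponential_decay)
result = model.fit(y, x=x, amplitude=0.3, frequency=2e6, phase=0.0, decay=2e-6, offset=1.0)
print(result.fit_report())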
Inbuilt measurement tree visualizer
>>> tip_2S6 = DataHandler(data_folder="C:\\Data", figure_folder="C:\\QudiHiraAnalysis",
measurement_folder="20220621_FR0612-F2-2S6_uhv")
>>> tip_2S6.data_folder_tree()
# Output
├── 20211116_NetworkAnalysis_SampleIn_UpperPin.csv
├── 20211116_NetworkAnalysis_SampleOut_UpperPin.csv
├── 20211116_NetworkAnalysis_TipIn_LowerPin.csv
├── 20211116_NetworkAnalysis_TipIn_UpperPin.csv
├── 20211116_NetworkAnalysis_TipOut_LowerPin.csv
├── 20211116_NetworkAnalysis_TipOut_UpperPin.csv
├── ContactTestingMeasurementHead
│ ├── C2_Reference.txt
│ ├── C2_SampleLowerPin.txt
│ ├── C2_SampleUpperPin.txt
│ ├── C2_TipLowerPin.txt
│ └── C2_TipUpperPin.txt
├── Sample_MW_Pin_comparision.png
├── Tip_MW_Pin_comparision.png
└── Tip_Sample_MW_Pin_comparision.png
Automated data extraction
Example 1: Extract, fit and plot all Rabi measurements
from pathlib import Path
import seaborn as sns
from qudi_hira_analysis import DataHandler
nv1_handler = DataHandler(data_folder=Path("C:\\Data"), figure_folder=Path("C:\\QudiHiraAnalysis"),
measurement_folder=Path("20230101_NV1"))
rabi_measurements = nv1_handler.load_measurements(measurement_str="rabi", qudi=True, pulsed=True)
for rabi in rabi_measurements.values():
    sns.lineplot(x="Controlled variable(s)", y="Signal", data=rabi.data)
    fit_x, fit_y, result = rabi.analysis.fit(x="Controlled variable(s)", y="Signal", data=rabi.data,
                                             fit_function=rabi.analysis.sineexponentialdecay)
    sns.lineplot(x=fit_x, y=fit_y)
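Because the fit routines use lmfit internally, the returned result object can be inspected further (this assumes result is an lmfit ModelResult, e.g. the last fit from the loop above):

# Inspect the fit result returned by rabi.analysis.fit()
print(result.fit_report())                  # goodness-of-fit statistics
for name, param in result.params.items():
    print(name, param.value, param.stderr)  # best-fit values and uncertainties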
Example 2: Combine all temperature data, plot and save
from pathlib import Path
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns
from qudi_hira_analysis import DataHandler
nv1_handler = DataHandler(data_folder=Path("C:\\Data"), figure_folder=Path("C:\\QudiHiraAnalysis"),
measurement_folder=Path("20230101_NV1"))
temperature_measurements = nv1_handler.load_measurements(measurement_str="temperature-monitoring")
dft = pd.concat([t.data for t in temperature_measurements.values()])
fig, ax = plt.subplots()
sns.lineplot(x="Time", y="Temperature", data=dft, ax=ax)
nv1_handler.save_figures(fig, "temperature-monitoring")
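If the temperature log is densely sampled, downsampling before plotting keeps the figure manageable; a minimal sketch, assuming the Time column can be parsed as timestamps:

# Optional: resample the combined log to 10-minute means before plotting
dft["Time"] = pd.to_datetime(dft["Time"])
dft_10min = dft.set_index("Time").resample("10min").mean(numeric_only=True).reset_index()

fig, ax = plt.subplots()
sns.lineplot(x="Time", y="Temperature", data=dft_10min, ax=ax)
nv1_handler.save_figures(fig, "temperature-monitoring-10min")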
Getting Started
Prerequisites
Latest version of:
- git
- Poetry or conda
Clone the repository
git clone https://github.com/dineshpinto/qudi-hira-analysis.git
Installing dependencies with Poetry
poetry install
Add Poetry environment to Jupyter kernel
poetry run python -m ipykernel install --user --name=qudi-hira-analysis
OR installing dependencies with conda
Creating the conda environment
conda env create -f tools/conda-env-xx.yml
where xx is either win10, osx-intel or osx-apple-silicon.
Activate conda environment
conda activate qudi-hira-analysis
Add conda environment to Jupyter kernel
python -m ipykernel install --user --name=qudi-hira-analysis
Start the analysis
If installed with Poetry
poetry run jupyter lab
OR with conda
jupyter lab
Don't forget to switch to the qudi-hira-analysis kernel in JupyterLab.
Citation
If you are publishing scientific results, you can cite this work as: https://doi.org/10.5281/zenodo.7604670
Makefile
The Makefile located in notebooks/ is configured to generate a variety of outputs:

- make pdf: Converts all notebooks to PDF (requires a LaTeX backend)
- make html: Converts all notebooks to HTML
- make py: Converts all notebooks to Python (can be useful for VCS)
- make all: Sequentially runs all the notebooks in the folder

To use the make command on Windows you can install Chocolatey, then install make with choco install make.
File details
Details for the file qudi_hira_analysis-1.3.2.tar.gz.
File metadata
- Download URL: qudi_hira_analysis-1.3.2.tar.gz
- Upload date:
- Size: 56.5 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.2 CPython/3.10.8 Windows/10
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | df8e0fa1396791f95502637e0bdd26bb55338ced0648232be0ea29deb4e3d849 |
| MD5 | ae1a596ae675f0293db4b8cd79046207 |
| BLAKE2b-256 | c05897b0aa3c46ddf933cdec333289593abe7666c95ce5026a70ecbc4a9d59d3 |
File details
Details for the file qudi_hira_analysis-1.3.2-py3-none-any.whl.
File metadata
- Download URL: qudi_hira_analysis-1.3.2-py3-none-any.whl
- Upload date:
- Size: 68.7 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.2 CPython/3.10.8 Windows/10
File hashes
| Algorithm | Hash digest |
| --- | --- |
| SHA256 | 9beafb58b95d5ba3cbe7134cfcb16d53a5c9e6cf7f87de36f53777bb21666dab |
| MD5 | f6ee19b6a398f8e7ca3b71775a34f3c3 |
| BLAKE2b-256 | a9e9bb63aad6e00bb25120e4bb16b02eb5f02cba308e47a0d571fe4992e2adaf |