Project description

scope-cli

Control Layer Interface for parallel microscopy system.

Introduction

  1. What is Trappy-Scopes?
  2. How do you use it?
  3. Why should you use it?

Installation

  1. Install through conda or pip (TODO):

    pip install trappyscopes
    conda install trappyscopes
    
    
  2. Alternatively, install directly from source and set up the environment using the built-in pip-based installer:

    git clone --recursive <repo_link>
    cd scope-cli
    python main.py --install
    

Configure the Scope

Get to know Trappy-Scope

  1. Basic information about application startup

    python main.py -h
    
  2. A little introduction can be summoned by calling intro().

Start-up and usage

  • Use ./trappyscope; a typical startup looks like:

     ./trappyscope -su UserName experiment_script.py
    
  • Start the control-layer utility in interactive mode with python -i main.py.

  • Scripts are an important part of running experimental procedures:

     python main.py <script1> <script2> <script3>
     trappyscope <script1> <script2> <script3>
     python main.py --iterate 3 <script1>  # run <script1> three times
    
  • The scripts are executed in sequence and can be used to load pre-defined experimental protocols.

  • Alternatively, to load and execute a script from within the interactive session:

     ScriptEngine.now(globals(), "scriptfile.py")
    

Start an experiment

  • All data-collection should be done within the context of an Experiment:

    exp = Experiment("test")
    

    You should get the following output:

    ────────────────────────────── Experiment open ─────────────────────────────────────────
    [17:21:08] INFO     Loading Experiment: test                                                                                                experiment.py:267
    Working directory changed to: /Users/byatharth/experiments/test
    .
    ├── .experiment
    ├── analysis
    ├── converted
    ├── experiment.yaml
    ├── postprocess
    └── sessions.yaml
    
    3 directories, 3 files
    
    user:ghost || ‹‹M1›› Experiment: test 
    >>>
    

The experiment features are described in the expframework submodule.

Describing one scope

A scope is described as a tree of devices: a combination of Processors (ProcessorGroup), Sensors, and Actuators. The scope configuration is defined in the deviceid.yaml configuration file. An example is given below:

name: MDev
uuid: null
type: microscope
frame:
- pico
- topplate
- lit
- diffuser
- lenses.asphere
- sample
- samplestage
- midplate
- zoomlens
- zoomlensholdplate
- camera
- baseplate
optics:
  lenses:
  - 120deg plastic asphere
  - ACL2520U
  - zoomlens
hardware:
  pico:
  - nullpico
  - pico1
  - nullpico
  camera: nullcamera
  illumination: CA_PWM_RGB_LED_5mm
git_sync: false
write_server: ssd1
file_server: smb://files1.igc.gulbenkian.pt/igc/folders/LP/yatharth
auto_fsync: false
auto_pico_fsync: true

Describing N-scopes

Multiple scopes are set up by placing a configuration file on each scope. Once configured, the network layer connects the scopes to the laboratory hive, where all scopes can be accessed on the fly.

How to do Science on the scopes?

An Experiment

The data and metadata collection for any experiment is handled through the Experiment class. Its primary role is to manage storage for each experiment. Creating an instance immediately changes the working directory to the experiment's directory.

Unique ID

Each experiment is also assigned a 10-character hexadecimal unique ID. Example: e8423b83d2.
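An ID of this form can be generated with the standard library; a minimal sketch (the project's actual generator may differ, and new_experiment_id is an illustrative name):

```python
import secrets

def new_experiment_id() -> str:
    # 5 random bytes -> 10 lowercase hexadecimal characters,
    # e.g. "e8423b83d2"
    return secrets.token_hex(5)
```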

File Structure

Each experiment has the following directory structure:

Experiment_name
		|- .experiment 			        (identifier)
		|- experiment.yaml          (event logs)
		|- data1, data2, data3, ... (data - in the repository)
		|- postprocess              (postprocessed data)
		|- converted                (online conversion - eg. between video formats)
		|- analysis                 (analysis results)

Flow of Control (TODO: check against the current version)

flowchart
	
	subgraph Setup
		Create-ID-files --> Create-folders --> Copy-Payload
	end
	
	subgraph Loading
		load(load events) --> CWD(Change working directory)
	end
	
	subgraph Deletion
		del -.calls.- close("exp.close()") --> Logs(Write event logs to yaml) --> CWDB(Change working <br>dir to original)
	end
	
	Setup --> Loading --> Deletion

LoadScript utility TODO

Configuration Files

  1. camconfig.yaml : Contains the camera configuration file for the default mode.
  2. deviceid.yaml : Contains the unique identity constants for the device.
  3. common.py : Contains common constants for all devices.

Current Sequence (TODO: obsolete)

flowchart LR
	print-flugg.header --> load-device_id --> complete-all-imports --> connect-pico --> Free-REPL

Hardware

The hardware is modelled as a device tree: a hierarchical collection of devices. All non-leaf nodes are Turing-complete computational devices.

assembly: 
 | rpi: null
 | cam: camera
 | pico: 
 | | lit: light
 | | beacon: beacon
 | | tandh: t&h sensor
 | 
 *
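The tree above can be modelled as nested records; a minimal sketch in which Device is an illustrative class, not the project's actual type:

```python
from dataclasses import dataclass, field

@dataclass
class Device:
    name: str
    kind: str = "device"
    children: list = field(default_factory=list)

    def is_leaf(self) -> bool:
        # leaf nodes are end devices; internal nodes are the
        # Turing-complete controllers (rpi, pico, ...)
        return not self.children

# the assembly from the diagram above
assembly = Device("assembly", "controller", [
    Device("rpi", "controller"),
    Device("cam", "camera"),
    Device("pico", "controller", [
        Device("lit", "light"),
        Device("beacon", "beacon"),
        Device("tandh", "t&h sensor"),
    ]),
])
```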

Hardware firmware

The hardware firmware is synced to the pico device in parts.

Pico Connection and FS Sync:

graph TD
	pico(Pico)
	
	open --mode--> pico
	sync --mode--> pico
	
---
title: "Pico Open Protocol"
---
graph TD
	Open-Pico --> connect-to-board -.-> Sync
	Sync --up--> pico_firmware[[pico_firmware]]
	Sync --up--> lights[[lights]]
	logs[[logs]] --down--> Sync
	

Device ID

Example Device ID file:


The default mode for parsing a device ID structure is to first cast each field to a container/collection type, then take the first value as the unique name and the second value, if present, as a universally unique identifier (UUID).
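That default rule can be sketched as follows; parse_id_field is an illustrative helper, not the project's API:

```python
def parse_id_field(value):
    """Coerce a field to a collection, then take the first entry
    as the unique name and the second, if present, as the UUID."""
    items = list(value) if isinstance(value, (list, tuple)) else [value]
    name = items[0]
    uuid = items[1] if len(items) > 1 else None
    return name, uuid
```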

Experiments

  1. The Experiment class manages the saving of data in specific folders and logs experiment events.
  2. A folder qualifies as an Experiment if it contains the .experiment file with the UUID of the experiment.
  3. The file <Experiment_name>.yaml contains the event logs of the experiment.

Project details


Download files

Download the file for your platform.

Source Distribution

trappyscopes-0.1.1.tar.gz (4.4 kB)


Built Distribution


trappyscopes-0.1.1-py3-none-any.whl (4.5 kB)


File details

Details for the file trappyscopes-0.1.1.tar.gz.

File metadata

  • Download URL: trappyscopes-0.1.1.tar.gz
  • Upload date:
  • Size: 4.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for trappyscopes-0.1.1.tar.gz
Algorithm Hash digest
SHA256 c517c1c0302b90013c35ab8415d5a1a2db71681ad1d3f9ae3fb17e9a9e8e854a
MD5 c11ea0882eb2fc461eb068d98eb4b8db
BLAKE2b-256 3449d54d0e89954beb07101438d1e82854076e32ffaaace25fd94d7693baff3b


Provenance

The following attestation bundles were made for trappyscopes-0.1.1.tar.gz:

Publisher: python-publish.yml on Trappy-Scopes/trappyscopes

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file trappyscopes-0.1.1-py3-none-any.whl.

File metadata

  • Download URL: trappyscopes-0.1.1-py3-none-any.whl
  • Upload date:
  • Size: 4.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.12.9

File hashes

Hashes for trappyscopes-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 509cc63826d48bd8e1c0c245c405b84f5a77df440999d27ee80b58d5233ac176
MD5 b9b07b994dd3367e8205d6fcb3ef867c
BLAKE2b-256 aa4f1668946100769217ef141cdc3c17893bf55d6a10e93c58a5e51e649ab716


Provenance

The following attestation bundles were made for trappyscopes-0.1.1-py3-none-any.whl:

Publisher: python-publish.yml on Trappy-Scopes/trappyscopes

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
