A library to read, emulate, and forward Micromed data in standard formats

See the online documentation.

Main features:

  • simulate online data from a TRC file

  • push online TCP data to an LSL server

  • convert TRC to MNE format

  • rename TRC files to include the recording datetime

Install

$ pip install micromed-io

Convert a Micromed (.trc) file to MNE (.fif) format

from micromed_io.to_mne import create_mne_from_micromed_recording
mne_raw = create_mne_from_micromed_recording("path/to/file.TRC")

Emulate TRC to TCP & read/forward to LSL server

See details in the next sections.

StreamPlayer

Emulate Online Micromed TCP from .trc file

$ mmio_emulate_trc --file=../data/sample.TRC --address=localhost --port=5123

Emulate Micromed's online data stream to test your real-time platform. See all the arguments and adapt them as needed:

$ mmio_emulate_trc --help # to see all arguments

Read TCP and push to LSL Stream

$ mmio_tcp_to_lsl --address=localhost --port=5123

While receiving online data through TCP, this command forwards the data to 3 LSL stream outlets:

  • Micromed_EEG: the EEG data in float32 format [n_channels, n_samples]

  • Micromed_Markers: markers, if any, in int32 format [sample, marker] (2 channels)

  • Micromed_Notes: notes, if any, in string format [sample, note] (2 channels)
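As a rough sketch of the data layout only (not the library's actual code), the chunks pushed to each outlet would look like this, with hypothetical sizes chosen for illustration:

```python
import numpy as np

# Hypothetical sizes for illustration only
n_channels, n_samples = 4, 128

# Micromed_EEG outlet: float32 matrix, one row per channel
eeg_chunk = np.zeros((n_channels, n_samples), dtype=np.float32)

# Micromed_Markers outlet: int32 pairs [sample, marker]
marker = np.array([1024, 7], dtype=np.int32)

# Micromed_Notes outlet: string pairs [sample, note]
note = [str(2048), "stimulation start"]
```

The 2-channel convention means each marker or note travels with the sample index at which it occurred, so consumers can realign events with the EEG stream.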

You can easily change the LSL parameters:

$ mmio_tcp_to_lsl --help # to see all arguments

Read TRC file

from micromed_io.trc import MicromedTRC
mmtrc = MicromedTRC("sample.TRC")

Then you have access to the TRC data:

mmtrc.get_header()
mmtrc.get_markers()
mmtrc.get_data()
mmtrc.get_notes()

Read and parse Micromed TCP live data

Download tcp_to_lsl.py from the scripts/ directory of the GitHub repo:

$ python tcp_to_lsl.py --address=localhost --port=5123

Note: Micromed TCP behaves as a client. If you want to try the emulate/read TCP scripts together, launch the reader first (it acts as the server), then the emulator.
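The start-up ordering can be illustrated with a minimal stdlib socket sketch (hypothetical payload and an OS-chosen port, unrelated to the actual Micromed protocol): the reader must already be listening before the emulator, acting as a TCP client, connects.

```python
import socket
import threading

def reader(server_sock):
    # The "reader" side: accept one connection and receive a small payload
    conn, _ = server_sock.accept()
    with conn:
        return conn.recv(16)

# 1. Start the reader (server) first, so it is listening
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("localhost", 0))   # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

received = []
t = threading.Thread(target=lambda: received.append(reader(server)))
t.start()

# 2. Then start the emulator (client), which connects to the reader
with socket.create_connection(("localhost", port)) as client:
    client.sendall(b"eeg-bytes")

t.join()
server.close()
print(received[0])
```

If the client were launched first, `create_connection` would fail with a connection-refused error, which is why the reader must come up first.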

Rename TRC files with recording datetime

$ mmio_rename_trc --dirpath=./ --format=%Y%m%d-%H%M%S

Rename the TRC files in the given folder to include the recording date in the filename. The output is <filename>__<recording_date>.TRC. The format must be compliant with Python's strftime format codes.
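For instance, with the format shown above, a recording started on 14 March 2023 at 09:30:05 would be stamped as follows (plain Python, not the tool's internals; the filename is hypothetical):

```python
from datetime import datetime

# Hypothetical recording start time
recording_date = datetime(2023, 3, 14, 9, 30, 5)

# Same format string as the --format argument above
stamp = recording_date.strftime("%Y%m%d-%H%M%S")
print(stamp)                    # 20230314-093005
print(f"sample__{stamp}.TRC")   # sample__20230314-093005.TRC
```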

$ mmio_rename_trc --help # to see all arguments

Local install

Download the repo and:

$ conda env create -f environment.yml
$ conda activate mmio
$ poetry install

Please feel free to reach out if you want to contribute.
