The MindAffect BCI python SDK
This repository contains the python SDK code for the Brain Computer Interface (BCI) developed by the company MindAffect.
Online Documentation and Tutorials
Available at: https://mindaffect-bci.readthedocs.io/
Installation
To install from source (currently the recommended method):
1. Clone or download this repository:
git clone https://github.com/mindaffect/pymindaffectBCI
2. Change to the directory where you cloned the repository, then add this module to the python path and install its dependencies:
pip install -e .
Alternatively, to install as a python library from PyPI:
pip install --upgrade mindaffectBCI
You can run a quick test of the installation, without any additional hardware, by running:
python3 -m mindaffectBCI.online_bci --acquisition fakedata
Essentially, this runs the SDK test code, which simulates a fake EEG source and then runs the full BCI sequence: decoder discovery, calibration and prediction.
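As a rough, purely illustrative sketch of what a simulated EEG source involves (this is not the SDK's actual fakedata implementation), one can generate band-limited background activity plus a small stimulus-locked response:

```python
import math
import random

def fake_eeg_sample(t, stimulus_on, n_channels=4):
    """Generate one multi-channel sample of simulated EEG at time t (seconds).

    Background activity is modelled as a 10 Hz 'alpha' oscillation plus
    Gaussian noise; when the stimulus is on, a small evoked response is added.
    (Illustrative only -- the real fakedata driver is more sophisticated.)
    """
    sample = []
    for ch in range(n_channels):
        background = 5.0 * math.sin(2 * math.pi * 10.0 * t + ch)  # alpha rhythm
        noise = random.gauss(0.0, 2.0)                            # sensor noise
        evoked = 3.0 if stimulus_on else 0.0                      # stimulus-locked response
        sample.append(background + noise + evoked)
    return sample

# Simulate one second of 4-channel data at 200 samples/second,
# with the stimulus toggling on/off every 10 samples (50 ms).
fs = 200.0
data = [fake_eeg_sample(i / fs, stimulus_on=(i % 20 < 10)) for i in range(int(fs))]
```

A real acquisition driver would additionally time-stamp each sample and stream it to the decoder over the network.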
If everything installed successfully, you should see a window like this open up.
<img src="docs/images/mainmenu.png" width=300>
If you now press 2 you should see a flickering grid of "buttons" like the one below. A random button will briefly flash green (it's the target), then the grid will rapidly flicker, and eventually the target will turn blue (to indicate it has been selected).
<img src="docs/images/selectionMatrix.png" width=300>
If all this works then you have successfully installed the mindaffectBCI python software. You should now ensure your hardware (display, amplifier) is correctly configured before jumping into BCI control.
Important: FrameRate Check
For rapid visual stimulation BCI (like the noisetagging BCI), it is very important that the visual flicker be displayed accurately. However, as the graphics performance of computers varies widely it is hard to know in advance if a particular configuration is accurate enough. To help with this we also provide a graphics performance checker, which will validate that your graphics system is correctly configured. You can run this with:
python3 -m mindaffectBCI.examples.presentation.framerate_check
As this runs it will show, in a window, your current graphics frame-rate and, more importantly, the variability in the frame times. For good BCI performance this jitter should be <1ms. If you see jitter greater than this you should adjust your graphics card settings. The most important setting to check is that [vsync](https://en.wikipedia.org/wiki/Screen_tearing#Vertical_synchronization) is turned on. Many graphics cards turn this off by default, as it (in theory) gives higher frame rates for gaming. However, for our system exact timing matters more than raw frame-rate, so always turn vsync on for visual Brain-Computer Interfaces!
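The idea behind the frame-timing check can be illustrated in a few lines of plain Python: timestamp successive "frames" and look at the spread of the inter-frame intervals. (In this sketch the per-frame wait is a simple `time.sleep`; in the real framerate check the wait is the graphics buffer-swap, i.e. vsync.)

```python
import time
import statistics

def measure_jitter(n_frames=100, target_interval=1 / 60):
    """Timestamp n_frames successive 'frames' and report interval statistics.

    Illustrative stand-in for the framerate check: here the frame wait is
    time.sleep; in a real graphics loop it would be the vsync'd buffer swap.
    """
    timestamps = []
    for _ in range(n_frames):
        time.sleep(target_interval)
        timestamps.append(time.perf_counter())
    intervals = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean_ms = statistics.mean(intervals) * 1000
    jitter_ms = statistics.stdev(intervals) * 1000  # want < 1 ms for good BCI
    return mean_ms, jitter_ms

mean_ms, jitter_ms = measure_jitter(20)
print(f"mean frame interval {mean_ms:.2f} ms, jitter {jitter_ms:.2f} ms")
```

Note that `time.sleep` itself has millisecond-scale jitter on most systems, which is exactly why the real check measures the vsync'd swap instead.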
Brain Computer Interface Test
If you have installed the software and correctly configured your hardware (display, amplifier) as above, then you can jump directly to trying a fully functional simple letter-matrix BCI using:
python3 -m mindaffectBCI.online_bci
If you run into an issue, you can raise it directly on the project's GitHub page.
This repository is organized roughly as follows:
- mindaffectBCI - contains the python package for the mindaffectBCI SDK. Important modules within this package are:
  - noisetag.py - the main API for developing user interfaces with BCI control.
  - utopiaController.py - the application-level API for interacting with the MindAffect decoder.
  - utopiaclient.py - the low-level networking functions for communicating with the MindAffect decoder - which is normally a separate computer running the EEG analysis software.
  - stimseq.py - the low-level functions for loading stimulus sequences (codebooks) - which define how the presented stimuli will look.
- decoder - contains our open-source python Brain Computer Interface decoder, for both on-line and off-line analysis of neuro-imaging data. Important modules within this package are:
  - decoder.py - the code for the on-line decoder.
  - offline_analysis.ipynb - a jupyter notebook showing how to run an off-line analysis of previously saved data from the mindaffectBCI, or of other publicly available BCI datasets.
- examples - contains python-based examples for the Presentation and Output parts of the BCI. Important sub-directories are:
  - output - Example output modules. An output module translates BCI selections into actions.
  - presentation - Example presentation modules. A presentation module presents the BCI stimulus to the user and is normally the main UI. In particular:
    - framerate_check.py - run this to test whether your display settings (particularly vsync) are correct for accurate flicker presentation.
    - selectionMatrix.py - run this as a simple example of using the mindaffectBCI to select letters from an on-screen grid.
  - utilities - Useful utilities, such as a simple raw-signal viewer.
  - acquisition - Example data-acquisition modules. An acquisition module interfaces with the EEG measurement hardware and streams time-stamped data to the hub.
- docs - contains the documentation.
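To give a feel for the codebook idea behind stimseq.py and the presentation modules above, here is a toy sketch (the real file format and loader in stimseq.py differ): each selectable object gets a binary flicker code, and on every video frame the presentation layer looks up each object's on/off state.

```python
# Toy codebook: one binary flicker sequence per selectable object.
# (Illustrative only; see stimseq.py for the real file format and loader.)
codebook = {
    "A": [1, 0, 1, 1, 0, 0, 1, 0],
    "B": [0, 1, 0, 1, 1, 0, 0, 1],
    "C": [1, 1, 0, 0, 1, 0, 1, 1],
}

def stimulus_state(codebook, frame_idx):
    """Return the on/off state of every object for the given video frame.

    Codes wrap around, so the flicker pattern repeats once the sequence
    is exhausted.
    """
    return {obj: code[frame_idx % len(code)] for obj, code in codebook.items()}

# Frame 0: objects "A" and "C" are lit, "B" is dark.
print(stimulus_state(codebook, 0))
```

Because each object flickers with a different code, the decoder can work out which object the user is attending to by correlating the measured brain response with each object's code.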