Python Software for Brain-Computer Interface Development.
BciPy: Brain-Computer Interface Software in Python
BciPy is a library for conducting Brain-Computer Interface experiments in Python. It is designed to be modular and extensible, allowing researchers to easily add new paradigms, models, and processing methods. The focus of BciPy is on paradigms for communication and control, including Rapid Serial Visual Presentation (RSVP) and Matrix Speller. See our official documentation including affiliations and more context information here.
BciPy is released open-source under the BSD-3 clause. Please refer to LICENSE.md.
If you use BciPy in your research, please cite the following manuscript:
Memmott, T., Koçanaoğulları, A., Lawhead, M., Klee, D., Dudy, S., Fried-Oken, M., & Oken, B. (2021). BciPy: brain–computer interface software in Python. Brain-Computer Interfaces, 1-18.
Dependencies
This project requires Python 3.9, 3.10 or 3.11.
BciPy runs on the latest Windows (10, 11), Linux (Ubuntu 22.04), and macOS (Sonoma) releases. Other versions may work as well but are not guaranteed. To see the versions and operating systems supported as of this release, see our GitHub builds: BciPy Builds. Before installing, please review the OS-specific dependency notes below and reference our documentation here: https://bcipy.github.io/hardware-os-config/
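A quick way to confirm your interpreter falls in the supported range before installing (a simple standalone check, not part of BciPy):

```python
# Sanity-check that the current Python interpreter is a version BciPy supports.
import sys

SUPPORTED = {(3, 9), (3, 10), (3, 11)}

major_minor = sys.version_info[:2]
if major_minor in SUPPORTED:
    print(f"Python {major_minor[0]}.{major_minor[1]}: supported")
else:
    print(f"Python {major_minor[0]}.{major_minor[1]}: not in the supported range")
```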
Linux
You will need to install the prerequisites defined in scripts/shell/linux_requirements.sh, as well as run pip install attrdict3.
Windows
If you are using a Windows machine, you will need to install the Microsoft Visual C++ Build Tools.
Mac
If you are using a Mac, you will need to install Xcode and enable command line tools: xcode-select --install. If using an Apple Silicon (M1/M2) chip, you may need to use the install script in scripts/shell/m2chip_install.sh to install the prerequisites. You may also need to run the install script in a Rosetta terminal, though this has not been necessary in our testing on M2 chips.
If using zsh instead of bash, you may encounter a segmentation fault when running BciPy. This is due to an issue in a PsychoPy dependency with no known fix yet; please use bash for now.
Installation
BciPy Setup
Once the OS dependencies above are met, you can proceed to install the BciPy package.
Editable Install and GUI usage
If you want to run the GUI or make changes to the code, you will need to install BciPy in editable mode. This installs all dependencies and links the package to your local directory, so changes to the code are reflected in your installation without reinstalling the package.
- Clone the repository: git clone https://github.com/CAMBI-tech/BciPy.git
- Change directory in your terminal to the repo directory.
- Install BciPy in editable mode:
pip install -e .
PyPi Install
If you do not want to run the GUI or make changes to the code, you can install BciPy from PyPI. This installs the package and all dependencies, but does not link it to a local directory, so any changes you make to the code will not be reflected in your installation. This is the recommended method if you want to use the modules without modifying the BciPy code.
pip install bcipy
Make install
Alternately, if Make is installed, you may run the following command to install:
# install in development mode with all testing and demo dependencies
make dev-install
Usage
The BciPy package may be used in two ways: via the command line interface (CLI) or via the graphical user interface (GUI). The CLI is useful for running experiments, training models, and visualizing data without needing to run the GUI. The GUI is useful for running experiments, editing parameters and training models with a more user-friendly interface.
Package Usage
To use the package, import the modules you need. For example, to run the system info module:
from bcipy.helpers import system_utils
system_utils.get_system_info()
GUI Usage
Run the following command in your terminal to start the BciPy GUI:
python bcipy/gui/BCInterface.py
Alternately, if Make is installed, you may run the following command to start the GUI from the BciPy root directory:
make bci-gui
Client Usage
Once BciPy is installed, it can be used via the command line interface. This is useful for running experiments, training models, and visualizing data without needing to run the GUI.
General Usage
Use the help flag to explore all available options:
bcipy --help
Running Experiments or Tasks via Command Line
You can invoke an experiment protocol or task directly using the bcipy command-line utility. This allows for flexible execution of tasks with various configurations.
Options
# Run with a User ID and Task
bcipy --user "bci_user" --task "RSVP Calibration"
# Run with a User ID and Experiment Protocol
bcipy --user "bci_user" --experiment "default"
# Run with Simulated Data
bcipy --fake
# Run without Visualizations
bcipy --noviz
# Run with Alerts after Task Execution
bcipy --alert
# Run with Custom Parameters
bcipy --parameters "path/to/valid/parameters.json"
These options provide flexibility for running experiments tailored to your specific needs.
Train a Signal Model via Command Line
To train a signal model (e.g., PCARDAKDE or GazeModels), use the bcipy-train command.
Basic Commands Signal Model Training
# Display help information
bcipy-train --help
# Train using data from a specific folder
bcipy-train -d path/to/data
# Display data visualizations (e.g., ERPs)
bcipy-train -v
# Save visualizations to a file without displaying them
bcipy-train -s
# Train with balanced accuracy metrics
bcipy-train --balanced-acc
# Receive alerts after each task execution
bcipy-train --alert
# Use a custom parameters file
bcipy-train -p path/to/parameters.json
Visualize ERP data from a session with Target / Non-Target labels via Command Line
To visualize ERP data from a session with Target / Non-Target labels, use the bcipy-erp-viz command. This command allows you to visualize the data collected during a session and provides options for saving or displaying the visualizations.
Basic Commands ERP Viz
# Display help information
bcipy-erp-viz --help
# Run without a window prompt for a data session folder
bcipy-erp-viz -s path/to/data
# Run with data visualizations (ERPs, etc.)
bcipy-erp-viz --show
# Run with data visualizations that do not show, but save to file
bcipy-erp-viz --save
# Run with custom parameters (default is in bcipy/parameters/parameters.json)
bcipy-erp-viz -p "path/to/valid/parameters.json"
BciPy Simulator
The BciPy simulator allows you to run simulations based on previously collected data. This is useful for testing and validating models and algorithms without needing to collect new data.
Running the Simulator
The simulator can be executed using the bcipy-sim command-line utility.
Basic Commands Simulator
bcipy-sim --help
Other Options
- -d: Path to the data folder.
- -p: Path to the custom parameters file. [optional]
- -m: Path to the directory of trained model pickle files.
- -n: Number of iterations to run.
bcipy-sim -d path/to/data -p path/to/parameters.json -m path/to/model.pkl/ -n 5
More comprehensive information can be found in the Simulator Module README.
Core Modules
Top-Level Modules Overview
Each module includes its own README, demo, and tests. Click on the module name to view its README for more information.
Acquisition
Captures data, returns desired time series, and saves to file at the end of a session.
Core
Core data structures and methods essential for BciPy operation.
- Includes triggers, parameters, and raw data handling.
Display
Manages the display of stimuli on the screen and records stimuli timing.
Feedback
Provides feedback mechanisms for sound and visual stimuli.
GUI
End-user interface for registered BCI tasks and parameter editing.
- Key files: BCInterface.py and ParamsForm.
Helpers
Utility functions for interactions between modules and general-purpose tasks.
IO
Handles data file operations such as loading, saving, and format conversion.
- Supported formats: BIDS, BrainVision, EDF, MNE, CSV, JSON, etc.
Language
Provides symbol probability predictions during typing tasks.
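To illustrate what a language model contributes (this toy is not BciPy's API): given text, it assigns a probability to each candidate symbol, which the typing task fuses with EEG evidence.

```python
# Toy unigram "language model": predicts the next symbol from letter
# frequencies in a corpus. Illustrative only; BciPy's language module
# provides real models with richer context handling.
from collections import Counter

CORPUS = "the quick brown fox jumps over the lazy dog"

def symbol_probabilities(corpus=CORPUS):
    """Return a probability for each alphabetic symbol in the corpus."""
    counts = Counter(c for c in corpus if c.isalpha())
    total = sum(counts.values())
    return {sym: n / total for sym, n in counts.items()}

probs = symbol_probabilities()
print(max(probs, key=probs.get))  # prints "o", the most frequent letter here
```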
Signal
Includes EEG signal models, gaze signal models, filters, processing tools, evaluators, and viewers.
Simulator
Supports running simulations based on previously collected data.
Task
Implements user tasks and actions for BCI experiments.
- Examples: RSVP Calibration, InterTaskAction.
Entry Point and Configuration Modules
main.py
The main executor of experiments and the primary entry point into the application. See the Running Experiments section for more information.
parameters/
Contains JSON configuration files:
- parameters.json: Main experiment and application configuration.
- devices.json: Device registry and configuration.
- experiments.json: Experiment / protocol registry and configuration.
- phrases.json: Phrase registry and configuration. This can be used to define a list of phrases for the RSVP and Matrix Speller Copy Phrase tasks. If not defined in parameters.json, the task_text parameter will be used.
config.py
Holds configuration parameters for BciPy, including paths and default data filenames.
static/
Includes resources such as:
- Image and sound stimuli.
- Miscellaneous manuals and readable texts for the GUI.
Paradigms
See the Task README for more information on all supported paradigms, tasks, actions and modes. The major paradigms are listed below.
RSVPKeyboard
RSVP Keyboard™ is an EEG (electroencephalography) based BCI (brain-computer interface) typing system. It utilizes a visual presentation technique called rapid serial visual presentation (RSVP), in which options are presented rapidly at a single location with temporal separation. In RSVP Keyboard™, the symbols (letters and additional symbols) are shown at the center of the screen. To select a symbol, the user attends to the intended symbol during the presentation, eliciting a P300 response to the target symbol.
Orhan, U., Hild, K. E., 2nd, Erdogmus, D., Roark, B., Oken, B., & Fried-Oken, M. (2012). RSVP Keyboard: An EEG Based Typing Interface. Proceedings of the ... IEEE International Conference on Acoustics, Speech, and Signal Processing. ICASSP (Conference), 10.1109/ICASSP.2012.6287966. https://doi.org/10.1109/ICASSP.2012.6287966
Matrix Speller
Matrix Speller is an EEG (electroencephalography) based BCI (brain-computer interface) typing system. It utilizes a visual presentation technique called Single Character Presentation (SCP). In the matrix speller, the symbols are arranged in a matrix with a fixed number of rows and columns. Using SCP, subsets of these symbols are intensified (i.e., highlighted), usually in pseudorandom order, to produce an oddball paradigm that induces P300 responses.
Farwell, L. A., & Donchin, E. (1988). Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials. Electroencephalography and clinical Neurophysiology, 70(6), 510-523.
Ahani A, Moghadamfalahi M, Erdogmus D. Language-Model Assisted And Icon-based Communication Through a Brain Computer Interface With Different Presentation Paradigms. IEEE Trans Neural Syst Rehabil Eng. 2018 Jul 25. doi: 10.1109/TNSRE.2018.2859432.
Offset Determination and Correction
[!CAUTION] Static offset determination and correction are critical steps before starting an experiment. BciPy uses LSL to acquire EEG data and Psychopy to present stimuli. The synchronization between the two systems is crucial for accurate data collection and analysis.
What is a Static Offset?
A static offset is the consistent time difference between stimulus presentation and the corresponding markers in the recorded signal. This offset is determined through testing with a photodiode or another triggering mechanism. Once determined, the offset is corrected by shifting the EEG signal using the static_offset parameter in devices.json.
How to Determine the Offset
To determine the static offset, you can run a timing verification task (e.g., RSVPTimingVerification) with a photodiode attached to the display and connected to your device. After collecting the data, use the offset module to analyze the results and recommend an offset correction value.
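Conceptually, offset determination pairs the trigger timestamps logged by the software with the onsets measured by the photodiode and summarizes their difference. A minimal, self-contained sketch of that idea (this is not BciPy's implementation; see bcipy/helpers/offset.py for the real tooling):

```python
# Illustrative static-offset estimation: the offset is the mean difference
# between logged trigger times and measured photodiode onset times, and the
# spread of those differences indicates timing jitter.
from statistics import mean, pstdev

def estimate_static_offset(trigger_times, photodiode_times):
    """Return (mean_offset, jitter) between paired event timestamps in seconds."""
    diffs = [p - t for t, p in zip(trigger_times, photodiode_times)]
    return mean(diffs), pstdev(diffs)

# Hypothetical timestamps (seconds): the display lags the triggers by ~0.1 s.
triggers = [1.00, 2.00, 3.00, 4.00]
photodiode = [1.10, 2.11, 3.09, 4.10]

offset, jitter = estimate_static_offset(triggers, photodiode)
print(f"recommended static_offset: {offset:.3f} s (jitter {jitter:.3f} s)")
```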
Running Offset Determination
To calculate the offset and display the results, use the following command:
python bcipy/helpers/offset.py -r
This will analyze the data and recommend an offset correction value, which will be displayed in the terminal.
Applying the Offset Correction
Once you have the recommended offset value, you can apply it to verify system stability and display the results. For example, if the recommended offset value is 0.1, run the following command:
python bcipy/helpers/offset.py --offset "0.1" -p
Using Make for Offset Determination
If Make is installed, you can simplify the process by running the following command to determine the offset and display the results:
make offset-recommend
Additional Resources
For more information on synchronization and timing, refer to the following documentation:
Glossary
Stimuli: A single letter, tone or image shown (generally in an inquiry). Singular = stimulus, plural = stimuli.
Trial: A collection of data after a stimulus is shown. A----
Inquiry: The set of stimuli after a fixation cross in a spelling task to gather user intent. A ---- B --- C ----
Series: Each series contains at least one inquiry. A letter/icon decision is made after a series in a spelling task.
Session: Data collected for a task. Comprised of metadata about the task and a list of Series.
Protocol: A collection of tasks and actions to be executed in a session. This is defined for each experiment and can be registered in experiments.json via the BCI GUI.
Experiment: A protocol with a set of parameters. This is defined within experiments and can be registered in experiments.json via the BCI GUI.
Task: An experimental design with stimuli, trials, inquiries and series for use in BCI. For instance, "RSVP Calibration" is a task.
Action: A task without a paradigm. For instance, "RSVP Calibration" is a task, but "InterTaskAction" is an action. These are most often used to define the actions that take place in between tasks.
Mode: Common design elements between task types. For instance, Calibration and Free Spelling are modes.
Paradigm: Display paradigm with unique properties and modes. Ex. Rapid-Serial Visual Presentation (RSVP), Matrix Speller, Steady-State Visual Evoked Potential (SSVEP).
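The containment relationships in the glossary (stimuli make up an inquiry, inquiries make up a series, series make up a session) can be sketched with plain dataclasses. This is illustrative only, not BciPy's actual data model:

```python
# Toy model of the glossary hierarchy: Session -> Series -> Inquiry -> stimuli.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Inquiry:
    stimuli: List[str]  # e.g. a fixation cross followed by letters

@dataclass
class Series:
    # At least one inquiry; a letter/icon decision is made after a series.
    inquiries: List[Inquiry] = field(default_factory=list)

@dataclass
class Session:
    task: str
    series: List[Series] = field(default_factory=list)

session = Session(task="RSVP Calibration")
session.series.append(Series(inquiries=[Inquiry(stimuli=["+", "A", "B", "C"])]))
print(len(session.series), len(session.series[0].inquiries))  # prints "1 1"
```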
Scientific Publications using BciPy
2025
- Memmott, T., Klee, D., Smedemark-Margulies, N., & Oken, B. (2025). Artifact filtering application to increase online parity in a communication BCI: progress toward use in daily-life. Frontiers in Human Neuroscience, 19, 1551214.
- Peters, B., Celik, B., Gaines, D., Galvin-McLaughlin, D., Imbiriba, T., Kinsella, M., ... & Fried-Oken, M. (2025). RSVP keyboard with inquiry preview: mixed performance and user experience with an adaptive, multimodal typing interface combining EEG and switch input. Journal of neural engineering, 22(1), 016022.
2024
- Klee, D., Memmott, T., & Oken, B. (2024). The Effect of Jittered Stimulus Onset Interval on Electrophysiological Markers of Attention in a Brain–Computer Interface Rapid Serial Visual Presentation Paradigm. Signals, 5(1), 18-39.
- Kocanaogullari, D. (2024). Detection and Assessment of Spatial Neglect Using a Novel Augmented Reality-Guided Eeg-Based Brain-Computer Interface (Doctoral dissertation, University of Pittsburgh).
- Smedemark-Margulies, N. (2024). Reducing Calibration Effort for Brain-Computer Interfaces (Doctoral dissertation, Northeastern University).
2023
- Smedemark-Margulies, N., Celik, B., Imbiriba, T., Kocanaogullari, A., & Erdoğmuş, D. (2023, June). Recursive estimation of user intent from noninvasive electroencephalography using discriminative models. In ICASSP 2023-2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (pp. 1-5). IEEE.
2022
- Mak, J., Kocanaogullari, D., Huang, X., Kersey, J., Shih, M., Grattan, E. S., ... & Akcakaya, M. (2022). Detection of stroke-induced visual neglect and target response prediction using augmented reality and electroencephalography. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 30, 1840-1850.
- Galvin-McLaughlin, D., Klee, D., Memmott, T., Peters, B., Wiedrick, J., Fried-Oken, M., ... & Dudy, S. (2022). Methodology and preliminary data on feasibility of a neurofeedback protocol to improve visual attention to letters in mild Alzheimer's disease. Contemporary Clinical Trials Communications, 28, 100950.
- Klee, D., Memmott, T., Smedemark-Margulies, N., Celik, B., Erdogmus, D., & Oken, B. S. (2022). Target-related alpha attenuation in a brain-computer interface rapid serial visual presentation calibration. Frontiers in Human Neuroscience, 16, 882557.
2021
- Koçanaoğulları, A., Akcakaya, M., & Erdoğmuş, D. (2021). Stopping criterion design for recursive Bayesian classification: analysis and decision geometry. IEEE Transactions on Pattern Analysis and Machine Intelligence, 44(9), 5590-5601.
2020
- Koçanaogullari, A. (2020). Active Recursive Bayesian Classification (Querying and Stopping) for Event Related Potential Driven Brain Computer Interface Systems (Doctoral dissertation, Northeastern University).
- Koçanaoğulları, A., Akçakaya, M., Oken, B., & Erdoğmuş, D. (2020, June). Optimal modality selection using information transfer rate for event related potential driven brain computer interfaces. In Proceedings of the 13th ACM International Conference on PErvasive Technologies Related to Assistive Environments (pp. 1-7).
Contributions Welcome
If you want to be added to the development team Discord or have additional questions, please reach out to us at support@cambi.tech!
Contribution Guidelines
We follow and will enforce the code of conduct outlined here. Please read it before contributing.
Contributors
All contributions are greatly appreciated!