
Nimbus BCI: Bayesian classifiers for brain-computer interfaces

Project description

nimbus-bci

Bayesian BCI classifiers with sklearn compatibility, streaming inference, and rich diagnostics.


Features

  • Three Bayesian classifiers: LDA, GMM/QDA, and Softmax (Polya-Gamma)
  • Structural Time Series classifier (NimbusSTS): Latent-state dynamics with EKF-style updates (experimental)
  • sklearn-compatible API: Works with pipelines, cross-validation, and GridSearchCV
  • Streaming inference: Real-time chunk-by-chunk processing
  • Rich diagnostics: Entropy, Mahalanobis distance, calibration metrics (ECE/MCE)
  • Online learning: Update models with new data without retraining
  • BCI-specific utilities: ITR calculation, temporal aggregation, quality assessment
  • MNE-Python integration: Convert between MNE Epochs and Nimbus data formats

Installation

pip install nimbus-bci

To use the optional JAX-based softmax model:

pip install nimbus-bci[softmax]

From source:

git clone https://github.com/nimbusbci/nimbuspysdk.git
cd nimbuspysdk
pip install -e ".[all]"

Quick Start

sklearn-Compatible API (Recommended)

from nimbus_bci import NimbusLDA, NimbusGMM, NimbusSoftmax, NimbusSTS
import numpy as np

# Create and fit classifier
clf = NimbusLDA()
clf.fit(X_train, y_train)

# Predict
predictions = clf.predict(X_test)
probabilities = clf.predict_proba(X_test)

# Online learning
clf.partial_fit(X_new, y_new)

Works with sklearn Pipelines

from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, GridSearchCV

# Simple pipeline
pipe = make_pipeline(StandardScaler(), NimbusLDA())
pipe.fit(X_train, y_train)

# Cross-validation
scores = cross_val_score(NimbusLDA(), X, y, cv=5)
print(f"Accuracy: {scores.mean():.2%} (+/- {scores.std():.2%})")

# Hyperparameter tuning
param_grid = {'mu_scale': [1.0, 3.0, 5.0], 'class_prior_alpha': [0.5, 1.0]}
grid = GridSearchCV(NimbusLDA(), param_grid, cv=5)
grid.fit(X, y)
print(f"Best params: {grid.best_params_}")

Streaming Inference (Real-Time BCI)

from nimbus_bci import NimbusLDA, StreamingSession
from nimbus_bci.data import BCIMetadata

# Setup
metadata = BCIMetadata(
    sampling_rate=250.0,
    paradigm="motor_imagery",
    feature_type="csp",
    n_features=16,
    n_classes=4,
    chunk_size=125,  # 500ms chunks
    temporal_aggregation="logvar",
)

# Train model
clf = NimbusLDA()
clf.fit(X_train, y_train)

# Create streaming session
session = StreamingSession(clf.model_, metadata)

# Process chunks in real-time
for chunk in eeg_stream:
    result = session.process_chunk(chunk)
    print(f"Chunk prediction: {result.prediction} ({result.confidence:.2%})")

# Finalize trial with aggregation
final = session.finalize_trial(method="weighted_vote")
print(f"Final: class {final.prediction} (entropy: {final.entropy:.2f} bits)")
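The "weighted_vote" aggregation above can be sketched in plain numpy: each chunk's class posterior is weighted by its confidence, so uninformative chunks contribute little to the trial decision. This is a hypothetical illustration of the idea, not the library's implementation.

```python
import numpy as np

def weighted_vote(posteriors):
    """Aggregate per-chunk class posteriors into one trial decision.

    Each chunk's vote is weighted by its confidence (max posterior).
    Hypothetical sketch of the 'weighted_vote' idea.
    """
    posteriors = np.asarray(posteriors)          # (n_chunks, n_classes)
    weights = posteriors.max(axis=1)             # confidence per chunk
    combined = (weights[:, None] * posteriors).sum(axis=0)
    combined /= combined.sum()                   # renormalize to a posterior
    return int(np.argmax(combined)), combined

# Three chunks over 4 classes: two confident votes for class 2
chunks = [
    [0.10, 0.10, 0.70, 0.10],
    [0.25, 0.25, 0.25, 0.25],   # uninformative chunk gets low weight
    [0.05, 0.15, 0.75, 0.05],
]
pred, post = weighted_vote(chunks)
```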

For NimbusSTS specifically (stateful latent dynamics), use StreamingSessionSTS so the latent state can be propagated and updated with delayed feedback:

from nimbus_bci import NimbusSTS
from nimbus_bci.inference import StreamingSessionSTS
from nimbus_bci.data import BCIMetadata

metadata = BCIMetadata(
    sampling_rate=250.0,
    paradigm="motor_imagery",
    feature_type="csp",
    n_features=16,
    n_classes=2,
    chunk_size=125,
    temporal_aggregation="mean",
)

clf = NimbusSTS().fit(X_train, y_train)
session = StreamingSessionSTS(clf, metadata)

result = session.process_chunk(chunk)  # propagates state by default
session.provide_feedback(label=0)      # when label arrives later

Batch Inference with Diagnostics

from nimbus_bci import predict_batch
from nimbus_bci.data import BCIData, BCIMetadata

# Create BCI data container
metadata = BCIMetadata(
    sampling_rate=250.0,
    paradigm="motor_imagery",
    feature_type="csp",
    n_features=16,
    n_classes=4,
)
data = BCIData(features, metadata, labels)

# Run batch inference with full diagnostics
result = predict_batch(model, data)

print(f"Mean entropy: {result.mean_entropy:.2f} bits")
print(f"Balance: {result.balance:.2%}")
print(f"ECE: {result.calibration.ece:.3f}")
print(f"Latency: {result.latency_ms:.1f}ms")

MNE-Python Integration

import mne
from nimbus_bci import NimbusLDA
from nimbus_bci.compat import from_mne_epochs, extract_csp_features

# Load and preprocess with MNE
raw = mne.io.read_raw_gdf("motor_imagery.gdf")
events = mne.find_events(raw)
epochs = mne.Epochs(raw, events, tmin=0, tmax=4, baseline=None, preload=True)
epochs.filter(8, 30)  # Mu + Beta bands

# Extract CSP features
csp_features, csp = extract_csp_features(epochs, n_components=8)

# Train Nimbus classifier
clf = NimbusLDA()
clf.fit(csp_features, epochs.events[:, 2])

Available Classifiers

| Classifier | Description | Best For |
| --- | --- | --- |
| NimbusLDA | Bayesian LDA with shared covariance | Fast; classes with similar shapes |
| NimbusGMM | Bayesian GMM with class-specific covariances | Complex class distributions |
| NimbusSoftmax | Bayesian logistic regression (Polya-Gamma VI) | Non-Gaussian decision boundaries |
| NimbusSTS | Structural time series classifier (latent state + EKF-style inference) | Non-stationary settings, drifting class boundaries (experimental) |
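To make the LDA row concrete: with a covariance shared across classes, the quadratic term of the Gaussian log-likelihood cancels and the decision boundary is linear, which is why LDA is the fastest option. A minimal numpy sketch of a shared-covariance Gaussian classifier with equal priors (a plain illustration of the assumption, not Nimbus's Bayesian implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two Gaussian classes sharing one covariance (the LDA assumption)
X0 = rng.normal(loc=[-1.0, 0.0], scale=0.5, size=(200, 2))
X1 = rng.normal(loc=[+1.0, 0.0], scale=0.5, size=(200, 2))
X = np.vstack([X0, X1])
y = np.r_[np.zeros(200, int), np.ones(200, int)]

means = np.array([X[y == k].mean(axis=0) for k in (0, 1)])
# Pooled (shared) covariance, as in LDA
cov = sum(np.cov(X[y == k].T) for k in (0, 1)) / 2
cov_inv = np.linalg.inv(cov)

def predict_proba(x):
    # Gaussian log-likelihood per class (equal priors assumed);
    # the shared covariance makes the boundary linear in x.
    d = x - means                                    # (n_classes, n_features)
    logp = -0.5 * np.einsum('ki,ij,kj->k', d, cov_inv, d)
    p = np.exp(logp - logp.max())
    return p / p.sum()

p = predict_proba(np.array([1.2, 0.0]))   # clearly on class 1's side
```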

Choosing the Right Classifier

Quick Decision Guide

Is your data stationary (distributions don't change over time)?

  • Yes → Use static models (LDA/GMM/Softmax)
  • No → Use NimbusSTS for temporal adaptation

For stationary data:

  • Classes have similar covariance? → NimbusLDA (fastest)
  • Classes have different shapes? → NimbusGMM
  • Non-Gaussian boundaries? → NimbusSoftmax

For non-stationary data:

  • Gradual drift (fatigue, electrode shift)? → NimbusSTS
  • Multi-day sessions with state transfer? → NimbusSTS
  • Delayed feedback paradigms? → NimbusSTS

Detailed Comparison

| Scenario | Recommended Model | Why? |
| --- | --- | --- |
| Stable offline datasets | NimbusLDA | Fastest; closed-form solution |
| P300 spelling (stable) | NimbusLDA or NimbusGMM | Event-related, stationary |
| SSVEP | NimbusLDA | Highly stationary frequency response |
| Motor imagery (short sessions) | NimbusLDA or NimbusGMM | Stationary within session |
| Motor imagery (long sessions, fatigue) | NimbusSTS | Tracks drift due to fatigue |
| Multi-day experiments | NimbusSTS | State transfer across sessions |
| Electrode repositioning | NimbusSTS | Adapts to impedance changes |
| Closed-loop with delayed feedback | NimbusSTS | Explicit state propagation |
| Asynchronous BCI (idle vs. active) | NimbusSTS | Models engagement state |
| Neurofeedback training | NimbusSTS | Tracks learning-induced changes |

NimbusSTS Example (Temporal Adaptation)

from nimbus_bci import NimbusSTS

# Train on calibration data
clf = NimbusSTS(transition_cov=0.05, num_steps=50)
clf.fit(X_calibration, y_calibration)

# Online session with delayed feedback
for x_trial, y_feedback in online_trials:
    # 1. Propagate state forward (no label needed)
    clf.propagate_state()
    
    # 2. Make prediction
    prediction = clf.predict(x_trial)
    
    # ... user performs action, feedback arrives later ...
    
    # 3. Update with true label
    clf.partial_fit(x_trial, y_feedback)

# Multi-day state transfer
z_day1, P_day1 = clf.get_latent_state()

# Day 2: Initialize with Day 1 state (increased uncertainty)
clf_day2 = NimbusSTS()
clf_day2.fit(X_day2_calib, y_day2_calib)
clf_day2.set_latent_state(z_day1 * 0.5, P_day1 * 2.0)
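The propagate/predict/update loop above rests on a random-walk view of the latent state: propagation carries the mean forward while inflating its covariance, so uncertainty grows until feedback arrives. A toy numpy sketch of that mechanic (hypothetical dynamics; mapping the `q` below to NimbusSTS's `transition_cov` is an assumption):

```python
import numpy as np

# Hypothetical random-walk latent state: z_t = z_{t-1} + noise, Q = q * I
q = 0.05

def propagate(z, P):
    """Carry the state mean forward; inflate the covariance by Q."""
    return z.copy(), P + q * np.eye(len(z))

z = np.zeros(3)
P = 0.1 * np.eye(3)
for _ in range(10):            # ten trials without any feedback
    z, P = propagate(z, P)

# Variance grew from 0.1 to 0.1 + 10 * 0.05 = 0.6; the mean is unchanged.
# A feedback update (partial_fit) would shrink P back down.
```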

Label Conventions (Important)

Nimbus supports common EEG/BCI labeling patterns:

  • BCIData labels: arbitrary non-negative integer codes are accepted (e.g., MNE event IDs like 769/770), as long as the number of unique labels does not exceed BCIMetadata.n_classes.
  • sklearn estimators (NimbusLDA, NimbusGMM, NimbusSoftmax, NimbusSTS):
    • fit() learns classes_ from your provided labels.
    • predict() returns labels in the original label space (elements of classes_).
  • Model-snapshot inference (NimbusModel + predict_batch / StreamingSession):
    • predictions are returned in the model’s label_base convention (label_base is stored in model.params).
    • use nimbus_bci.data.labels_to_zero_indexed(...) for metrics/aggregation that require 0-indexed labels.
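The zero-indexing conversion can be illustrated with plain numpy (this mirrors what labels_to_zero_indexed is described as doing, but is not the library's code):

```python
import numpy as np

# MNE-style event codes (e.g., left hand = 769, right hand = 770)
labels = np.array([769, 770, 770, 769, 770])

# np.unique returns the sorted unique codes plus each label's index
# into that array -- exactly a zero-indexed relabeling.
classes, zero_indexed = np.unique(labels, return_inverse=True)

# Predictions in the 0-indexed space map back to the original codes:
restored = classes[zero_indexed]
```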

NimbusSTS Sequence Semantics (Important)

NimbusSTS has a latent state. For correctness and sklearn compatibility:

  • NimbusSTS.predict_proba(X) treats rows as conditionally independent by default.
  • For time-ordered evaluation, propagate explicitly:
    • call clf.propagate_state() between trials/chunks, or
    • use the functional API nimbus_sts_predict_proba(model, X, evolve_state=True) when X rows are ordered in time.

Metrics & Diagnostics

from nimbus_bci import (
    compute_entropy,            # Prediction uncertainty
    compute_calibration_metrics,  # ECE, MCE
    calculate_itr,              # Information Transfer Rate
    assess_trial_quality,       # Quality checks
)

# Entropy (uncertainty)
entropy = compute_entropy(posterior)  # bits

# Calibration
calib = compute_calibration_metrics(predictions, confidences, labels)
print(f"ECE: {calib.ece:.3f}, MCE: {calib.mce:.3f}")

# ITR
itr = calculate_itr(accuracy=0.85, n_classes=4, trial_duration=4.0)
print(f"ITR: {itr:.1f} bits/min")
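For readers who want the definitions behind these numbers, here is a minimal numpy sketch of posterior entropy, the Wolpaw ITR formula, and binned ECE. These are illustrative re-implementations of the standard definitions, not the library's functions:

```python
import numpy as np

def entropy_bits(posterior):
    """Shannon entropy of a class posterior, in bits (0 = certain)."""
    p = np.asarray(posterior)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def wolpaw_itr(accuracy, n_classes, trial_duration_s):
    """Wolpaw ITR in bits/min; assumes uniform classes and errors."""
    n, p = n_classes, accuracy
    bits = np.log2(n) + p * np.log2(p) + (1 - p) * np.log2((1 - p) / (n - 1))
    return bits * 60.0 / trial_duration_s

def ece(confidences, correct, n_bins=10):
    """Expected Calibration Error: per-bin |accuracy - confidence|,
    weighted by how many predictions fall in each bin."""
    confidences = np.asarray(confidences)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if mask.any():
            total += mask.mean() * abs(correct[mask].mean()
                                       - confidences[mask].mean())
    return total

h = entropy_bits([0.25, 0.25, 0.25, 0.25])   # uniform over 4 classes -> 2 bits
itr = wolpaw_itr(accuracy=0.85, n_classes=4, trial_duration_s=4.0)
e = ece([0.85, 0.65], [1, 1])                # both predictions were correct
```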

Normalization

Critical for cross-session BCI performance:

from nimbus_bci import estimate_normalization_params, apply_normalization

# Estimate from training data
params = estimate_normalization_params(X_train, method="zscore")

# Apply to all data
X_train_norm = apply_normalization(X_train, params)
X_test_norm = apply_normalization(X_test, params)  # Same params!
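The "zscore" method boils down to the following estimate/apply split, sketched in plain numpy (estimate_normalization_params and apply_normalization are assumed to behave analogously):

```python
import numpy as np

rng = np.random.default_rng(1)
X_train = rng.normal(5.0, 2.0, size=(100, 4))
X_test = rng.normal(5.0, 2.0, size=(20, 4))

# Estimate z-score parameters on the TRAINING split only
mu = X_train.mean(axis=0)
sigma = X_train.std(axis=0)

# Apply the same parameters to every split: re-estimating on the test
# set would leak information and mask cross-session covariate shift.
X_train_norm = (X_train - mu) / sigma
X_test_norm = (X_test - mu) / sigma
```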

Project Structure

nimbus_bci/
├── models/              # Classifiers
│   ├── nimbus_lda/     # LDA (shared covariance)
│   ├── nimbus_gmm/     # GMM (class-specific covariances)
│   └── nimbus_softmax/ # Softmax (Polya-Gamma)
├── data/               # Data contracts (BCIData, BCIMetadata)
├── inference/          # Batch and streaming inference
├── metrics/            # Diagnostics, calibration, ITR
├── utils/              # Normalization, aggregation
└── compat/             # sklearn/MNE compatibility

Functional API (Backward Compatible)

The original functional API is still available:

from nimbus_bci import (
    nimbus_lda_fit, nimbus_lda_predict, nimbus_lda_predict_proba, nimbus_lda_update,
    nimbus_gmm_fit, nimbus_gmm_predict,
    nimbus_softmax_fit, nimbus_softmax_predict,
    nimbus_save, nimbus_load,
)

# Fit model
model = nimbus_lda_fit(X, y, n_classes=4, label_base=0, ...)

# Predict
probs = nimbus_lda_predict_proba(model, X_test)

# Update (online learning)
model = nimbus_lda_update(model, X_new, y_new)

# Save/load
nimbus_save(model, "model.npz")
model = nimbus_load("model.npz")
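The save/load round trip can be pictured as an .npz round trip of the model's parameter arrays. This is a hypothetical snapshot layout chosen for illustration (the fields below are made up); the actual nimbus_save format is internal to the library:

```python
import os
import tempfile
import numpy as np

# Hypothetical model snapshot: a dict of numpy arrays
params = {
    "means": np.array([[0.0, 1.0], [1.0, 0.0]]),
    "cov": np.eye(2),
    "label_base": np.array(0),
}

path = os.path.join(tempfile.mkdtemp(), "model.npz")
np.savez(path, **params)        # one compressed-ish archive per model

loaded = np.load(path)          # lazy NpzFile; index by field name
assert np.allclose(loaded["means"], params["means"])
```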

Testing

pip install -e ".[dev]"
pytest -v

Requirements

  • Python ≥ 3.10
  • NumPy ≥ 1.26
  • JAX ≥ 0.4.25
  • NumPyro ≥ 0.14.0
  • scikit-learn ≥ 1.4

Optional:

  • MNE ≥ 1.6 (for EEG integration)
  • matplotlib ≥ 3.8 (for visualization)

License

This software is proprietary and requires a valid license for use.

License Tiers

| Tier | Use Case |
| --- | --- |
| Evaluation | 30-day free trial for R&D |
| Academic | University research (free) |
| Startup | Companies with < $1M revenue |
| Commercial | Full production rights |
| Enterprise | Unlimited deployments + SLA |
| OEM/Embedded | Medical devices, FDA support |

Request Access

To obtain a license:

  1. Email hello@nimbusbci.com with your use case
  2. Receive API key and license agreement
  3. Install and start building

Website: https://nimbusbci.com


© 2024-2025 Nimbus BCI Inc. — The AI Engine for Brain-Computer Interfaces

Download files

Version 0.2.8 is distributed as a source tarball (nimbus_bci-0.2.8.tar.gz, 95.1 kB) and as built wheels for CPython 3.10, 3.11, and 3.12 on Windows x86-64, manylinux (glibc 2.17+) x86-64, macOS 11.0+ ARM64, and macOS 10.9+ x86-64.

