Nimbus BCI: Bayesian classifiers for brain-computer interfaces
nimbus-bci
Bayesian BCI classifiers with sklearn compatibility, streaming inference, active-learning calibration loops, and rich diagnostics.
Documentation in this repo: Why Nimbus? (vs sklearn / pyRiemann) · Trust, calibration, and rejection · Active-learning calibration loops — cut calibration time with BALD ranking and label-free stopping. Hosted docs: docs.nimbusbci.com.
Features
- Four sklearn-compatible classifiers: three static Bayesian decoders — LDA, QDA, Softmax (Polya–Gamma) — plus NimbusSTS for latent-state / non-stationary settings (EKF-style updates, experimental)
- sklearn-compatible API: Works with pipelines, cross-validation, and GridSearchCV
- Streaming inference: Real-time chunk-by-chunk processing
- Active learning: `suggest_next_trial` (BALD on LDA/QDA/Softmax), a `should_query` streaming gate, and label-free `calibration_sufficient` stopping — cut cued calibration time without manual heuristics
- Rich diagnostics: Entropy, Mahalanobis distance, calibration metrics (ECE/MCE)
- Online learning: Update models with new data without retraining
- BCI-specific utilities: ITR calculation, temporal aggregation, quality assessment
- MNE-Python integration: Convert between MNE Epochs and Nimbus data formats
Installation
pip install nimbus-bci
To use the optional JAX-based softmax model:
pip install nimbus-bci[softmax]
From source:
git clone https://github.com/nimbusbci/nimbuspysdk.git
cd nimbuspysdk
pip install -e ".[all]"
Quick Start
sklearn-Compatible API (Recommended)
from nimbus_bci import NimbusLDA, NimbusQDA, NimbusSoftmax, NimbusSTS
import numpy as np
# Create and fit classifier
clf = NimbusLDA()
clf.fit(X_train, y_train)
# Predict
predictions = clf.predict(X_test)
probabilities = clf.predict_proba(X_test)
# Online learning
clf.partial_fit(X_new, y_new)
Works with sklearn Pipelines
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, GridSearchCV
# Simple pipeline
pipe = make_pipeline(StandardScaler(), NimbusLDA())
pipe.fit(X_train, y_train)
# Cross-validation
scores = cross_val_score(NimbusLDA(), X, y, cv=5)
print(f"Accuracy: {scores.mean():.2%} (+/- {scores.std():.2%})")
# Hyperparameter tuning
param_grid = {'mu_scale': [1.0, 3.0, 5.0], 'class_prior_alpha': [0.5, 1.0]}
grid = GridSearchCV(NimbusLDA(), param_grid, cv=5)
grid.fit(X, y)
print(f"Best params: {grid.best_params_}")
Streaming Inference (Real-Time BCI)
from nimbus_bci import NimbusLDA, StreamingSession
from nimbus_bci.data import BCIMetadata
# Setup
metadata = BCIMetadata(
sampling_rate=250.0,
paradigm="motor_imagery",
feature_type="csp",
n_features=16,
n_classes=4,
chunk_size=125, # 500ms chunks
temporal_aggregation="logvar",
)
# Train model
clf = NimbusLDA()
clf.fit(X_train, y_train)
# Create streaming session
session = StreamingSession(clf.model_, metadata)
# Process chunks in real-time
for chunk in eeg_stream:
    result = session.process_chunk(chunk)
    print(f"Chunk prediction: {result.prediction} ({result.confidence:.2%})")
# Finalize trial with aggregation
final = session.finalize_trial(method="weighted_vote")
print(f"Final: class {final.prediction} (entropy: {final.entropy:.2f} bits)")
For NimbusSTS specifically (stateful latent dynamics), use StreamingSessionSTS
so the latent state can be propagated and updated with delayed feedback:
from nimbus_bci import NimbusSTS
from nimbus_bci.inference import StreamingSessionSTS
from nimbus_bci.data import BCIMetadata
metadata = BCIMetadata(
sampling_rate=250.0,
paradigm="motor_imagery",
feature_type="csp",
n_features=16,
n_classes=2,
chunk_size=125,
temporal_aggregation="mean",
)
clf = NimbusSTS().fit(X_train, y_train)
session = StreamingSessionSTS(clf, metadata)
result = session.process_chunk(chunk) # propagates state by default
session.provide_feedback(label=0) # when label arrives later
Active Learning (Calibration Loop)
Cut cued-calibration time by labeling only the trials the model is genuinely uncertain about, and stop automatically when the posterior settles:
from nimbus_bci import NimbusLDA
from nimbus_bci.active_learning import (
suggest_next_trial,
calibration_sufficient,
)
clf = NimbusLDA().fit(X_seed, y_seed) # small initial cued batch
prev = clf.get_model()
for _ in range(max_rounds):
    # Rank the unlabeled pool by BALD informativeness, label the top 4.
    ranked = suggest_next_trial(
        clf, X_pool, strategy="bald", n=4, num_posterior_samples=64,
    )
    X_new, y_new = collect_labels_for(ranked.indices)  # cue + record
    clf.partial_fit(X_new, y_new)
    # Label-free stopping: when predict_proba over the pool stops moving,
    # more cues will not change predictions much.
    status = calibration_sufficient(
        clf, X_pool,
        criterion="posterior_stability",
        previous=prev, threshold=0.02,
    )
    if status.is_sufficient:
        break
    prev = clf.get_model()
Strategies (entropy, margin, least_confidence, bald) and stopping criteria (posterior_stability, expected_info_gain) are all model-agnostic. STS gets posterior_stability for free; BALD-based features on STS are deferred to v1.1. Full recipe in docs/active_learning.md.
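The same utilities also cover the streaming side. The sketch below is illustrative only: strategy="entropy" is one of the documented alternatives to BALD, while the exact signature of the `should_query` gate is an assumption (here it is assumed to take the fitted classifier and one trial's features and return an object with a boolean `query` field); `incoming_trials` and `cue_and_record_label` are hypothetical placeholders.
from nimbus_bci import NimbusLDA
from nimbus_bci.active_learning import suggest_next_trial, should_query

clf = NimbusLDA().fit(X_seed, y_seed)

# Pool-based ranking with an alternative strategy (entropy instead of BALD).
ranked = suggest_next_trial(clf, X_pool, strategy="entropy", n=4)

# Assumed streaming gate: only interrupt the user with a cue when the model
# is uncertain about the current trial.
for x_trial in incoming_trials:                      # hypothetical iterable of feature vectors
    decision = should_query(clf, x_trial.reshape(1, -1))
    if decision.query:                               # assumed attribute name
        y_label = cue_and_record_label(x_trial)      # hypothetical labeling helper
        clf.partial_fit(x_trial.reshape(1, -1), [y_label])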
Batch Inference with Diagnostics
from nimbus_bci import predict_batch
from nimbus_bci.data import BCIData, BCIMetadata
# Create BCI data container
metadata = BCIMetadata(
sampling_rate=250.0,
paradigm="motor_imagery",
feature_type="csp",
n_features=16,
n_classes=4,
)
data = BCIData(features, metadata, labels)
# Run batch inference with full diagnostics
result = predict_batch(model, data)
print(f"Mean entropy: {result.mean_entropy:.2f} bits")
print(f"Balance: {result.balance:.2%}")
if result.calibration is not None:
    print(f"ECE: {result.calibration.ece:.3f}")
print(f"Latency: {result.latency_ms:.1f}ms")
MNE-Python Integration
import mne
from nimbus_bci import NimbusLDA
from nimbus_bci.compat import from_mne_epochs, extract_csp_features
# Load and preprocess with MNE
raw = mne.io.read_raw_gdf("motor_imagery.gdf")
events = mne.find_events(raw)
epochs = mne.Epochs(raw, events, tmin=0, tmax=4, baseline=None, preload=True)
epochs.filter(8, 30) # Mu + Beta bands
# Extract CSP features
csp_features, csp = extract_csp_features(epochs, n_components=8)
# Train Nimbus classifier
clf = NimbusLDA()
clf.fit(csp_features, epochs.events[:, 2])
Available Classifiers
| Classifier | Description | Best For |
|---|---|---|
| `NimbusLDA` | Bayesian LDA with shared covariance | Fast, when classes have similar shapes |
| `NimbusQDA` | Bayesian QDA with class-specific covariances | Complex class distributions |
| `NimbusSoftmax` | Bayesian logistic regression (Polya-Gamma VI) | Non-Gaussian decision boundaries |
| `NimbusSTS` | Structural time series classifier (latent state + EKF-style inference) | Non-stationary settings, drifting class boundaries (experimental) |
Choosing the Right Classifier
Quick Decision Guide
Is your data stationary (distributions don't change over time)?
- Yes → use the static models (LDA/QDA/Softmax)
- No → use `NimbusSTS` for temporal adaptation

For stationary data:
- Classes have similar covariance? → `NimbusLDA` (fastest)
- Classes have different shapes? → `NimbusQDA`
- Non-Gaussian boundaries? → `NimbusSoftmax`

For non-stationary data:
- Gradual drift (fatigue, electrode shift)? → `NimbusSTS`
- Multi-day sessions with state transfer? → `NimbusSTS`
- Delayed feedback paradigms? → `NimbusSTS`
Detailed Comparison
| Scenario | Recommended Model | Why? |
|---|---|---|
| Stable offline datasets | `NimbusLDA` | Fastest, closed-form solution |
| P300 spelling (stable) | `NimbusLDA` or `NimbusQDA` | Event-related, stationary |
| SSVEP | `NimbusLDA` | Highly stationary frequency response |
| Motor Imagery (short sessions) | `NimbusLDA` or `NimbusQDA` | Stationary within session |
| Motor Imagery (long sessions, fatigue) | `NimbusSTS` | Tracks drift due to fatigue |
| Multi-day experiments | `NimbusSTS` | State transfer across sessions |
| Electrode repositioning | `NimbusSTS` | Adapts to impedance changes |
| Closed-loop with delayed feedback | `NimbusSTS` | Explicit state propagation |
| Asynchronous BCI (idle vs active) | `NimbusSTS` | Models engagement state |
| Neurofeedback training | `NimbusSTS` | Tracks learning-induced changes |
| Long calibration sessions, want to cut label cost | Any head + `suggest_next_trial(strategy="bald")` | Pool-based BALD on the conjugate posterior; LDA/QDA/Softmax in v1 |
| Don't know when to stop calibrating | Any head + `calibration_sufficient` | Label-free `posterior_stability` works for STS too |
NimbusSTS Example (Temporal Adaptation)
from nimbus_bci import NimbusSTS
# Train on calibration data
clf = NimbusSTS(transition_cov=0.05, num_steps=50)
clf.fit(X_calibration, y_calibration)
# Online session with delayed feedback
for x_trial, y_feedback in online_trials:
    # 1. Propagate state forward (no label needed)
    clf.propagate_state()
    # 2. Make prediction
    prediction = clf.predict(x_trial)
    # ... user performs action, feedback arrives later ...
    # 3. Update with true label
    clf.partial_fit(x_trial, y_feedback)
# Multi-day state transfer
z_day1, P_day1 = clf.get_latent_state()
# Day 2: Initialize with Day 1 state (increased uncertainty)
clf_day2 = NimbusSTS()
clf_day2.fit(X_day2_calib, y_day2_calib)
clf_day2.set_latent_state(z_day1 * 0.5, P_day1 * 2.0)
Label Conventions (Important)
Nimbus supports common EEG/BCI labeling patterns:
- BCIData labels can be any non-negative integer codes (e.g., MNE event IDs like 769/770), as long as the number of unique labels does not exceed `BCIMetadata.n_classes`.
- sklearn estimators (`NimbusLDA`, `NimbusQDA`, `NimbusSoftmax`, `NimbusSTS`): `fit()` learns `classes_` from your provided labels; `predict()` returns labels in the original label space (elements of `classes_`).
- Model-snapshot inference (`NimbusModel` + `predict_batch` / `StreamingSession`):
  - predictions are returned in the model's `label_base` convention (`label_base` is stored in `model.params`);
  - use `nimbus_bci.data.labels_to_zero_indexed(...)` for metrics/aggregation that require 0-indexed labels (see the sketch below).
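A minimal sketch of these conventions, assuming `X_train`/`X_test` are already defined and that `labels_to_zero_indexed` maps sorted label values to 0..n_classes-1:
import numpy as np
from nimbus_bci import NimbusLDA
from nimbus_bci.data import labels_to_zero_indexed

# sklearn estimator path: predictions stay in the original label space.
y_train = np.array([769, 770, 769, 770])   # MNE-style event IDs
clf = NimbusLDA().fit(X_train, y_train)
print(clf.classes_)                         # [769 770]
print(clf.predict(X_test))                  # elements of classes_, i.e. 769 or 770

# For metrics/aggregation that expect 0-indexed labels:
y_zero = labels_to_zero_indexed(y_train)    # assumed to give [0 1 0 1]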
NimbusSTS Sequence Semantics (Important)
NimbusSTS has a latent state. For correctness and sklearn compatibility:
- `NimbusSTS.predict_proba(X)` treats rows as conditionally independent by default.
- For time-ordered evaluation, propagate explicitly (see the sketch below):
  - call `clf.propagate_state()` between trials/chunks, or
  - use the functional API `nimbus_sts_predict_proba(model, X, evolve_state=True)` when `X` rows are ordered in time.
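A short sketch of both options, assuming `nimbus_sts_predict_proba` is importable from the top-level package like the other functional helpers, and that the fitted snapshot is exposed as `clf.model_` as in the streaming example above:
from nimbus_bci import NimbusSTS, nimbus_sts_predict_proba  # import path assumed

clf = NimbusSTS().fit(X_calib, y_calib)

# Option 1: estimator API, propagating the latent state trial by trial.
probs = []
for x_trial in X_session:                   # time-ordered trials
    clf.propagate_state()                   # advance the latent state first
    probs.append(clf.predict_proba(x_trial.reshape(1, -1))[0])

# Option 2: functional API over a whole time-ordered block.
probs_block = nimbus_sts_predict_proba(clf.model_, X_session, evolve_state=True)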
Metrics & Diagnostics
from nimbus_bci import (
compute_entropy, # Prediction uncertainty
compute_calibration_metrics, # ECE, MCE
calculate_itr, # Information Transfer Rate
assess_trial_quality, # Quality checks
)
# Entropy (uncertainty)
entropy = compute_entropy(posterior) # bits
# Calibration
calib = compute_calibration_metrics(predictions, confidences, labels)
print(f"ECE: {calib.ece:.3f}, MCE: {calib.mce:.3f}")
# ITR
itr = calculate_itr(accuracy=0.85, n_classes=4, trial_duration=4.0)
print(f"ITR: {itr:.1f} bits/min")
Normalization
Critical for cross-session BCI performance:
from nimbus_bci import estimate_normalization_params, apply_normalization
# Estimate from training data
params = estimate_normalization_params(X_train, method="zscore")
# Apply to all data
X_train_norm = apply_normalization(X_train, params)
X_test_norm = apply_normalization(X_test, params) # Same params!
Project Structure
nimbus_bci/
├── models/ # Classifiers
│ ├── nimbus_lda/ # LDA (shared covariance)
│ ├── nimbus_qda/ # QDA (class-specific covariances)
│ └── nimbus_softmax/ # Softmax (Polya-Gamma)
├── data/ # Data contracts (BCIData, BCIMetadata)
├── inference/ # Batch and streaming inference
├── metrics/ # Diagnostics, calibration, ITR
├── utils/ # Normalization, aggregation
└── compat/ # sklearn/MNE compatibility
Functional API (Backward Compatible)
The original functional API is still available:
from nimbus_bci import (
nimbus_lda_fit, nimbus_lda_predict, nimbus_lda_predict_proba, nimbus_lda_update,
nimbus_qda_fit, nimbus_qda_predict,
nimbus_softmax_fit, nimbus_softmax_predict,
nimbus_save, nimbus_load,
)
# Fit model
model = nimbus_lda_fit(X, y, n_classes=4, label_base=0, ...)
# Predict
probs = nimbus_lda_predict_proba(model, X_test)
# Update (online learning)
model = nimbus_lda_update(model, X_new, y_new)
# Save/load
nimbus_save(model, "model.npz")
model = nimbus_load("model.npz")
Testing
pip install -e ".[dev]"
pytest -v
Requirements
Core (installed with pip install nimbus-bci):
- Python ≥ 3.11
- NumPy ≥ 1.26
- scikit-learn ≥ 1.4
Optional extras:
- JAX ≥ 0.4.25 — required for `NimbusSoftmax` and the softmax functional API (`pip install nimbus-bci[softmax]`)
- MNE ≥ 1.6 — EEG integration (`pip install nimbus-bci[mne]`)
- matplotlib ≥ 3.8 — visualization (`pip install nimbus-bci[viz]`)
- SciPy ≥ 1.12 — included in `pip install nimbus-bci[all]` alongside the extras above
License
This software is proprietary and requires a valid license for use.
License Tiers
| Tier | Use Case |
|---|---|
| Evaluation | 30-day free trial for R&D |
| Academic | University research (free) |
| Startup | Companies < $1M revenue |
| Commercial | Full production rights |
| Enterprise | Unlimited deployments + SLA |
| OEM/Embedded | Medical devices, FDA support |
Request Access
To obtain a license:
- Email hello@nimbusbci.com with your use case
- Receive API key and license agreement
- Install and start building
Website: https://nimbusbci.com
© 2024-2026 Nimbus BCI Inc. — The AI Engine for Brain-Computer Interfaces