# Swarm-Contrastive Decomposition 🧠

A Python package for decomposition of neurophysiological time series signals, using a Particle Swarm Optimised Independence Estimator for Blind Source Separation.
## Installation 🛠️

### From PyPI (Recommended)

```bash
pip install swarm-contrastive-decomposition
```

### From GitHub (Latest Development Version)

```bash
pip install git+https://github.com/AgneGris/swarm-contrastive-decomposition.git
```

### From Source

```bash
git clone https://github.com/AgneGris/swarm-contrastive-decomposition
cd swarm-contrastive-decomposition
pip install -e .
```

### Verify Installation

```bash
python -c "import scd; print(f'SCD version: {scd.__version__}')"
```
## Quick Start 🚀

```python
import scd

# Train with the default configuration
dictionary, timestamps = scd.train("data/input/emg.npy")

# Save results
scd.save_results("data/output/emg.pkl", dictionary)
```
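Since `save_results` writes a `.pkl` file, the saved dictionary can presumably be reloaded with Python's standard `pickle` module. The sketch below assumes ordinary pickle serialization (an assumption; check `scd.save_results` for the actual format) and uses a stand-in dictionary rather than real decomposition output:

```python
import pickle

# Stand-in for the dictionary returned by scd.train (hypothetical contents)
results = {"MU1": [120, 540, 990], "MU2": [300, 610]}

# Write it the way scd.save_results presumably does, via pickle
with open("emg.pkl", "wb") as f:
    pickle.dump(results, f)

# Reload for later analysis
with open("emg.pkl", "rb") as f:
    loaded = pickle.load(f)

print(len(loaded))  # number of entries recovered
```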
## Usage

### Basic Usage

```python
import scd

# Use a predefined configuration
dictionary, timestamps = scd.train(
    "path/to/your/data.mat",
    config_name="surface"  # or "default", "intramuscular"
)
scd.save_results("output.pkl", dictionary)
```
### With Configuration Overrides

```python
import scd

# Override specific parameters
dictionary, timestamps = scd.train(
    "data/input/emg.npy",
    config_name="surface",
    max_iterations=100,  # override for quick testing
    output_final_source_plot=True
)
```
### Step-by-Step Control

```python
import scd

# Load configuration
config = scd.load_config("surface")

# Load data
neural_data = scd.load_data("data/input/emg.npy", device=config.device)

# Preprocess
neural_data = scd.preprocess_data(neural_data, config)

# Train model
dictionary, timestamps = scd.train_model(neural_data, config)

# Save results
scd.save_results("output.pkl", dictionary)
```
### Supported Data Formats

- `.mat` — MATLAB files (specify the variable name with the `key` parameter)
- `.npy` — NumPy arrays

```python
# For .mat files with a custom variable name
dictionary, timestamps = scd.train("data.mat", key="emg_data")

# For .npy files
dictionary, timestamps = scd.train("data.npy")
```

Data should have shape (time, channels) or (channels, time) — the loader will automatically transpose if needed.
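When preparing your own arrays, the orientation check can be emulated before saving. This is a minimal sketch, assuming the usual heuristic that the time axis is far longer than the channel axis (the synthetic 64-channel array and the 10240 Hz rate here are illustrative, not taken from the loader's internals):

```python
import numpy as np

# Synthetic stand-in: 64 channels, 1 second at a hypothetical 10240 Hz
emg = np.random.randn(64, 10240)

# Common heuristic for (time, channels) vs (channels, time):
# the time dimension is almost always the longer one.
if emg.shape[0] > emg.shape[1]:
    emg = emg.T  # normalize to (channels, time)

np.save("emg.npy", emg)
print(np.load("emg.npy").shape)  # → (64, 10240)
```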
## Configuration ⚙️

Configurations are defined in `scd/configs.json`. Available presets:

| Config Name | Use Case | Sampling Rate | Description |
|---|---|---|---|
| `default` | General purpose | 10240 Hz | Balanced settings for most EMG data |
| `surface` | Surface EMG | 10240 Hz | Optimized for surface recordings |
| `intramuscular` | Intramuscular EMG | 10240 Hz | Higher iterations for fine-wire recordings |
### Configuration Parameters

| Parameter | Description | Default |
|---|---|---|
| `device` | `"cuda"` for GPU or `"cpu"` | `"cuda"` |
| `acceptance_silhouette` | Quality threshold for source acceptance | 0.85 |
| `extension_factor` | Typically 1000 / num_channels. Higher values may improve results | 25 |
| `low_pass_cutoff` | Low-pass filter cutoff frequency (Hz) | 4400 |
| `high_pass_cutoff` | High-pass filter cutoff frequency (Hz) | 10 |
| `sampling_frequency` | Sampling frequency of your signal (Hz) | 10240 |
| `start_time` | Start time for signal trimming (s). Use `0` for the beginning | 0 |
| `end_time` | End time for signal trimming (s). Use `-1` for the entire signal | -1 |
| `max_iterations` | Maximum decomposition iterations | 200 |
| `peel_off_window_size_ms` | Window size for spike-triggered average (ms) | 20 |
| `output_final_source_plot` | Generate a plot of the final sources | `false` |
| `use_coeff_var_fitness` | Use coefficient-of-variation fitness. `true` for EMG, `false` for intracortical | `true` |
| `remove_bad_fr` | Filter out sources with firing rates < 2 Hz or > 100 Hz | `true` |
| `clamp_percentile` | Percentile for amplitude clamping | 0.999 |
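The idea behind `remove_bad_fr` can be illustrated in a few lines. This is a sketch of the concept only (mean firing rate = spike count / recording duration, keep 2–100 Hz), not the package's actual implementation; the source names and spike trains below are invented:

```python
import numpy as np

def mean_firing_rate(spike_times_s, duration_s):
    """Mean firing rate in Hz over the whole recording."""
    return len(spike_times_s) / duration_s

duration = 10.0  # seconds of recording
sources = {
    "good": np.linspace(0.0, duration, 150),      # 15 Hz, plausible
    "too_slow": np.array([1.0, 6.0]),             # 0.2 Hz, below 2 Hz
    "too_fast": np.linspace(0.0, duration, 2000), # 200 Hz, above 100 Hz
}

# Keep only sources within the physiologically plausible 2-100 Hz band
kept = {
    name: st for name, st in sources.items()
    if 2.0 <= mean_firing_rate(st, duration) <= 100.0
}
print(sorted(kept))  # → ['good']
```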
### Custom Configuration

Add your own configuration to `scd/configs.json`:

```json
{
    "my_experiment": {
        "device": "cuda",
        "acceptance_silhouette": 0.80,
        "extension_factor": 30,
        "sampling_frequency": 2048,
        ...
    }
}
```

Then use it:

```python
dictionary, timestamps = scd.train("data.mat", config_name="my_experiment")
```
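Config entries can also be built programmatically before adding them to `scd/configs.json`. A minimal standard-library sketch, using parameter names from the table above (the specific values, and writing to a local `configs.json` rather than the package file, are illustrative assumptions):

```python
import json

my_experiment = {
    "device": "cpu",                # e.g. fall back to CPU without a GPU
    "acceptance_silhouette": 0.80,
    "extension_factor": 30,         # rule of thumb: ~1000 / num_channels
    "sampling_frequency": 2048,
    "max_iterations": 50,
}

# Write a local configs.json containing the new preset; in practice you
# would merge this entry into the package's existing scd/configs.json.
with open("configs.json", "w") as f:
    json.dump({"my_experiment": my_experiment}, f, indent=2)

# Round-trip to confirm the entry is valid JSON
with open("configs.json") as f:
    configs = json.load(f)
print(configs["my_experiment"]["sampling_frequency"])  # → 2048
```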
## Test Data 🧪

The repository includes test data to verify your installation:

- File: `data/input/emg.npy`
- Type: Surface EMG
- Sampling rate: 10240 Hz
- Configuration: use the `"surface"` config

```python
import scd

# Run with the bundled test data
dictionary, timestamps = scd.train(
    "data/input/emg.npy",
    config_name="surface"
)
print(f"Found {len(dictionary)} motor units")
```
## Contributing 🤝

We welcome contributions! Here's how you can contribute:

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/newfeature`)
3. Commit your changes (`git commit -m 'Add some newfeature'`)
4. Push to the branch (`git push origin feature/newfeature`)
5. Open a pull request
## License 📜

This project is licensed under the CC BY-NC 4.0 License.
## Citation

If you use this code in your research, please cite our paper:

```bibtex
@article{grison2024particle,
  author={Grison, Agnese and Clarke, Alexander Kenneth and Muceli, Silvia and Ibáñez, Jaime and Kundu, Aritra and Farina, Dario},
  journal={IEEE Transactions on Biomedical Engineering},
  title={A Particle Swarm Optimised Independence Estimator for Blind Source Separation of Neurophysiological Time Series},
  year={2024},
  volume={},
  number={},
  pages={1-11},
  doi={10.1109/TBME.2024.3446806},
  keywords={Recording; Time series analysis; Sorting; Vectors; Measurement; Electrodes; Probes; Independent component analysis; particle swarm optimisation; blind source separation; intramuscular electromyography; intracortical recording}
}
```
## Contact

For questions or inquiries:

Agnese Grison
📧 agnese.grison16@imperial.ac.uk