Synheart Emotion - Python SDK
On-device emotion inference from biosignals (heart rate and RR intervals) for Python applications.
Features
- Privacy-first: All processing happens on-device
- Real-time: <10ms inference latency (ONNX models)
- Binary emotion states: Baseline, Stress
- Sliding window: 120s window with 60s step (default, configurable)
- 14 HRV Features: Comprehensive feature extraction (time-domain, frequency-domain, non-linear)
- ONNX Support: ExtraTrees models optimized for on-device inference
- Python 3.8+: Modern Python with type hints
- Thread-safe: Concurrent data ingestion supported
- HSI Compatible: Designed for Human State Interface integration
Installation
From PyPI (recommended)

```bash
pip install synheart-emotion
```

From source

```bash
git clone https://github.com/synheart-ai/synheart-emotion-python.git
cd synheart-emotion-python
pip install -e .
```

With optional ML dependencies

For advanced model loading (scikit-learn, XGBoost):

```bash
pip install synheart-emotion[ml]
```

Development installation

```bash
pip install synheart-emotion[dev]
```
Verify Installation
```bash
# Quick verification
python -c "from synheart_emotion import EmotionEngine, EmotionConfig; print('✓ Installation successful')"

# Run tests
pytest tests/

# Run examples
python examples/basic_usage.py
python examples/cli_demo.py --samples 15
```
Building from Source
```bash
# Install build tools
pip install build twine

# Build package
python -m build

# This creates:
# - dist/synheart_emotion-0.0.1.tar.gz (source distribution)
# - dist/synheart_emotion-0.0.1-py3-none-any.whl (wheel)
```
Troubleshooting
- Import Error: confirm the package is installed with `pip list | grep synheart-emotion`
- Version Conflicts: upgrade dependencies with `pip install --upgrade numpy pandas scipy onnxruntime`
- Missing Dependencies: install all requirements with `pip install -r requirements.txt`
- ONNX Runtime Issues: ensure onnxruntime is installed: `pip install "onnxruntime>=1.15.0"`
Quick Start
```python
from datetime import datetime

from synheart_emotion import EmotionConfig, EmotionEngine

# Create engine with default configuration (120s window, 60s step)
config = EmotionConfig()
engine = EmotionEngine(config)

# Push data from wearable
engine.push(
    hr=72.0,
    rr_intervals_ms=[850.0, 820.0, 830.0, 845.0, 825.0],
    timestamp=datetime.now(),
)

# Get inference result when ready
results = engine.consume_ready()
for result in results:
    print(f"Emotion: {result.emotion}")
    print(f"Confidence: {result.confidence:.1%}")
    print(f"Probabilities: {result.probabilities}")
```
Examples
Basic Usage
```python
from datetime import datetime

from synheart_emotion import EmotionConfig, EmotionEngine

# Initialize engine
config = EmotionConfig()
engine = EmotionEngine(config)

# Simulate wearable data stream
hr_data = [72.0, 73.5, 71.8, 74.2, 72.5]
rr_data = [
    [850.0, 820.0, 830.0, 845.0, 825.0],
    [855.0, 815.0, 835.0, 840.0, 830.0],
    # ... more data
]

# Push data
for hr, rr_intervals in zip(hr_data, rr_data):
    engine.push(
        hr=hr,
        rr_intervals_ms=rr_intervals,
        timestamp=datetime.now(),
    )

# Consume results
results = engine.consume_ready()
if results:
    result = results[0]
    print(f"Emotion: {result.emotion} ({result.confidence:.1%})")
```
See the `examples/` directory for more comprehensive examples:

- `basic_usage.py` - Simple emotion inference
- `custom_config.py` - Custom configuration and logging
- `streaming_data.py` - Continuous data stream simulation
Custom Configuration
```python
config = EmotionConfig(
    model_id="ExtraTrees_120_60_nozipmap",  # ExtraTrees model
    window_seconds=120.0,                   # 120 second window (default)
    step_seconds=60.0,                      # 60 second step (default)
    min_rr_count=30,                        # Minimum RR intervals
)
```
Logging
```python
def custom_logger(level, message):
    print(f"[{level}] {message}")

engine = EmotionEngine(
    config=config,
    on_log=custom_logger,
)
```
Buffer Management
```python
# Get buffer statistics
stats = engine.get_buffer_stats()
print(f"Data points: {stats['count']}")
print(f"Duration: {stats['duration_ms']}ms")
print(f"HR range: {stats['hr_range']}")
print(f"RR count: {stats['rr_count']}")

# Clear buffer
engine.clear()
```
API Reference
EmotionConfig
Configuration for the emotion inference engine.
```python
@dataclass
class EmotionConfig:
    model_id: str = "ExtraTrees_120_60_nozipmap"
    window_seconds: float = 120.0
    step_seconds: float = 60.0
    min_rr_count: int = 30
```
Attributes:
- `model_id` - Model identifier (default: `ExtraTrees_120_60_nozipmap`)
- `window_seconds` - Rolling window size (default: 120s)
- `step_seconds` - Emission cadence (default: 60s)
- `min_rr_count` - Minimum RR intervals required (default: 30)
EmotionEngine
Main emotion inference engine.
Constructor:

```python
def __init__(
    config: EmotionConfig,
    on_log: Optional[Callable[[str, str], None]] = None
) -> EmotionEngine
```

Create the engine. The model is loaded automatically based on `config.model_id`.

Instance Methods:

```python
def push(
    hr: float,
    rr_intervals_ms: List[float],
    timestamp: datetime,
    motion: Optional[Dict[str, float]] = None
) -> None
```

Push a new data point into the engine.

```python
def consume_ready() -> List[EmotionResult]
```

Consume ready results (throttled by the step interval).

```python
def get_buffer_stats() -> Dict[str, Any]
```

Get current buffer statistics.

```python
def clear() -> None
```

Clear all buffered data.
EmotionResult
Result of emotion inference (dictionary).
```python
{
    "timestamp": datetime,
    "emotion": str,                     # Top-1 predicted label (Baseline or Stress)
    "confidence": float,                # Confidence score (0.0-1.0)
    "probabilities": Dict[str, float],  # All label probabilities
    "features": Dict[str, float],       # Extracted 14 HRV features
}
```
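Since the result is a plain dictionary, the top-1 label can be recovered directly from `probabilities`. A minimal standalone sketch using hand-made values (not output from a real engine run):

```python
from datetime import datetime

# Hypothetical values shaped like the EmotionResult schema above.
result = {
    "timestamp": datetime(2025, 1, 1, 12, 0, 0),
    "emotion": "Stress",
    "confidence": 0.81,
    "probabilities": {"Baseline": 0.19, "Stress": 0.81},
    "features": {"RMSSD": 28.4, "HR": 74.0},  # truncated for illustration
}

# The top-1 label is the argmax over the probabilities dict,
# and its probability equals the confidence score.
top_label = max(result["probabilities"], key=result["probabilities"].get)
print(f"{top_label}: {result['probabilities'][top_label]:.1%}")
```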
Running Examples
```bash
# Basic usage
python examples/basic_usage.py

# Custom configuration
python examples/custom_config.py

# Streaming data simulation
python examples/streaming_data.py
```
Requirements
- Python 3.8+
- NumPy >= 1.21.0
- Pandas >= 1.3.0
- SciPy >= 1.7.0 (for frequency-domain HRV features)
- onnxruntime >= 1.15.0 (for ONNX model inference)
Optional (for ML model loading):
- scikit-learn >= 1.0.0
- joblib >= 1.1.0
- xgboost >= 1.5.0
Architecture
The package follows a modular architecture:
```
synheart_emotion/
├── __init__.py           # Package exports
├── synheart_emotion.py   # Single-file implementation (config, engine, features, ONNX)
└── data/                 # ONNX model files and metadata
```
Data Flow
1. Push - Biosignal data (HR, RR intervals) is pushed into the engine
2. Buffer - Data is stored in a sliding-window ring buffer
3. Window Check - The engine verifies the window is full (oldest data >= window_seconds)
4. Extract - 14 HRV features are extracted from the window (time-domain, frequency-domain, non-linear)
5. Infer - The ONNX model predicts emotion probabilities
6. Emit - Results are emitted at the configured step interval
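The window-fill check and step-interval throttling above can be sketched in isolation. This is an illustrative model of the emission cadence only, not the engine's actual code; `ready_windows` and its arguments are hypothetical names:

```python
# Illustrative only: when does a 120s window with a 60s step emit results?
def ready_windows(timestamps_s, window_s=120.0, step_s=60.0):
    """Return emission times: first when the window fills, then every step_s."""
    emitted = []
    next_emit = None
    for t in timestamps_s:
        if next_emit is None:
            if t - timestamps_s[0] >= window_s:  # window is full
                next_emit = t
        if next_emit is not None and t >= next_emit:
            emitted.append(t)
            next_emit = t + step_s               # throttle by step interval
    return emitted

# One sample per second for 5 minutes -> emissions at 120s, 180s, 240s, 300s
print(ready_windows(list(range(0, 301))))  # → [120, 180, 240, 300]
```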
Thread Safety
The engine uses `threading.RLock()` for thread-safe operations:
- Multiple threads can push data concurrently
- Buffer operations are protected
- Results can be consumed from any thread
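The locking pattern described above can be illustrated with a small standalone sketch (this is not the library's internals; `TinyRingBuffer` is a hypothetical stand-in): an `RLock` guards a shared buffer while several threads push concurrently.

```python
import threading
from collections import deque

class TinyRingBuffer:
    """Standalone illustration of the RLock-guarded buffer pattern."""

    def __init__(self, maxlen=1024):
        self._lock = threading.RLock()
        self._buf = deque(maxlen=maxlen)

    def push(self, sample):
        with self._lock:   # protects concurrent writers
            self._buf.append(sample)

    def snapshot(self):
        with self._lock:   # consistent read from any thread
            return list(self._buf)

buf = TinyRingBuffer()
threads = [
    threading.Thread(target=lambda i=i: [buf.push((i, n)) for n in range(100)])
    for i in range(4)
]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(buf.snapshot()))  # 400: all pushes from all 4 threads arrive
```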
Model Architecture
The library uses ExtraTrees (Extremely Randomized Trees) classifiers trained on the WESAD dataset:
- 14 HRV Features: Time-domain, frequency-domain, and non-linear metrics
- Binary Classification: Baseline vs Stress detection
- ONNX Format: Optimized for on-device inference using ONNX Runtime
- Accuracy: ~78% on WESAD validation set
Available Models
Models are automatically loaded based on config.model_id:
- `extratrees_w120s60_binary_v1_0` or `ExtraTrees_120_60_nozipmap`: 120-second window, 60-second step (default)
- `extratrees_w60s5_binary_v1_0` or `ExtraTrees_60_5_nozipmap`: 60-second window, 5-second step
- `extratrees_w120s5_binary_v1_0` or `ExtraTrees_120_5_nozipmap`: 120-second window, 5-second step
All models use binary classification: Baseline vs Stress.
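The lowercase identifiers encode the window and step directly. Assuming the `w<window>s<step>` convention holds for the ids listed above, they can be parsed with a small illustrative helper (`parse_model_id` is not part of the library API):

```python
import re

def parse_model_id(model_id):
    """Extract window/step seconds from ids like 'extratrees_w120s60_binary_v1_0'."""
    m = re.search(r"_w(\d+)s(\d+)_", model_id)
    if not m:
        raise ValueError(f"unrecognized model id: {model_id}")
    return {"window_seconds": float(m.group(1)), "step_seconds": float(m.group(2))}

print(parse_model_id("extratrees_w120s60_binary_v1_0"))
# → {'window_seconds': 120.0, 'step_seconds': 60.0}
```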
Feature Extraction
The library extracts 14 HRV features in the following order:
Time-domain features:
- RMSSD (Root Mean Square of Successive Differences)
- Mean_RR (Mean RR interval)
- HRV_SDNN (Standard Deviation of NN intervals)
- pNN50 (Percentage of successive differences > 50ms)
Frequency-domain features:
- HRV_HF (High Frequency power)
- HRV_LF (Low Frequency power)
- HRV_HF_nu (Normalized HF)
- HRV_LF_nu (Normalized LF)
- HRV_LFHF (LF/HF ratio)
- HRV_TP (Total Power)
Non-linear features:
- HRV_SD1SD2 (Poincaré plot ratio)
- HRV_Sampen (Sample Entropy)
- HRV_DFA_alpha1 (Detrended Fluctuation Analysis)
Heart Rate:
- HR (Heart Rate in BPM)
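For intuition, the time-domain features follow standard HRV definitions. The sketch below shows the textbook formulas for four of them; it is illustrative only, and the library's exact implementation may differ in details (e.g. sample vs. population variance, artifact filtering):

```python
import math

def time_domain_hrv(rr_ms):
    """Textbook time-domain HRV formulas over a list of RR intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    mean_rr = sum(rr_ms) / len(rr_ms)
    # SDNN: standard deviation of the RR (NN) intervals
    sdnn = math.sqrt(sum((r - mean_rr) ** 2 for r in rr_ms) / len(rr_ms))
    # RMSSD: root mean square of successive differences
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    # pNN50: percentage of successive differences exceeding 50 ms
    pnn50 = 100.0 * sum(1 for d in diffs if abs(d) > 50.0) / len(diffs)
    return {"Mean_RR": mean_rr, "HRV_SDNN": sdnn, "RMSSD": rmssd, "pNN50": pnn50}

feats = time_domain_hrv([850.0, 820.0, 830.0, 845.0, 825.0, 910.0])
print({k: round(v, 2) for k, v in feats.items()})
```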
Privacy & Security
- On-Device Processing: All emotion inference happens locally
- No Data Retention: Raw biometric data is not retained after processing
- No Network Calls: No data is sent to external servers
- Privacy-First Design: No built-in storage - you control what gets persisted
- Real Trained Models: Uses WESAD-trained ExtraTrees models with ~78% accuracy
- 14-Feature Extraction: Comprehensive HRV analysis including time-domain, frequency-domain, and non-linear metrics
Development
Running Tests

```bash
pytest tests/
```

Code Formatting

```bash
black src/ examples/ tests/
isort src/ examples/ tests/
```

Type Checking

```bash
mypy src/
```
Integration
With synheart-core (HSI)
synheart_emotion is designed to integrate seamlessly with synheart-core as part of the Human State Interface (HSI) system:
```python
from synheart_core import Synheart, SynheartConfig
from synheart_emotion import EmotionConfig, EmotionEngine

# Initialize synheart-core (includes emotion capability)
synheart = Synheart.initialize(
    user_id="user_123",
    config=SynheartConfig(
        enable_wear=True,
        enable_behavior=True,
    ),
)

# Enable emotion interpretation layer (powered by synheart-emotion)
synheart.enable_emotion()

# Get emotion updates through HSI
@synheart.on_emotion_update
def handle_emotion(emotion):
    print(f"Baseline: {emotion.baseline}")
    print(f"Stress: {emotion.stress}")
```
HSI Schema Compatibility:
- EmotionResult from synheart-emotion maps to HSI EmotionState
- Output validated against HSI_SPECIFICATION.md
- Comprehensive integration tests ensure compatibility
See the synheart-core documentation for more details on HSI integration.
Performance
Target Performance:
- Latency: < 10ms per inference (ONNX models)
- Model Size: ~200-300 KB per model
- CPU Usage: < 3% during active streaming
- Memory: < 5 MB (engine + buffers + ONNX runtime)
- Accuracy: ~78% on WESAD dataset (binary classification: Baseline vs Stress)
Benchmarks:
- 14-feature extraction: < 3ms
- ONNX model inference: < 5ms
- Full pipeline: < 10ms
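To check these targets on your own hardware, a small timing harness is enough. This is a generic sketch, not part of the library; substitute your own engine call for the stand-in workload:

```python
import statistics
import time

def benchmark(fn, *args, runs=100):
    """Return median wall-clock latency of fn(*args) in milliseconds."""
    fn(*args)  # warm-up call (caches, lazy initialization, etc.)
    samples = []
    for _ in range(runs):
        t0 = time.perf_counter()
        fn(*args)
        samples.append((time.perf_counter() - t0) * 1000.0)
    return statistics.median(samples)

# Stand-in workload; replace with the engine call you want to measure.
median_ms = benchmark(lambda: sum(i * i for i in range(10_000)))
print(f"median latency: {median_ms:.3f} ms")
```

The median is reported rather than the mean because occasional scheduler hiccups would otherwise dominate the figure.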
License
See LICENSE file for details.
Contributing
Contributions are welcome! See our Contributing Guidelines for details.
🔗 Links
- Main Repository: synheart-emotion (Source of Truth)
- Documentation: RFC E1.1
- Model Card: Model Card
- Examples: Examples
- Models: Pre-trained Models
- Tools: Development Tools
- Synheart AI: synheart.ai
- Issues: GitHub Issues
Citation
If you use this package in your research, please cite:
```bibtex
@software{synheart_emotion,
  title   = {Synheart Emotion: On-device emotion inference from biosignals},
  author  = {Goytom, Israel},
  year    = {2025},
  version = {0.0.1},
  url     = {https://github.com/synheart-ai/synheart-emotion}
}
```