
SliceSight-Next

Advanced Redis hotspot detection and analysis tool for identifying load imbalances in Redis clusters.

Features

  • Real-time hotspot detection using multiple statistical metrics
  • Adaptive thresholding that adjusts based on cluster size and key distribution
  • Comprehensive analysis with load ratios, coefficient of variation, and Gini coefficients
  • CLI interface for simulation, scanning, and scoring
  • High performance with optimized algorithms for large-scale deployments

Installation

pip install slicesight-next

Quick Start

Command Line Usage

# Simulate key distribution
slicesight-hotshard simulate --keys 1000 --buckets 3 --auto-thresh

# Score existing load distribution
slicesight-hotshard score 100.0 200.0 150.0 --auto-thresh

# Check system health
slicesight-hotshard health

Python API

from slicesight_next import (
    redis_cluster_slot,
    load_ratio,
    calc_cv,
    calc_gini,
    auto_ratio_thresh,
    verdict
)

# Calculate Redis cluster slot
key = "user:12345"
slot = redis_cluster_slot(key)
print(f"Key '{key}' maps to slot {slot}")

# Analyze load distribution
loads = [100.0, 200.0, 150.0]
ratio = load_ratio(loads)
cv = calc_cv(loads)
gini = calc_gini(loads)

print(f"Load ratio: {ratio:.2f}")
print(f"Coefficient of variation: {cv:.2f}")
print(f"Gini coefficient: {gini:.2f}")

# Use adaptive threshold
n_keys = 1000
n_buckets = 3
threshold = auto_ratio_thresh(n_keys, n_buckets)
print(f"Adaptive threshold: {threshold:.3f}")

# Generate hotspot verdict
result = verdict(ratio, cv, gini, 0.1, threshold, 0.05)
print(f"Hotspot detected: {result['hotspot_detected']}")
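The package does not document how `redis_cluster_slot` works internally, but the standard Redis Cluster mapping is CRC16 (XMODEM variant) of the key modulo 16384, hashing only the `{hash tag}` substring when one is present. A standalone sketch of that mapping, independent of `slicesight_next`:

```python
def crc16_xmodem(data: bytes) -> int:
    """CRC16-CCITT (XMODEM variant), the checksum Redis Cluster uses for key slots."""
    crc = 0
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

def cluster_slot(key: str) -> int:
    """Map a key to one of the 16384 Redis Cluster hash slots, honoring {hash tags}."""
    # If the key contains a non-empty {...} section, only that section is hashed,
    # so keys sharing a tag land on the same slot.
    start = key.find("{")
    if start != -1:
        end = key.find("}", start + 1)
        if end != -1 and end > start + 1:
            key = key[start + 1 : end]
    return crc16_xmodem(key.encode()) % 16384
```

Keys with the same hash tag, such as `{user}:profile` and `{user}:sessions`, map to the same slot, which is how Redis supports multi-key operations in cluster mode.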

Adaptive Threshold Formula

The adaptive threshold automatically adjusts based on your cluster configuration:

ρ_auto(n, k) = 1 / (1 + 3 · √((k − 1) / n))

Where:

  • n = number of keys
  • k = number of buckets/nodes

This formula ensures that:

  • A single bucket (k = 1) gives a threshold of exactly 1
  • More buckets (larger k) lower the threshold, tolerating the extra sampling noise of spreading keys across more nodes
  • More keys (larger n) raise the threshold toward 1, since a large sample should hash almost uniformly
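Reading 3√ as 3 · √ (multiplication, not a cube root), the formula is a one-liner. A minimal sketch, which may differ from the library's own `auto_ratio_thresh`:

```python
import math

def auto_ratio_thresh(n_keys: int, n_buckets: int) -> float:
    """Adaptive threshold: rho = 1 / (1 + 3 * sqrt((k - 1) / n))."""
    return 1.0 / (1.0 + 3.0 * math.sqrt((n_buckets - 1) / n_keys))

print(auto_ratio_thresh(1000, 3))  # ≈ 0.882
```

With n = 1000 keys over k = 3 buckets the threshold is about 0.882; growing n toward 10000 pushes it up toward 1, while growing k to 10 pulls it down to about 0.778.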

CLI Commands

simulate

Simulate Redis key distribution and detect hotspots:

slicesight-hotshard simulate [OPTIONS]

Options:
  --keys INTEGER          Number of keys to simulate [default: 1000]
  --buckets INTEGER       Number of buckets/nodes [default: 3]
  --ratio-thresh FLOAT    Load ratio threshold
  --auto-thresh          Use adaptive threshold
  --p-thresh FLOAT       P-value threshold [default: 0.05]
  --json                 Output in JSON format
  --seed INTEGER         Random seed for reproducibility

score

Score given load distribution:

slicesight-hotshard score LOADS... [OPTIONS]

Arguments:
  LOADS...  Load values for each node

Options:
  --buckets INTEGER       Number of buckets
  --ratio-thresh FLOAT    Load ratio threshold
  --auto-thresh          Use adaptive threshold
  --p-thresh FLOAT       P-value threshold [default: 0.05]
  --json                 Output in JSON format

scan

Scan a live Redis instance (planned; not yet implemented):

slicesight-hotshard scan [OPTIONS]

Options:
  --host TEXT            Redis host [default: localhost]
  --port INTEGER         Redis port [default: 6379]
  --buckets INTEGER      Number of buckets/nodes [default: 3]
  --auto-thresh         Use adaptive threshold
  --json               Output in JSON format

Metrics Explained

Load Ratio

The ratio of the load on the most-loaded node to the load on the least-loaded node:

  • 1.0 = Perfect balance
  • >2.0 = Potential hotspot concern
  • >5.0 = Significant imbalance
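Taking the definition above at face value (maximum load divided by minimum load, assuming all loads are positive), a minimal sketch of how such a ratio could be computed — the library's actual `load_ratio` may differ:

```python
def load_ratio(loads: list[float]) -> float:
    """Ratio of the heaviest node's load to the lightest's (1.0 = perfect balance)."""
    return max(loads) / min(loads)

print(load_ratio([100.0, 200.0, 150.0]))  # → 2.0
```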

Coefficient of Variation (CV)

Measures relative variability in the distribution:

  • 0.0 = No variation (perfect balance)
  • >1.0 = High variability, potential hotspots
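The CV is conventionally the standard deviation divided by the mean; a sketch using the population standard deviation (the library may use the sample variant instead):

```python
import statistics

def calc_cv(loads: list[float]) -> float:
    """Coefficient of variation: population standard deviation over the mean."""
    return statistics.pstdev(loads) / statistics.mean(loads)

print(calc_cv([100.0, 200.0, 150.0]))  # ≈ 0.272
```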

Gini Coefficient

Measures inequality in load distribution:

  • 0.0 = Perfect equality
  • >0.5 = Significant inequality
  • 1.0 = Maximum inequality
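One common closed form for the Gini coefficient of sorted data is G = 2 · Σ i·xᵢ / (n · Σ xᵢ) − (n + 1)/n. A sketch using that formula (the library's `calc_gini` may be implemented differently):

```python
def calc_gini(loads: list[float]) -> float:
    """Gini coefficient of the load distribution (0 = equal, 1 = maximal inequality)."""
    xs = sorted(loads)
    n = len(xs)
    total = sum(xs)
    # Rank-weighted sum over sorted values: i runs from 1 to n
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return 2.0 * weighted / (n * total) - (n + 1) / n

print(calc_gini([100.0, 200.0, 150.0]))  # ≈ 0.148
```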

Chi-square P-value

Tests if distribution differs significantly from uniform:

  • >0.05 = Distribution appears uniform
  • <0.05 = Significant deviation from uniform
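The chi-square statistic against a uniform expectation is straightforward to compute in plain Python; converting it to a p-value requires the chi-square survival function with k − 1 degrees of freedom (e.g. `scipy.stats.chi2.sf`), which is presumably what the library does internally:

```python
def chi_square_stat(loads: list[float]) -> float:
    """Chi-square statistic of observed loads against a uniform expectation."""
    expected = sum(loads) / len(loads)
    return sum((obs - expected) ** 2 / expected for obs in loads)

print(chi_square_stat([100.0, 200.0, 150.0]))  # ≈ 33.33
```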

Development

Setup

git clone https://github.com/slicesight/slicesight-next.git
cd slicesight-next
pip install -e ".[dev,test]"

Testing

# Run all tests
pytest

# Run with coverage
pytest --cov=slicesight_next --cov-report=html

# Run performance benchmarks
pytest tests/performance/ -v

# Run property-based tests
pytest tests/property/ -v

Code Quality

# Format and lint
ruff check .
ruff format .

# Type checking
mypy slicesight_next --strict

# Security scan
bandit -r slicesight_next

Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Performance

SliceSight-Next is optimized for production use:

  • CRC16 calculation: >100k ops/sec
  • Slot distribution: >50k keys/sec
  • Metrics calculation: >10k distributions/sec
  • Memory efficient: O(1) space complexity for most operations

Use Cases

  • Redis Cluster Monitoring: Detect hotspots in production clusters
  • Load Testing: Analyze key distribution in test scenarios
  • Capacity Planning: Model cluster behavior under different loads
  • Performance Tuning: Identify and resolve load imbalances

💬 Feedback & Support

We'd love to hear from you! SliceSight-Next is actively seeking user feedback to improve.

Quick Feedback

# Submit feedback directly via CLI
slicesight-hotshard feedback "Your thoughts here"
slicesight-hotshard feedback "Found a bug" --category bug --email you@company.com

Community & Support

What We're Looking For

  • Real-world Redis key patterns you're testing
  • Performance feedback on large datasets
  • Feature requests for better Redis monitoring
  • Use cases we haven't considered
  • Integration pain points
