Python client for alphainfo.io — Structure-aware analysis for any time series

alphainfo

Python client for the alphainfo Structural Intelligence API.

Detect structural regime changes in time series — biomedical signals, financial markets, energy grids, seismic data, IoT sensors, and more. No model training required.

from alphainfo import AlphaInfo

client = AlphaInfo(api_key="ai_your_key")
result = client.analyze(signal=ecg_data, sampling_rate=360.0, domain="biomedical")

print(result.confidence_band)   # 'stable', 'transition', or 'unstable'
print(result.structural_score)  # 0.0 to 1.0
print(result.analysis_id)       # UUID for audit trail

Installation

pip install alphainfo

# Optional: enable HTTP/2 for better connection efficiency
pip install alphainfo[http2]

Requires Python 3.8+. Core dependency: httpx.

Quick Start

1. Get your API key

Sign up at alphainfo.io/register — free tier includes 50 analyses/month.

2. Analyze a signal

from alphainfo import AlphaInfo

client = AlphaInfo(api_key="ai_your_key")

# Any time series: ECG, market prices, sensor readings, power grid...
result = client.analyze(
    signal=[1.2, 1.3, 1.1, 2.8, 3.1, 3.0, ...],
    sampling_rate=250.0,
    domain="biomedical",
)

if result.change_detected:
    print(f"Regime change detected! Band: {result.confidence_band}")
    print(f"Structural score: {result.structural_score:.3f}")
    print(f"Audit ID: {result.analysis_id}")

3. Structural fingerprint (fast path)

# Extract the 5D structural fingerprint — skips semantic + multiscale for speed
fp = client.fingerprint(signal=data, sampling_rate=250.0, domain="biomedical")

print(fp.structural_score)    # 0.0 to 1.0
print(fp.confidence_band)     # 'stable', 'transition', 'unstable'
print(fp.vector)              # [sim_local, sim_spectral, sim_fractal, sim_transition, sim_trend]

# Use .vector for nearest-neighbor search / ANN indexing
from sklearn.neighbors import NearestNeighbors
vectors = [client.fingerprint(s, 250.0).vector for s in signal_corpus]
nn = NearestNeighbors(n_neighbors=5).fit(vectors)
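Beyond ANN indexing, two fingerprints can be compared directly. A minimal pure-Python sketch using cosine similarity — the helper and the vector values below are illustrative, not part of the SDK:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Illustrative 5D fingerprint vectors (made-up values)
fp_a = [0.91, 0.88, 0.76, 0.12, 0.45]
fp_b = [0.89, 0.85, 0.80, 0.15, 0.43]
print(round(cosine_similarity(fp_a, fp_b), 3))  # close to 1.0 = structurally similar
```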

4. Batch analysis

# Analyze up to 100 signals in one call
batch = client.analyze_batch(
    signals=[signal_1, signal_2, signal_3],
    sampling_rate=1000.0,
    domain="sensors",
)

for item in batch.results:
    if item.success:
        print(f"Signal {item.index}: {item.confidence_band} ({item.structural_score:.3f})")
    else:
        print(f"Signal {item.index}: error — {item.error}")
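Since a single call accepts at most 100 signals, a larger corpus needs to be split first. A small chunking sketch (the helper is ours, not part of the SDK):

```python
def chunked(items, size=100):
    """Yield successive chunks of at most `size` items."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# e.g. 250 signals -> batches of 100, 100, 50
# for group in chunked(all_signals, 100):
#     batch = client.analyze_batch(signals=group, sampling_rate=1000.0)
```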

5. Semantic layer (severity, trend, alerts)

result = client.analyze(
    signal=data, sampling_rate=1.0,
    include_semantic=True,
    baseline=calm_period,
)

if result.semantic:
    print(result.semantic.alert_level)       # 'normal', 'attention', 'alert', 'critical'
    print(result.semantic.severity)          # 'none', 'low', 'moderate', 'high', 'critical'
    print(result.semantic.severity_score)    # 0-100 (higher = more severe)
    print(result.semantic.trend)             # 'stable', 'diverging', 'monitoring'
    print(result.semantic.summary)           # "⚠️ Structural divergence detected (severity: high)"
    print(result.semantic.recommended_action)  # 'log_only', 'monitor', 'human_review', 'immediate_human_review'

# Short signal warning (< 100 samples)
if result.warning:
    print(result.warning)  # "Signal has only 30 samples..."

Severity thresholds:

| severity | severity_score | Meaning |
|----------|----------------|---------|
| none | 0-15 | No structural degradation |
| low | 16-35 | Minor deviation, monitor |
| moderate | 36-65 | Notable change, investigate |
| high | 66-85 | Significant regime shift |
| critical | 86-100 | Severe structural breakdown |
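If you need the same banding client-side (e.g. for local filtering before alerting), the thresholds above can be mirrored in a small helper — illustrative only, not part of the SDK:

```python
def severity_label(score):
    """Map a 0-100 severity_score to its band, per the thresholds table."""
    if score <= 15:
        return "none"
    if score <= 35:
        return "low"
    if score <= 65:
        return "moderate"
    if score <= 85:
        return "high"
    return "critical"

print(severity_label(70))  # high
```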

6. Multi-channel (vector) analysis with per-channel baselines

# Multi-lead ECG, multi-axis accelerometer, cross-asset finance...
vector = client.analyze_vector(
    channels={
        "lead_I": ecg_lead_1,
        "lead_II": ecg_lead_2,
        "lead_III": ecg_lead_3,
    },
    sampling_rate=360.0,
    domain="biomedical",
)

print(f"Aggregated score: {vector.structural_score:.3f}")
print(f"Composite band: {vector.confidence_band}")
for name, ch in vector.channels.items():
    print(f"  {name}: {ch.confidence_band} (score={ch.structural_score:.3f})")

# With per-channel baselines (e.g. calm period reference)
vector = client.analyze_vector(
    channels={"SPY": spy_data, "VIX": vix_data, "GLD": gld_data},
    sampling_rate=1.0,
    baselines={"SPY": spy_calm, "VIX": vix_calm, "GLD": gld_calm},
)

7. Audit trail

# Replay any past analysis
replay = client.audit_replay("550e8400-e29b-41d4-a716-446655440000")
print(f"Original score: {replay.output['structural_score']}")

# List recent analyses
history = client.audit_list(limit=10)
for entry in history:
    print(f"{entry.analysis_id}: {entry.structural_score}")

8. API guide (discoverability)

# Fetch the full encoding guide — endpoints, patterns, tips, debugging
guide = client.guide()
print(guide["version"])            # "1.1"
print(list(guide.keys()))          # all available sections

# Common mistakes
for m in guide["common_mistakes"]:
    print(f"- {m['mistake']}: {m['fix']}")

# Which endpoint to use
for name, info in guide["endpoints"].items():
    print(f"{name}: {info.get('path', '')} ({info.get('when', '')})")

9. Version and compatibility

info = client.version()
print(info["api_version"])                      # "2.2.1"
print(info["sdk_compat"]["recommended_version"])  # "1.5.0"
print(info["features"])                          # dict of supported features
print(info["limits"]["max_batch_size"])           # 100

Async Support

from alphainfo import AsyncAlphaInfo

async with AsyncAlphaInfo(api_key="ai_your_key") as client:
    result = await client.analyze(signal=data, sampling_rate=250.0)
    fp = await client.fingerprint(signal=data, sampling_rate=250.0)

All methods available on AlphaInfo are also available on AsyncAlphaInfo.
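Because each call is awaited, many analyses can be fanned out concurrently with `asyncio.gather`. The sketch below uses a stand-in coroutine in place of the real `await client.analyze(...)` so it runs offline — the stub and its fake score are assumptions for illustration:

```python
import asyncio

async def analyze_stub(signal):
    """Stand-in for `await client.analyze(...)` — returns a fake score."""
    await asyncio.sleep(0)  # yield control, as a real HTTP call would
    return sum(signal) / len(signal)

async def main(signals):
    # gather schedules all requests concurrently on one event loop
    return await asyncio.gather(*(analyze_stub(s) for s in signals))

scores = asyncio.run(main([[1.0, 2.0], [3.0, 5.0]]))
print(scores)  # [1.5, 4.0]
```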

Error Handling

from alphainfo import AlphaInfo, AuthError, RateLimitError, ValidationError

client = AlphaInfo(api_key="ai_your_key")

try:
    result = client.analyze(signal=data, sampling_rate=250.0)
except AuthError:
    print("Invalid API key")
except RateLimitError as e:
    print(f"Rate limited. Retry after {e.retry_after}s")
except ValidationError as e:
    print(f"Invalid input: {e.message}")

Exception hierarchy:

| Exception | HTTP Code | When |
|-----------|-----------|------|
| AuthError | 401 | Invalid or missing API key |
| ValidationError | 400, 413 | Bad input or signal too large |
| RateLimitError | 429 | Quota or concurrency limit exceeded |
| NotFoundError | 404 | Analysis ID not found (audit) |
| APIError | 5xx | Server error |
| TimeoutError | n/a | Request timed out after retries |
| NetworkError | n/a | Connection failed |

All inherit from AlphaInfoError.

Configuration

client = AlphaInfo(
    api_key="ai_your_key",
    base_url="https://alphainfo.io",  # default
    timeout=30.0,                      # seconds (default)
    max_retries=3,                     # automatic retry on transient errors
    retry_base_delay=1.0,              # initial backoff delay (seconds)
    retry_max_delay=32.0,              # max delay between retries (seconds)
    http2=None,                        # auto-detect (True if h2 installed)
)

The client automatically retries on:

  • Network timeouts and connection errors
  • HTTP 429 (rate limits) — respects Retry-After header
  • HTTP 5xx (server errors)

Non-retryable errors (401, 400, 404) are raised immediately.

Backoff is exponential: retry_base_delay * 2^attempt, capped at retry_max_delay.
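With the defaults above (base 1.0s, cap 32.0s), the schedule works out as follows — a sketch of the formula, not the client's internal code:

```python
def retry_delays(max_retries=3, base=1.0, cap=32.0):
    """Delay before each retry attempt: base * 2**attempt, capped at `cap`."""
    return [min(base * 2 ** attempt, cap) for attempt in range(max_retries)]

print(retry_delays())   # [1.0, 2.0, 4.0]
print(retry_delays(7))  # [1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 32.0]
```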

Rate Limit Info

result = client.analyze(signal=data, sampling_rate=250.0)
info = client.rate_limit_info
if info:
    print(f"Remaining: {info.remaining}/{info.limit}")

Signal Size Guide

| Samples | Behavior | Recommendation |
|---------|----------|----------------|
| < 10 | Rejected (422) | Hard minimum |
| 10-49 | Returns 0.5 + warning | Too short for multiscale |
| 50-99 | Returns 0.5 + warning | Limited confidence |
| 100-199 | Variable scores | Detection active, less reliable |
| 200-500 | Reliable scores | Recommended range |
| 500+ | Reliable, may dilute point events | Use windowing for point detection |

Note: sampling_rate controls multiscale window sizing but does not change scores for a given signal. For daily financial data use sampling_rate=1.0; for ECG at 250Hz use sampling_rate=250.0.
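For long recordings where point events could be diluted, one approach is to analyze overlapping windows sized inside the recommended 200-500 range. The helper below and its window/step values are illustrative choices, not SDK defaults:

```python
def sliding_windows(signal, window=300, step=150):
    """Yield overlapping windows of `window` samples, advancing by `step`."""
    for start in range(0, max(len(signal) - window, 0) + 1, step):
        yield signal[start:start + window]

# each window can then be sent to client.analyze(...) separately
print(len(list(sliding_windows(list(range(600))))))  # 3
```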

Domains

| Domain | Use case |
|--------|----------|
| generic | Default — works for any signal |
| biomedical | ECG, EEG, EMG, SpO2 |
| finance | Market prices, returns, volume |
| energy | Power grid frequency, load |
| seismic | Earthquake, vibration sensors |
| sensors | IoT, industrial sensors |
| mlops | Model drift, data quality |
| security | Network traffic, intrusion |
| industrial | Machinery, SCADA |

Guides

All guide content is available programmatically via client.guide() and the live API at GET /v1/guide:

guide = client.guide()  # returns all 15 sections, no auth required

guide["common_mistakes"]   # 10 pitfalls with symptoms and fixes
guide["performance_tips"]  # fast mode, batch vs loop, HTTP/2, retry tuning
guide["debugging_tips"]    # step-by-step troubleshooting + error hierarchy
guide["endpoints"]         # all endpoints — when to use, latency, quota cost

Full markdown versions are also included in the installed package under alphainfo/guides/.

License

MIT
