Liberal Alpha Python SDK

This SDK provides historical upload, historical download, local runner send, and real-time subscription APIs (HTTP + gRPC + WebSocket).

Install

pip install libre-alpha
# Optional: override default API base (default is https://api.librealpha.com)
export LIBALPHA_API_BASE="https://api.librealpha.com"

# Auth (X-API-Key)
export LIBALPHA_API_KEY="YOUR_API_KEY"

# Optional: record encryption key (used to encrypt uploads for encrypted records, and decrypt encrypted payloads)
export LIBALPHA_RECORD_ENCRYPTION_KEY="YOUR_RECORD_ENCRYPTION_KEY"

# Optional: local liberal_runner gRPC address for send_data()
export LIBALPHA_RUNNER_HOST="127.0.0.1"
export LIBALPHA_RUNNER_PORT="8128"

Initialize Client

from liberal_alpha.client import LiberalAlphaClient

# api_base defaults to https://api.librealpha.com
# can be overridden by env LIBALPHA_API_BASE or by passing api_base=...
client = LiberalAlphaClient(
    api_key="YOUR_API_KEY",            # optional if using env LIBALPHA_API_KEY
)

Historical Upload API (Python) - protobuf stream (/api/entries/history/upload)

This is a **historical backfill** uploader that sends protobuf `DataEntry` messages in a length-prefixed stream:

- Request body format: `[4-byte big-endian length][DataEntry][4-byte length][DataEntry]...`
- Backend groups entries **by minute**; this SDK implementation will also **bucket by minute** and send **1 minute per request**.
- Auth: `X-API-Key` (required).
- Data is carried per-symbol in `symbol_values`:
  - For non-encrypted records: `symbol_values[i].values.items = [float, ...]`
  - For encrypted records: `symbol_values[i].encrypted_payload = <bytes>` (AES-256-GCM)
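The length-prefixed framing above can be sketched in plain Python. Raw byte strings stand in for serialized `DataEntry` messages, and `frame_messages` / `unframe_messages` are illustrative helper names, not SDK functions:

```python
import struct

def frame_messages(payloads):
    """Pack serialized messages into a [4-byte big-endian length][payload]... stream."""
    out = bytearray()
    for p in payloads:
        out += struct.pack(">I", len(p))  # 4-byte big-endian length prefix
        out += p
    return bytes(out)

def unframe_messages(stream):
    """Split a length-prefixed stream back into the original payloads."""
    msgs, offset = [], 0
    while offset < len(stream):
        (n,) = struct.unpack_from(">I", stream, offset)
        offset += 4
        msgs.append(stream[offset:offset + n])
        offset += n
    return msgs
```

In the real upload, each payload would be a `DataEntry` serialized with protobuf; the framing itself is independent of the message contents.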

Client method (recommended):

```python
def upload_data(
    record_id: int,
    df: pd.DataFrame,
    encryption_key: str | None = None,
) -> None:
    ...
```

  • Auth is always taken from client.api_key (or env LIBALPHA_API_KEY when constructing the client).
  • encryption_key: record encryption key used to encrypt the payload when the record is encrypted. If omitted, the SDK falls back to env LIBALPHA_RECORD_ENCRYPTION_KEY.

Note: the SDK also exposes a module-level upload_data(...) with the same arguments; client.upload_data(...) delegates to it internally. (The signature is shown only once here to avoid confusion.)

DataFrame format:

  • Required columns:
    • symbol (str): symbol_id string (e.g. "1", "3"). If you pass non-numeric symbol names, the SDK will best-effort map them to symbol_id via /api/records/user-records (target_symbols).
    • timestamp (datetime64 or unix timestamp; will be normalized to UTC and uploaded in microseconds)
  • Feature columns (choose ONE approach):
    • Provide a features column containing a list of floats (or a string like "[1.0, nan, 2.0]"), OR
    • Provide user-defined float columns matching the record schema; the SDK will fetch feature order from /api/records/{id}.
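As a concrete illustration of the `features`-column approach, a minimal valid frame might look like this (the symbol ids, timestamps, and values are made up):

```python
import pandas as pd

# One list of floats per row; timestamps as unix microseconds (UTC).
df = pd.DataFrame(
    {
        "symbol": ["1", "1", "3"],
        "timestamp": [1710000000000000, 1710000060000000, 1710000000000000],
        "features": [[1.0, 2.0], [1.5, 2.5], [0.9, 1.1]],
    }
)
```

With the alternative approach, the `features` column would be replaced by one float column per feature, named to match the record schema returned by /api/records/{id}.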

Upload packing behavior:

  • 1 request = 1 minute (backend requirement)
  • Within that minute, the SDK aggregates rows by exact timestamp into 1 DataEntry.
  • Each aggregated DataEntry carries multiple symbols in symbol_values: [{symbol_id, values.items}, {symbol_id, values.items}, ...]
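The packing rules above amount to a two-level grouping, which can be sketched with plain data structures standing in for the protobuf types (a sketch of the behavior, not the SDK's actual code):

```python
from collections import defaultdict

rows = [
    # (symbol_id, timestamp_us, features)
    ("1", 1710000000000000, [1.0]),
    ("3", 1710000000000000, [2.0]),
    ("1", 1710000030000000, [1.5]),
    ("1", 1710000060000000, [1.2]),  # next minute -> goes in a separate request
]

# 1) bucket rows by minute: one upload request per bucket
by_minute = defaultdict(list)
for symbol, ts, feats in rows:
    by_minute[ts // 60_000_000].append((symbol, ts, feats))

# 2) within a minute, aggregate rows sharing an exact timestamp into one entry,
#    carrying all symbols for that timestamp in symbol_values
def build_entries(minute_rows):
    entries = defaultdict(list)  # timestamp_us -> symbol_values list
    for symbol, ts, feats in minute_rows:
        entries[ts].append({"symbol_id": symbol, "values": {"items": feats}})
    return entries
```

Here the first two rows share a timestamp and so end up as two symbol_values inside one DataEntry, while the last row lands in a second minute bucket and therefore a second request.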

Example (upload one minute from a previously-downloaded CSV):

import pandas as pd
from liberal_alpha.client import LiberalAlphaClient

df = pd.read_csv("history_1h_record12_BTC.csv")

# pick the first minute only (backend requires 1 minute per upload)
first_minute = int(df["timestamp"].iloc[0] // 60_000_000)
df1 = df[(df["timestamp"] // 60_000_000) == first_minute].copy()
df1 = df1[["symbol", "timestamp", "features"]]

client = LiberalAlphaClient(api_key="YOUR_API_KEY")
client.upload_data(
    record_id=12,
    df=df1,
    # encryption_key="YOUR_RECORD_ENCRYPTION_KEY",  # only needed if record is encrypted (or set LIBALPHA_RECORD_ENCRYPTION_KEY)
)

Historical File Download API (Python) - protobuf (length-prefixed)

def download_historical_data(
    record_id: int,
    symbols: list[str],
    start: datetime | str | int,
    end: datetime | str | int,
    encryption_key: str | None = None,
) -> pandas.DataFrame:
    ...

Parameters:

  • record_id: the record id to download
  • symbols: list of symbol names (e.g. ["BINANCE_BTCUSDT"]). If you pass [], the SDK will best-effort fetch all symbols from /api/records/user-records (target_symbols).
  • start/end: datetime | ISO string | unix timestamp (sec/ms/us)
  • encryption_key: optional, record encryption key for decrypting symbol_values[].encrypted_payload
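Since start/end accept unix timestamps in seconds, milliseconds, or microseconds, the SDK has to guess the unit. A plausible magnitude-based heuristic (an assumption about the internals, shown only to clarify the accepted inputs) is:

```python
import datetime as dt

def to_microseconds(value):
    """Normalize datetime | ISO string | unix sec/ms/us to microseconds.

    Unit detection by magnitude is an assumption, not the SDK's documented
    behavior: values below 1e11 are treated as seconds, below 1e14 as
    milliseconds, otherwise as microseconds.
    """
    if isinstance(value, str):
        value = dt.datetime.fromisoformat(value)
    if isinstance(value, dt.datetime):
        if value.tzinfo is None:
            value = value.replace(tzinfo=dt.timezone.utc)  # treat naive as UTC
        return int(value.timestamp() * 1_000_000)
    v = int(value)
    if v < 10**11:
        return v * 1_000_000   # seconds
    if v < 10**14:
        return v * 1_000       # milliseconds
    return v                   # already microseconds
```

Passing timezone-aware datetimes, as in the example below, avoids any ambiguity.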

Notes:

  • Backend /api/entries/download-links enforces end-start <= 24h (microseconds). The SDK automatically splits long ranges into multiple 24h windows and merges results locally.
  • Returned timestamps are in microseconds.
  • DataEntry uses repeated SymbolValues symbol_values; each row corresponds to one symbol_id.
  • When data is encrypted (symbol_values[].encrypted_payload present), pass encryption_key to decrypt.
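The 24h splitting described above can be sketched as follows (a minimal illustration of the windowing, with `split_windows` as a hypothetical helper name):

```python
DAY_US = 24 * 60 * 60 * 1_000_000  # 24h in microseconds

def split_windows(start_us, end_us, max_span_us=DAY_US):
    """Split [start_us, end_us) into consecutive windows of at most max_span_us."""
    windows = []
    cursor = start_us
    while cursor < end_us:
        windows.append((cursor, min(cursor + max_span_us, end_us)))
        cursor += max_span_us
    return windows
```

Each window then maps to one /api/entries/download-links request, and the per-window results are concatenated into the returned DataFrame.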

Example

from liberal_alpha.client import LiberalAlphaClient
import datetime as dt

client = LiberalAlphaClient(
    api_key="YOUR_API_KEY",
    api_base="https://api.librealpha.com",
)

start = dt.datetime(2026, 1, 1, 0, 0, 0, tzinfo=dt.timezone.utc)
end   = dt.datetime(2026, 1, 3, 0, 0, 0, tzinfo=dt.timezone.utc)  # >24h is OK (SDK will split)

df = client.download_historical_data(
    record_id=2,
    symbols=["BINANCE_BTCUSDT"],
    start=start,
    end=end,
    encryption_key="YOUR_RECORD_ENCRYPTION_KEY",  # for decrypting encrypted data
)

print(df.head())
print("rows:", len(df))

Local Runner Send API (Python) - gRPC ProcessJson

send_data(...) is still available in the Python SDK. It sends one JSON payload to a local liberal_runner through the gRPC JsonService/ProcessJson RPC.

Client method:

def send_data(
    *,
    identifier: str,
    data: dict,
    record_id: int,
    event_type: str = "raw",
    timeout: float | None = None,
):
    ...

Parameters:

  • identifier: caller-defined unique id for this payload. The SDK passes it to runner via request metadata.
  • data: JSON-serializable payload to send.
  • record_id: target backend record id. This is required and is sent in gRPC metadata.
  • event_type: runner processor type. Common values are raw and hash. For normal sends, use raw.
  • timeout: optional gRPC timeout in seconds. Defaults to the client timeout.

Notes:

  • Transport is gRPC, not HTTP.
  • The SDK sends to LIBALPHA_RUNNER_HOST:LIBALPHA_RUNNER_PORT (defaults to 127.0.0.1:8128), or to host= / port= passed when constructing LiberalAlphaClient.
  • The underlying RPC is JsonService/ProcessJson.
  • send_data(...) is the current send entrypoint for local runner usage. There is no separately documented send_alpha(...) API in the current SDK implementation.
  • send_data(...) can also send alpha-style payloads. In practice, runner does not distinguish "data" vs "alpha" by RPC method; what matters is that record_id and the fields inside data match the target record schema.
  • For alpha records, send the alpha fields directly in data (for example alpha_value) and normally keep event_type="raw".
  • If you include symbol in data, runner will use it. If omitted, runner may try to infer it from record configuration.

Example: send normal data to local runner

from liberal_alpha import LiberalAlphaClient

client = LiberalAlphaClient(
    api_key="YOUR_API_KEY",
    host="127.0.0.1",
    port=8128,
)

resp = client.send_data(
    identifier="test_data_001",
    record_id=13,
    event_type="raw",
    data={
        "symbol": "BINANCE_BTCUSDT",
        "Price": 150,
        "Volume": 200,
        "timestamp_ms": 1710000000000,
    },
)

print(resp)

Example: send alpha data to local runner

from liberal_alpha import LiberalAlphaClient

client = LiberalAlphaClient(
    api_key="YOUR_API_KEY",
    host="127.0.0.1",
    port=8128,
)

resp = client.send_data(
    identifier="alpha_001",
    record_id=4,
    event_type="raw",
    data={
        "symbol": "BINANCE_BTCUSDT",
        "alpha_value": 0.1234,
        "timestamp_ms": 1710000000000,
    },
)

print(resp)

Real-time Subscribe API (Python) - WebSocket ws/data

from typing import List, Callable

def subscribe_data(
    record_id: int,
    *,
    symbols: List[str],
    on_data_fn: Callable[[str, pd.DataFrame], None],
    encryption_key: str | None = None,
) -> None:
    """Subscribes to live data for a record into a DataFrame (schema matches publish).

    Args:
        record_id: Backend record identifier.
        symbols: Symbols to filter by (required).
        on_data_fn: Callback function invoked upon receiving new data. The
            function takes two arguments: the symbol (str) and a DataFrame
            containing the new data for that symbol.
        encryption_key: Optional, record encryption key for decrypting encrypted_payload.
    """

Parameters:

  • record_id: must be in user's subscriptions (from /api/subscriptions)
  • symbols: symbol_id strings to filter by (required). Only rows with symbol_id in this list are passed to on_data_fn.
  • on_data_fn: callback invoked upon receiving new data. Takes (symbol: str, df: DataFrame).
  • encryption_key: optional, record encryption key for decrypting symbol_values[].encrypted_payload; required if data is encrypted.

Notes:

  • Connects to ws/data, sends { wallet_address, record_id } on open.
  • If data is encrypted (symbol_values[].encrypted_payload has value), pass encryption_key or set LIBALPHA_RECORD_ENCRYPTION_KEY.
  • DataFrame schema matches download_historical_data (record_id, symbol, features/expanded feature columns, timestamps).
  • Blocking; run in a daemon thread for long-lived subscription.

Example

import threading
import pandas as pd
from liberal_alpha import LiberalAlphaClient

client = LiberalAlphaClient(api_key="YOUR_API_KEY", api_base="https://api.librealpha.com")

def on_data_fn(symbol: str, df: pd.DataFrame):
    print(f"[{symbol}] entry_id={df['entry_id'].iloc[0]} features_len={len(df['features'].iloc[0])}")

t = threading.Thread(
    target=client.subscribe_data,
    kwargs={"record_id": 2, "symbols": ["1", "3"], "on_data_fn": on_data_fn, "encryption_key": "YOUR_RECORD_ENCRYPTION_KEY"},
    daemon=True,
)
t.start()

