
Liberal Alpha Python SDK for interacting with the Liberal Alpha backend (HTTP + WebSocket)


Liberal Alpha Python SDK (Historical HTTP APIs)

This SDK provides historical upload, historical download, and real-time subscription APIs (HTTP + WebSocket).

Install

pip install liberal_alpha
# Optional: override default API base (default is https://api.librealpha.com)
export LIBALPHA_API_BASE="https://api.librealpha.com"

# Upload auth (X-API-Key)
export LIBALPHA_API_KEY="YOUR_API_KEY"

# Download auth (used to obtain JWT via /api/users/auth)
export LIBALPHA_PRIVATE_KEY="0xYOUR_PRIVATE_KEY"

Optional upload tuning:

export LIBALPHA_UPLOAD_BATCH_ID="12345"      # optional batch id (int)
export LIBALPHA_UPLOAD_CHUNK_SIZE="1048576"  # default 1MB
export LIBALPHA_UPLOAD_RESUME="1"            # 1=enable resume, 0=disable
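The tuning variables above can be read with their documented defaults (1 MB chunks, resume enabled) as in this sketch; the exact parsing inside the SDK may differ:

```python
import os

# Read the documented upload-tuning variables, falling back to defaults.
batch_id_raw = os.getenv("LIBALPHA_UPLOAD_BATCH_ID")
batch_id = int(batch_id_raw) if batch_id_raw else None          # optional int
chunk_size = int(os.getenv("LIBALPHA_UPLOAD_CHUNK_SIZE", "1048576"))  # 1 MB default
resume = os.getenv("LIBALPHA_UPLOAD_RESUME", "1") == "1"        # enabled by default
```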

Initialize Client

from liberal_alpha.client import LiberalAlphaClient

# api_base defaults to https://api.librealpha.com
# can be overridden by env LIBALPHA_API_BASE or by passing api_base=...
client = LiberalAlphaClient(
    api_key="YOUR_API_KEY",            # optional if using env LIBALPHA_API_KEY
    private_key="0xYOUR_PRIVATE_KEY",  # optional if using env LIBALPHA_PRIVATE_KEY
)

Historical Upload API (Python)

def upload_data(record_id: int, df: pandas.DataFrame) -> bool:
    pass

DataFrame Format

Your DataFrame must contain these columns:

- record_id (int)
- symbol (str)
- data (dict), e.g. {"open": 50000.0, "close": 51000.0}
- timestamp (int), milliseconds since epoch (seconds are also accepted and auto-converted to milliseconds)
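The seconds-to-milliseconds auto-conversion can be illustrated with a simple digit-count heuristic (a sketch, not necessarily the SDK's exact rule):

```python
def to_millis(ts: int) -> int:
    """Treat values below 10**12 as seconds and scale to milliseconds.

    A millisecond epoch for any recent date has 13 digits, while a
    second epoch has 10, so this threshold cleanly separates the two.
    """
    return ts * 1000 if ts < 10**12 else ts
```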

Example

import pandas as pd
from liberal_alpha.client import LiberalAlphaClient

client = LiberalAlphaClient(
    api_key="YOUR_API_KEY",
    # api_base defaults to https://api.librealpha.com
)

sample_data = []
for i in range(1000):
    sample_data.append({
        "record_id": 4,
        "symbol": "BNfBTC",
        "data": {"open": 50000.0 + i, "close": 51000.0 + i},
        "timestamp": 1733299200000 + i * 1000,
    })

df = pd.DataFrame(sample_data)

# Optional batch_id: set via env because public API has only 2 args
# export LIBALPHA_UPLOAD_BATCH_ID=12345
ok = client.upload_data(record_id=4, df=df)
print("Upload ok:", ok)

Notes:

- Upload uses X-API-Key authentication.
- Upload is chunked and supports resume when enabled (enabled by default).
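Conceptually, a chunked upload with resume works like the sketch below; `send_chunk` is a hypothetical transport callback, not SDK API:

```python
import io

CHUNK_SIZE = 1_048_576  # 1 MB, the default of LIBALPHA_UPLOAD_CHUNK_SIZE

def upload_in_chunks(payload: bytes, send_chunk, start_offset: int = 0) -> int:
    """Send `payload` in fixed-size chunks, resuming from start_offset.

    `send_chunk(offset, chunk)` delivers one chunk; resuming simply
    means seeking past the bytes the server already has.
    Returns the number of chunks sent.
    """
    buf = io.BytesIO(payload)
    buf.seek(start_offset)
    sent = 0
    while True:
        chunk = buf.read(CHUNK_SIZE)
        if not chunk:
            break
        send_chunk(start_offset + sent * CHUNK_SIZE, chunk)
        sent += 1
    return sent
```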

Historical Upload API (Python) - protobuf stream (/api/entries/history/upload)

This is a **historical backfill** uploader that sends protobuf `DataEntry` messages in a length-prefixed stream:

- Request body format: `[4-byte big-endian length][DataEntry][4-byte length][DataEntry]...`
- Backend groups entries **by minute**; this SDK implementation will also **bucket by minute** and send **1 minute per request**.
- Auth: `X-API-Key` (required).
- For encrypted records: features are AES-256-GCM encrypted and written to `encrypted_payload`.
- For non-encrypted records: features are written to `features` (repeated double).
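The length-prefixed request body format can be produced (and parsed back) with the standard `struct` module alone; here opaque byte strings stand in for serialized `DataEntry` messages:

```python
import struct

def frame_entries(payloads):
    """Concatenate messages as [4-byte big-endian length][payload]...,
    the request body format described above."""
    out = bytearray()
    for p in payloads:
        out += struct.pack(">I", len(p)) + p
    return bytes(out)

def unframe_entries(body: bytes):
    """Inverse of frame_entries: split a length-prefixed body back
    into the individual message payloads."""
    payloads, i = [], 0
    while i < len(body):
        (n,) = struct.unpack_from(">I", body, i)
        payloads.append(body[i + 4 : i + 4 + n])
        i += 4 + n
    return payloads
```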

Public function:

```python
def upload_data(
    record_id: int,
    df: pd.DataFrame,
    api_key: str | None = None,
    private_key: str | None = None,
) -> None:
    pass
```

DataFrame format:

  • Required columns:
    • symbol (str)
    • timestamp (datetime64 or unix timestamp; will be normalized to UTC and uploaded in microseconds)
  • Feature columns (choose ONE approach):
    • Provide a features column containing a list of floats (or a string like "[1.0, nan, 2.0]"), OR
    • Provide user-defined float columns matching the record schema; the SDK will fetch feature order from /api/records/{id}.
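Normalizing the features column to a list of floats might look like the following sketch; `float()` happily parses the `nan` tokens in the string form, so no special casing is needed:

```python
def parse_features(value):
    """Accept either a list of numbers or a string like "[1.0, nan, 2.0]"
    and return a list of floats (nan entries preserved)."""
    if isinstance(value, str):
        inner = value.strip().strip("[]")
        return [float(tok) for tok in inner.split(",") if tok.strip()]
    return [float(v) for v in value]
```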

Example (upload one minute from a previously-downloaded CSV):

import pandas as pd
from liberal_alpha import upload_data

df = pd.read_csv("history_1h_record12_BTC.csv")

# pick the first minute only (backend requires 1 minute per upload)
first_minute = int(df["timestamp"].iloc[0] // 60_000_000)
df1 = df[df["timestamp"].apply(lambda x: int(x // 60_000_000) == first_minute)].copy()
df1 = df1[["symbol", "timestamp", "features"]]

upload_data(
    record_id=12,
    df=df1,
    api_key="YOUR_API_KEY",
    private_key="YOUR_RECORD_PRIVATE_KEY",  # only needed if record is encrypted
)

Historical Download API (Python)

def download_data(
    record_id: int,
    symbols: list[str],
    dates: list[int],
    tz_info: datetime.tzinfo | str = "Asia/Singapore",
) -> pandas.DataFrame:
    pass

Historical File Download API (Python) - protobuf (length-prefixed)

def download_history_data(
    record_id: int,
    symbol: str,
    start: datetime | str | int,
    end: datetime | str | int,
    private_key: str | None = None,
) -> pandas.DataFrame:
    pass

Parameters:

  • record_id: the record id to download
  • symbol: symbol string, e.g. "ADA", "BTCUSDT"
  • start/end: datetime | ISO string | unix timestamp (sec/ms/us)
  • private_key: optional, record private key for decrypting encrypted_payload when features are empty
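Accepting datetime, ISO string, or a unix timestamp in seconds/milliseconds/microseconds could be handled as below; the digit-count thresholds are a heuristic sketch, not necessarily the SDK's exact rule:

```python
from datetime import datetime, timezone

def to_micros(value):
    """Normalize datetime | ISO string | unix timestamp (sec/ms/us)
    to microseconds since epoch."""
    if isinstance(value, datetime):
        return int(value.timestamp() * 1_000_000)
    if isinstance(value, str):
        dt = datetime.fromisoformat(value)
        if dt.tzinfo is None:
            dt = dt.replace(tzinfo=timezone.utc)  # assume UTC for naive strings
        return int(dt.timestamp() * 1_000_000)
    ts = int(value)
    if ts < 10**12:          # ~10 digits: seconds
        return ts * 1_000_000
    if ts < 10**15:          # ~13 digits: milliseconds
        return ts * 1_000
    return ts                # already microseconds
```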

Notes:

  • Backend /api/entries/download-links enforces end-start <= 24h (microseconds). The SDK automatically splits long ranges into multiple 24h windows and merges results locally.
  • Returned timestamps are in microseconds.
  • DataEntry uses repeated double features; each row has features as a list of floats.
  • When data is encrypted (features empty, encrypted_payload present), pass private_key to decrypt. Plaintext format: uint32 count + float64[count] (AES-GCM, key = sha256(private_key)).
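After AES-GCM decryption, the plaintext layout described above (uint32 count + float64[count]) can be unpacked with `struct`. Byte order is not stated on this page; little-endian is an assumption here:

```python
import struct

def unpack_features(plaintext: bytes):
    """Parse `uint32 count` followed by `float64[count]`.
    Little-endian is an assumption; switch "<" to ">" if the
    backend turns out to use big-endian."""
    (count,) = struct.unpack_from("<I", plaintext, 0)
    return list(struct.unpack_from(f"<{count}d", plaintext, 4))

# Build a matching plaintext and round-trip it:
payload = struct.pack("<I", 3) + struct.pack("<3d", 1.5, 2.5, 3.5)
print(unpack_features(payload))  # [1.5, 2.5, 3.5]
```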

Example

from liberal_alpha.client import LiberalAlphaClient
import datetime as dt

client = LiberalAlphaClient(
    private_key="0xYOUR_PRIVATE_KEY",  # or api_key="YOUR_API_KEY"
    api_base="https://api.librealpha.com",
)

start = dt.datetime(2026, 1, 1, 0, 0, 0, tzinfo=dt.timezone.utc)
end   = dt.datetime(2026, 1, 3, 0, 0, 0, tzinfo=dt.timezone.utc)  # >24h is OK (SDK will split)

df = client.download_history_data(
    record_id=2,
    symbol="ADA",
    start=start,
    end=end,
    private_key="YOUR_RECORD_PRIVATE_KEY",  # for decrypting encrypted data
)

print(df.head())
print("rows:", len(df))

Real-time Subscribe API (Python) - WebSocket ws/data

from typing import List, Callable

def subscribe_data(
    record_id: int,
    *,
    symbols: List[str],
    on_data_fn: Callable[[str, pd.DataFrame], None],
    private_key: str | None = None,
) -> None:
    """Subscribe to live data for a record; updates are delivered as DataFrames (schema matches download_history_data).

    Args:
        record_id: Backend record identifier.
        symbols: Symbols to filter by (required).
        on_data_fn: Callback function invoked upon receiving new data. The
            function takes two arguments: the symbol (str) and a DataFrame
            containing the new data for that symbol.
        private_key: Optional, record private key for decrypting encrypted_payload.
    """

Parameters:

  • record_id: must be in user's subscriptions (from /api/subscriptions)
  • symbols: symbols to filter by (required). Only rows with symbol in this list are passed to on_data_fn.
  • on_data_fn: callback invoked upon receiving new data. Takes (symbol: str, df: DataFrame).
  • private_key: optional, record private key for decrypting encrypted_payload; required if data is encrypted.

Notes:

  • Connects to ws/data, sends { wallet_address, record_id } on open.
  • If data is encrypted (encrypted_payload has value), pass private_key or set LIBALPHA_RECORD_PRIVATE_KEY.
  • DataFrame schema matches download_history_data (entry_id, record_id, symbol, features, timestamps).
  • Blocking; run in a daemon thread for long-lived subscription.

Example

import threading
import pandas as pd
from liberal_alpha import LiberalAlphaClient

client = LiberalAlphaClient(api_key="YOUR_API_KEY", api_base="https://api.librealpha.com")

def on_data_fn(symbol: str, df: pd.DataFrame):
    print(f"[{symbol}] entry_id={df['entry_id'].iloc[0]} features_len={len(df['features'].iloc[0])}")

t = threading.Thread(
    target=client.subscribe_data,
    kwargs={"record_id": 2, "symbols": ["YFI", "BTCUSDT"], "on_data_fn": on_data_fn, "private_key": "YOUR_RECORD_KEY"},
    daemon=True,
)
t.start()

Historical Download API (download_data) Parameters

- record_id: the record id to download
- symbols: list of symbols, e.g. ["BTCUSDT", "ETHUSDT"]. If you pass [], the SDK automatically fetches all symbols (if supported by the backend).
- dates: list of local dates in YYYYMMDD format, e.g. [20251214, 20251215]. If you pass [], no date filter is applied.
- tz_info: controls how local_date is computed. Accepts an IANA tz string such as "Asia/Singapore", numeric offsets like 8 or -4, or strings like "+8" / "-4". 8 means UTC+8 (SGT/HKT); -4 means UTC-4 (NYC during DST).
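A sketch of how a YYYYMMDD local_date can be derived from a microsecond timestamp under the tz_info forms listed above (IANA name, int offset, or "+8"-style string); the SDK's internal logic may differ:

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

def local_date(ts_us: int, tz_info="Asia/Singapore") -> int:
    """Compute a YYYYMMDD local date from a microsecond timestamp."""
    if isinstance(tz_info, str) and tz_info.lstrip("+-").isdigit():
        tz = timezone(timedelta(hours=int(tz_info)))   # "+8", "-4"
    elif isinstance(tz_info, int):
        tz = timezone(timedelta(hours=tz_info))        # 8, -4
    else:
        tz = ZoneInfo(tz_info)                         # "Asia/Singapore"
    dt = datetime.fromtimestamp(ts_us / 1_000_000, tz)
    return dt.year * 10000 + dt.month * 100 + dt.day
```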

Example

from liberal_alpha.client import LiberalAlphaClient

client = LiberalAlphaClient(
    private_key="0xYOUR_PRIVATE_KEY",
    # api_base defaults to https://api.librealpha.com
)

df = client.download_data(
    record_id=24,
    symbols=[],                   # empty => auto fetch all symbols
    dates=[],                     # empty => no date filter
    tz_info="Asia/Singapore",     # or tz_info=8
)

print(df.head())
print("rows:", len(df))
