
WaveAssist SDK & CLI 🌊

WaveAssist SDK makes it simple to store and retrieve data in your automation workflows. Access your projects through our Python SDK or CLI.


✨ Features

  • 🔐 One-line init() to connect with your WaveAssist project
  • ⚙️ Automatically works on local and WaveAssist Cloud (worker) environments
  • 📦 Store and retrieve data (DataFrames, JSON, strings)
  • 🧠 LLM-friendly function names (init, store_data, fetch_data)
  • 📁 Auto-serialization for common Python objects
  • 🤖 LLM integration with structured outputs via Instructor and OpenRouter
  • 💳 Credit management and automatic email notifications
  • 🖥️ Command-line interface for project management
  • ✅ Built for automation workflows, cron jobs, and AI pipelines

🚀 Getting Started

1. Install

pip install waveassist

2. Initialize the SDK

import waveassist

# Option 1: Use no arguments (recommended)
waveassist.init()

# Option 2: With explicit parameters
waveassist.init(
    token="your-user-id",
    project_key="your-project-key",
    environment_key="your-env-key",  # optional
    run_id="run-123",  # optional
    check_credits=True  # optional: raises error if credits_available is "0"
)

# Will auto-resolve from:
# 1. Explicit args (if passed)
# 2. .env file (uid, project_key, environment_key)
# 3. Worker-injected credentials (on WaveAssist Cloud: https://waveassist.io)
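The three-step fallback above can be sketched as a simple precedence chain (illustrative only, not the SDK's internal code):

```python
def resolve_credential(explicit, env_value, worker_value):
    """Return the first available value: explicit argument,
    then .env file, then worker-injected credential."""
    for candidate in (explicit, env_value, worker_value):
        if candidate:
            return candidate
    return None

# Explicit arguments win over everything else.
print(resolve_credential("arg-token", "env-token", "worker-token"))  # arg-token
# With no explicit argument, the .env value is used.
print(resolve_credential(None, "env-token", "worker-token"))  # env-token
```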

🛠 Setting up .env (for local runs)

uid=your-user-id
project_key=your-project-key

# optional
environment_key=your-env-key

This file will be ignored by Git if you use our default .gitignore.
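For illustration, a .env file in this format is just key=value lines; a minimal parser sketch (not the SDK's actual loader) looks like:

```python
def parse_env(text):
    """Parse key=value lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env

sample = """uid=your-user-id
project_key=your-project-key

# optional
environment_key=your-env-key
"""
print(parse_env(sample)["uid"])  # your-user-id
```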


3. Store Data

🧾 Store a string

waveassist.store_data("welcome_message", "Hello, world!")

📊 Store a DataFrame

import pandas as pd

df = pd.DataFrame({"name": ["Alice", "Bob"], "score": [95, 88]})
waveassist.store_data("user_scores", df)

🧠 Store JSON/dict/array

profile = {"name": "Alice", "age": 30}
waveassist.store_data("profile_data", profile)

4. Fetch Data

result = waveassist.fetch_data("user_scores")

# Will return:
# - A DataFrame (if stored as one)
# - A dict/list (if stored as JSON)
# - A string (if stored as text)

5. Check Credits and Notify

Check your OpenRouter credit balance and automatically send an email notification when the available credits are insufficient:

# Check if you have enough credits for an operation
has_credits = waveassist.check_credits_and_notify(
    required_credits=10.5,
    assistant_name="WavePredict"
)

if has_credits:
    # Proceed with your operation
    print("Credits available, proceeding...")
else:
    # Credits insufficient - email notification sent (max 3 times)
    print("Insufficient credits, operation skipped")

Features:

  • Automatically checks OpenRouter credit balance
  • Sends email notification if credits are insufficient (max 3 times)
  • Resets notification count when credits become sufficient
  • Stores credit availability status for workflow control
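The notify-at-most-three-times behavior can be sketched as a small state machine (an illustration of the behavior described above, not the SDK's implementation):

```python
MAX_NOTIFICATIONS = 3

def credit_check_step(available, required, sent_count):
    """Return (has_credits, should_email, new_sent_count)."""
    if available >= required:
        return True, False, 0               # sufficient: reset the counter
    if sent_count < MAX_NOTIFICATIONS:
        return False, True, sent_count + 1  # insufficient: email, up to 3 times
    return False, False, sent_count         # insufficient but already notified 3x

# Three insufficient checks send three emails; the fourth stays silent.
count = 0
for _ in range(4):
    _, emailed, count = credit_check_step(available=2.0, required=10.5, sent_count=count)
    print(emailed)  # True, True, True, False
```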

6. Call LLM with Structured Outputs

Use the Instructor library to get structured responses from LLMs via OpenRouter:

from pydantic import BaseModel

# Define your response structure
class UserInfo(BaseModel):
    name: str
    age: int
    email: str

# Call LLM with structured output
result = waveassist.call_llm(
    model="gpt-4o",
    prompt="Extract user info: John Doe, 30, john@example.com",
    response_model=UserInfo
)

print(result.name)  # "John Doe"
print(result.age)    # 30
print(result.email)  # "john@example.com"

Setup:

  1. Store your OpenRouter API key:
     waveassist.store_data('open_router_key', 'your_openrouter_api_key')
  2. Use call_llm() with any Pydantic model for structured outputs

Advanced Usage:

result = waveassist.call_llm(
    model="anthropic/claude-3-opus",
    prompt="Analyze this data...",
    response_model=MyModel,
    max_tokens=3000,
    extra_body={"web_search_options": {"search_context_size": "medium"}}
)
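Under the hood, structured output amounts to validating the model's JSON reply against your Pydantic schema; the validation step can be seen in isolation (a sketch assuming Pydantic v2):

```python
from pydantic import BaseModel

class UserInfo(BaseModel):
    name: str
    age: int
    email: str

# What the LLM is asked to produce: JSON matching the schema above.
raw_reply = '{"name": "John Doe", "age": 30, "email": "john@example.com"}'
user = UserInfo.model_validate_json(raw_reply)
print(user.age)  # 30
```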

🖥️ Command Line Interface

The WaveAssist CLI comes bundled with the Python package. After installation, you can use the following commands:

🔑 Authentication

waveassist login

This will open your browser for authentication and store the token locally.

📤 Push Code

waveassist push PROJECT_KEY [--force]

Push your local Python code to a WaveAssist project.

📥 Pull Code

waveassist pull PROJECT_KEY [--force]

Pull Python code from a WaveAssist project to your local machine.

ℹ️ Version Info

waveassist version

Display CLI version and environment information.


🧪 Running Tests

If you’re not using pytest, just run the test script directly:

python tests/run_tests.py

✅ Includes tests for:

  • String roundtrip
  • JSON/dict roundtrip
  • DataFrame roundtrip
  • Error if init() is not called

🛠 Project Structure

WaveAssist/
├── waveassist/
│   ├── __init__.py          # init(), store_data(), fetch_data(), check_credits_and_notify(), call_llm()
│   ├── _config.py           # Global config vars
│   ├── constants.py         # Constants and email templates
│   ├── utils.py             # API utility functions
│   └── ...
├── tests/
│   └── run_tests.py         # Manual test runner

📌 Notes

  • Data is stored in your WaveAssist backend (e.g. MongoDB) as serialized content
  • store_data() auto-detects the object type and serializes it (CSV/JSON/text)
  • fetch_data() deserializes it back to the right Python object
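As an illustration of that detect-and-serialize idea (a simplified stand-in, not the SDK's storage code):

```python
import json

def serialize(value):
    """Pick a format from the object's type, as store_data() does conceptually."""
    if isinstance(value, str):
        return "text", value
    return "json", json.dumps(value)

def deserialize(kind, payload):
    """Invert serialize(), as fetch_data() does conceptually."""
    return payload if kind == "text" else json.loads(payload)

kind, payload = serialize({"name": "Alice", "age": 30})
print(kind)                        # json
print(deserialize(kind, payload))  # {'name': 'Alice', 'age': 30}
```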

🧠 Example Use Cases

Basic Data Storage

import waveassist
waveassist.init()  # Auto-initialized from .env or worker

# Store GitHub PR data
waveassist.store_data("latest_pr", {
    "title": "Fix bug in auth",
    "author": "alice",
    "status": "open"
})

# Later, fetch it for further processing
pr = waveassist.fetch_data("latest_pr")
print(pr["title"])

LLM Integration with Credit Management

import waveassist
from pydantic import BaseModel

waveassist.init()

# Store OpenRouter API key
waveassist.store_data('open_router_key', 'your_api_key')

# Check credits before expensive operation
required_credits = 5.0
if waveassist.check_credits_and_notify(required_credits, "MyAssistant"):
    # Use LLM with structured output
    class AnalysisResult(BaseModel):
        summary: str
        confidence: float
        recommendations: list[str]

    result = waveassist.call_llm(
        model="gpt-4o",
        prompt="Analyze this data and provide recommendations...",
        response_model=AnalysisResult
    )

    # Store the structured result
    waveassist.store_data("analysis_result", result.dict())

🤝 Contributing

Want to add formats, features, or cloud extensions? PRs welcome!


📬 Contact

Need help or have feedback? Reach out at connect@waveassist.io, visit WaveAssist.io, or open an issue.


© 2025 WaveAssist
