
Infra for AI Companions


Emotion Machine Python SDK

Official Python helper for the Emotion Machine Companion API. It wraps the /v1 endpoints documented in docs/client-companion-api-plan.md so you can provision companions, ingest knowledge, and chat/stream with them in just a few lines of code.

Installation

pip install emotion-machine

The client depends on httpx and targets Python 3.9+.

Quickstart

  1. Export your project API key (project-scoped):
export EM_API_KEY="emk_prod_...."
export EM_API_BASE_URL="https://api.emotionmachine.ai"  # or http://localhost:8100 for local dev
  2. Bootstrap a companion, ingest curated knowledge, and chat:
from emotion_machine import EmotionMachine

client = EmotionMachine()  # reads EM_API_KEY / EM_API_BASE_URL

# Create a fresh companion
companion = client.create_companion(
    name="Luteal Support Coach",
    description="Helps users track luteal phase cravings",
    config={
        "system_prompt": {
            "full_system_prompt": "You are an encouraging health coach."
        }
    },
)
companion_id = companion["id"]

# Optionally shape the profile schema for per-user traits
client.upsert_profile_schema(
    companion_id,
    schema={
        "type": "object",
        "properties": {
            "craving_intensity": {"type": "integer", "minimum": 0, "maximum": 5}
        },
    },
)
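As a quick offline sanity check, here is what the schema above enforces for a per-user trait record, written as a plain-Python sketch (the server-side validation is authoritative; this stand-in only mirrors the `craving_intensity` constraint):

```python
# Plain-Python illustration of the constraint the schema above encodes:
# craving_intensity, when present, must be an integer between 0 and 5 inclusive.
SCHEMA = {
    "type": "object",
    "properties": {
        "craving_intensity": {"type": "integer", "minimum": 0, "maximum": 5}
    },
}

def craving_intensity_ok(profile: dict) -> bool:
    rule = SCHEMA["properties"]["craving_intensity"]
    value = profile.get("craving_intensity")
    if value is None:
        return True  # property is optional unless listed under "required"
    return (
        isinstance(value, int)
        and not isinstance(value, bool)  # JSON Schema "integer" rejects booleans
        and rule["minimum"] <= value <= rule["maximum"]
    )

print(craving_intensity_ok({"craving_intensity": 3}))  # True
print(craving_intensity_ok({"craving_intensity": 9}))  # False
```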

# Ingest curated luteal-phase knowledge via the built-in key
job = client.ingest_knowledge(
    companion_id,
    payload_type="json",
    key="data_id_x",
)
job_result = client.wait_for_job(job["id"], timeout=20)
assert job_result["status"] == "succeeded", job_result

# Or upload & ingest your own JSON/Markdown/TXT file in one step.
# The helper uploads the file, kicks off ingestion, waits for completion,
# and raises if the job fails.
result = client.ingest_file(
    companion_id,
    file_path="important_app_related_knowledge.jsonl",
    payload_type="json",
)
print(result["job"]["status"])  # -> "succeeded"

# Run a synchronous chat completion
completion = client.chat_completion(
    companion_id,
    message="Hi! I'm feeling intense salt cravings today, what should I know?",
    external_user_id="user-123",
)

print(completion["choices"][0]["message"]["content"])

# Stream responses (Server-Sent Events) and collect message chunks
stream = client.chat_stream(
    companion_id,
    message="Can you summarise key luteal phase symptoms?",
    external_user_id="user-123",
)
for event in stream:
    if event["event"] == "delta":
        chunk = event["data"]["choices"][0]["delta"].get("content", "")
        if chunk:
            print(chunk, end="", flush=True)
    elif event["event"] == "done":
        conversation_id = event["data"]["conversation_id"]
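The delta/done handling above can be exercised without a live stream. The sketch below accumulates content chunks from events shaped like the ones the loop consumes (event shapes taken from the loop above; the sample data is invented):

```python
def collect_stream(events):
    """Accumulate delta chunks and capture the conversation id from `done`."""
    parts, conversation_id = [], None
    for event in events:
        if event["event"] == "delta":
            chunk = event["data"]["choices"][0]["delta"].get("content", "")
            if chunk:
                parts.append(chunk)
        elif event["event"] == "done":
            conversation_id = event["data"]["conversation_id"]
    return "".join(parts), conversation_id

# Invented sample events mimicking the stream's shape.
sample = [
    {"event": "delta", "data": {"choices": [{"delta": {"content": "Key "}}]}},
    {"event": "delta", "data": {"choices": [{"delta": {"content": "symptoms..."}}]}},
    {"event": "done", "data": {"conversation_id": "conv-1"}},
]
text, conv_id = collect_stream(sample)
print(text)     # Key symptoms...
print(conv_id)  # conv-1
```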

# Retrieve the full conversation transcript
transcript = client.get_conversation(conversation_id)
for message in transcript["messages"]:
    print(f"{message['role']}: {message['content']}")

# List the most recent conversations for a specific tester/user cohort
recent_sessions = client.list_conversations(
    companion_id,
    limit=25,
    external_user_id="user-123",
)
print(f"Found {len(recent_sessions)} saved sessions for user-123")

# Filter options include `external_user_prefix="beta-"` for cohort-level filtering.
  3. Tidy up when finished:
client.close()

Alternatively, use `with EmotionMachine() as client:` to auto-close the HTTP session when the block exits.
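The close-on-exit contract can be illustrated with a stand-in class (the real client closes its underlying httpx session; this stub just records that close() was called):

```python
class StubClient:
    """Minimal stand-in showing the context-manager close-on-exit contract."""
    def __init__(self):
        self.closed = False

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        self.close()
        return False  # don't swallow exceptions raised inside the block

    def close(self):
        self.closed = True

with StubClient() as client:
    assert not client.closed  # still open inside the block
print(client.closed)  # True
```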

Knowledge management tips

  • client.list_knowledge_assets(companion_id) shows the latest uploads plus their statuses (ready, superseded, etc.).
  • Uploading a file with the same filename automatically replaces older ingestions for that companion’s vector store—no manual delete required.
  • client.search_knowledge(..., mode="semantic" | "keyword" | "hybrid") lets you compare retrieval strategies against the same dataset.
  • client.wait_for_job() and client.ingest_file() raise KnowledgeJobFailed if OpenAI reports the ingestion job as failed, so you can catch mistakes early in CI.

API Coverage

Resource        Method                                   SDK helper
Companions      GET /v1/companions                       client.list_companions()
Companions      POST /v1/companions                      client.create_companion(...)
Companions      GET /v1/companions/{id}                  client.get_companion(id)
Companions      PATCH /v1/companions/{id}                client.update_companion(...)
Profile Schema  PUT /v1/companions/{id}/profile-schema   client.upsert_profile_schema(...)
Profile Schema  GET /v1/companions/{id}/profile-schema   client.get_profile_schema(...)
Knowledge       POST /v1/companions/{id}/knowledge       client.ingest_knowledge(...)
Knowledge       GET /v1/knowledge-jobs/{job_id}          client.knowledge.get_job(job_id)
Chat            POST /v1/companions/{id}/chat            client.chat_completion(...)
Chat (stream)   POST /v1/companions/{id}/chat/stream     client.chat_stream(...)
Conversations   GET /v1/companions/{id}/conversations    client.list_conversations(...)
Conversations   GET /v1/conversations/{conversation_id}  client.get_conversation(...)

All helpers raise emotion_machine.APIError on non-success HTTP codes. Inspect e.status_code and e.payload for diagnostics.
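A sketch of that error-handling pattern, with APIError stubbed here using the two documented attributes (the real class lives in emotion_machine; the payload shape in the example is illustrative, not a documented format):

```python
class APIError(Exception):
    """Stand-in for emotion_machine.APIError with the documented attributes."""
    def __init__(self, status_code, payload):
        super().__init__(f"API request failed with HTTP {status_code}")
        self.status_code = status_code
        self.payload = payload

def describe_failure(call):
    """Run an API call and turn a failure into a readable diagnostic string."""
    try:
        return call()
    except APIError as e:
        detail = e.payload.get("error", {}).get("message", "<no detail>")
        return f"HTTP {e.status_code}: {detail}"

def doomed_call():
    # Simulated failure with an invented payload shape.
    raise APIError(404, {"error": {"message": "companion not found"}})

print(describe_failure(doomed_call))  # HTTP 404: companion not found
```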

Development

cd packages/pip-emotion-machine
pip install -e ".[dev]"

The package ships from src/emotion_machine. Update pyproject.toml to bump versions.
