Supermemory OpenAI Python SDK

Memory tools for OpenAI function calling with Supermemory integration.

This package provides Supermemory-backed memory management tools for function calling with the official OpenAI Python SDK.

Installation

Install using uv (recommended):

uv add supermemory-openai-sdk

Or with pip:

pip install supermemory-openai-sdk
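
To verify the install, note that the import name differs from the distribution name:

python -c "import supermemory_openai"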

Quick Start

Using Memory Tools with OpenAI

import asyncio
import openai
from supermemory_openai import SupermemoryTools, execute_memory_tool_calls

async def main():
    # Initialize OpenAI client
    client = openai.AsyncOpenAI(api_key="your-openai-api-key")
    
    # Initialize Supermemory tools
    tools = SupermemoryTools(
        api_key="your-supermemory-api-key",
        config={"project_id": "my-project"}
    )
    
    # Chat with memory tools
    response = await client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {
                "role": "system",
                "content": "You are a helpful assistant with access to user memories."
            },
            {
                "role": "user", 
                "content": "Remember that I prefer tea over coffee"
            }
        ],
        tools=tools.get_tool_definitions()
    )
    
    # Handle tool calls if present
    if response.choices[0].message.tool_calls:
        tool_results = await execute_memory_tool_calls(
            api_key="your-supermemory-api-key",
            tool_calls=response.choices[0].message.tool_calls,
            config={"project_id": "my-project"}
        )
        print("Tool results:", tool_results)
    
    print(response.choices[0].message.content)

asyncio.run(main())
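
The example above only prints the raw tool results; the assistant's own reply is usually empty when it decides to call a tool. A common follow-up, sketched below as a replacement for the tool-call branch inside main(), is to send the tool results back so the model can answer with the stored memory in mind. This sketch assumes, as the Function Calling Integration section below shows, that execute_memory_tool_calls returns OpenAI-style tool messages:

# Inside main(): feed the tool results back for a final reply
if response.choices[0].message.tool_calls:
    tool_results = await execute_memory_tool_calls(
        api_key="your-supermemory-api-key",
        tool_calls=response.choices[0].message.tool_calls,
        config={"project_id": "my-project"}
    )

    followup_messages = [
        {"role": "user", "content": "Remember that I prefer tea over coffee"},
        response.choices[0].message,  # assistant message containing the tool calls
        *tool_results,                # tool messages from execute_memory_tool_calls
    ]
    final = await client.chat.completions.create(
        model="gpt-4o",
        messages=followup_messages,
        tools=tools.get_tool_definitions()
    )
    print(final.choices[0].message.content)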

Configuration

Memory Tools

SupermemoryTools Class

from supermemory_openai import SupermemoryTools

tools = SupermemoryTools(
    api_key="your-supermemory-api-key",
    config={
        "project_id": "my-project",  # or use container_tags
        "base_url": "https://custom-endpoint.com",  # optional
    }
)

# Search memories
result = await tools.search_memories(
    information_to_get="user preferences",
    limit=10,
    include_full_docs=True
)

# Add memory  
result = await tools.add_memory(
    memory="User prefers tea over coffee"
)

# Fetch specific memory
result = await tools.fetch_memory(
    memory_id="memory-id-here"
)
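
The constructor comment above mentions container_tags as an alternative to project_id. A minimal sketch of that variant follows; the exact key name and value shape are an assumption based on that comment, not a documented signature:

from supermemory_openai import SupermemoryTools

# Hypothetical: scope memories by container tags instead of a project ID.
tools = SupermemoryTools(
    api_key="your-supermemory-api-key",
    config={"container_tags": ["user-123"]}  # assumed key and shape
)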

Individual Tools

from supermemory_openai import (
    create_search_memories_tool,
    create_add_memory_tool, 
    create_fetch_memory_tool
)

search_tool = create_search_memories_tool("your-api-key")
add_tool = create_add_memory_tool("your-api-key")
fetch_tool = create_fetch_memory_tool("your-api-key")

Function Calling Integration

from supermemory_openai import execute_memory_tool_calls

# After getting tool calls from OpenAI
if response.choices[0].message.tool_calls:
    tool_results = await execute_memory_tool_calls(
        api_key="your-supermemory-api-key",
        tool_calls=response.choices[0].message.tool_calls,
        config={"project_id": "my-project"}
    )
    
    # Add tool results to conversation
    messages.append(response.choices[0].message)
    messages.extend(tool_results)

API Reference

SupermemoryTools

Memory management tools for function calling.

Constructor

SupermemoryTools(
    api_key: str,
    config: Optional[SupermemoryToolsConfig] = None
)

Methods

  • get_tool_definitions() - Get OpenAI function definitions
  • search_memories() - Search user memories
  • add_memory() - Add new memory
  • fetch_memory() - Fetch specific memory by ID
  • execute_tool_call() - Execute individual tool call

Error Handling

try:
    response = await client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello"}]
    )
except openai.APIError as e:
    print(f"OpenAI API error: {e}")
except Exception as e:
    print(f"Unexpected error: {e}")

Environment Variables

Set these environment variables for testing:

  • SUPERMEMORY_API_KEY - Your Supermemory API key
  • OPENAI_API_KEY - Your OpenAI API key
  • MODEL_NAME - Model to use (default: "gpt-4o-mini")
  • SUPERMEMORY_BASE_URL - Custom Supermemory base URL (optional)
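
A minimal sketch of wiring these variables into the clients shown earlier (standard library os only; the optional base_url handling mirrors the constructor example above):

import os

import openai
from supermemory_openai import SupermemoryTools

client = openai.AsyncOpenAI(api_key=os.environ["OPENAI_API_KEY"])
model_name = os.environ.get("MODEL_NAME", "gpt-4o-mini")

config = {}
if os.environ.get("SUPERMEMORY_BASE_URL"):
    config["base_url"] = os.environ["SUPERMEMORY_BASE_URL"]

tools = SupermemoryTools(
    api_key=os.environ["SUPERMEMORY_API_KEY"],
    config=config or None  # config is optional per the constructor above
)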

Development

Setup

# Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh

# Clone and setup
git clone <repository-url>
cd packages/openai-sdk-python
uv sync --dev

Testing

# Run tests
uv run pytest

# Run with coverage
uv run pytest --cov=supermemory_openai

# Run specific test file
uv run pytest tests/test_infinite_chat.py

Type Checking

uv run mypy src/supermemory_openai

Formatting

uv run black src/ tests/
uv run isort src/ tests/

License

MIT License - see LICENSE file for details.

