An experimental SDK for using Letta subagents for pluggable memory management

Project description

Memory SDK

An experimental SDK for using Letta agents for long-term memory and learning in a pluggable way. "Subconscious" Letta agents learn from data such as conversational interactions, files, and other text content to generate learned context blocks that you can plug into your agent's system prompt - a form of "system prompt learning".

+========================================+
|         SYSTEM PROMPT                  |
+========================================+
|    LEARNED CONTEXT (USER)              | <- Subconscious Agent (learning from message history)
+========================================+
|    LEARNED CONTEXT (FILES)             | <- Subconscious Agent (learning from files) 
+========================================+
|           MESSAGES                     |
|  * User -> Assistant                   |
|  * User -> Assistant                   |
|  * User -> Assistant                   |
|  * ...                                 |
+========================================+

Quickstart

  1. Create an API Key
  2. Install: pip install letta-memory

Usage: Conversational Memory

You can save conversation histories with the Memory SDK and later retrieve the learned context block to place into your system prompt. This allows your agents to develop an evolving understanding of the user. Example: create a basic OpenAI gpt-4o-mini chat agent with memory

from openai import OpenAI
from memory import Memory

openai_client = OpenAI()
memory = Memory()

def chat_with_memories(message: str, user_id: str = "default_user") -> str:

    # initialize the user's memory on first use
    if not memory.get_user_memory(user_id):
        memory.initialize_user_memory(user_id, reset=True)

    # get the user memory, formatted for inclusion in a prompt
    user_memory_prompt = memory.get_user_memory(user_id, prompt_formatted=True)

    # generate the assistant response
    system_prompt = "<system>You are a helpful AI assistant</system>"
    system_prompt += f"\n{user_memory_prompt}"
    messages = [{"role": "system", "content": system_prompt}, {"role": "user", "content": message}]
    response = openai_client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    assistant_response = response.choices[0].message.content

    # create new memories from the conversation
    messages.append({"role": "assistant", "content": assistant_response})
    memory.add_messages(user_id, messages)

    return assistant_response

def main():
    print("Chat with AI (type 'exit' to quit)")
    while True:
        user_input = input("You: ").strip()
        if user_input.lower() == 'exit':
            print("Goodbye!")
            break
        print(f"AI: {chat_with_memories(user_input)}")

if __name__ == "__main__":
    main()

The memory will include a conversation summary and a user memory block that you can place into your system prompt.

<conversation_summary>
Sarah introduced herself and asked the assistant to tell about itself. The assistant provided a brief self-description and offered further help.
</conversation_summary>

<human description="Details about the human user you are speaking to.">
Name: Sarah
Interests: Likes cats (2025-09-03)
</human>

You can customize the prompt format by retrieving the raw summary or user block string with prompt_formatted=False.
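For example, a minimal sketch of custom formatting - here the raw_block string is a stub standing in for whatever get_user_memory(user_id, prompt_formatted=False) would return, and the wrapper tag name is an arbitrary choice:

```python
# Stub value standing in for the raw user block string that
# memory.get_user_memory(user_id, prompt_formatted=False) might return.
raw_block = "Name: Sarah\nInterests: Likes cats (2025-09-03)"

def format_user_block(raw: str, tag: str = "user_profile") -> str:
    """Wrap a raw memory block in a custom XML-style tag of your choosing."""
    return f"<{tag}>\n{raw}\n</{tag}>"

# Assemble a system prompt using the custom format instead of the default one
system_prompt = "<system>You are a helpful AI assistant</system>\n"
system_prompt += format_user_block(raw_block)
print(system_prompt)
```

This lets you control exactly how learned context appears in your system prompt rather than relying on the SDK's default tags.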

Roadmap

  • Learning from files
  • Query historical messages
  • Save messages as archival memories
  • Query archival memory
  • Add "sleep" (offline collective revisioning of all data)

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

letta_memory-0.1.0.tar.gz (89.1 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

letta_memory-0.1.0-py3-none-any.whl (96.0 kB)

Uploaded Python 3

File details

Details for the file letta_memory-0.1.0.tar.gz.

File metadata

  • Download URL: letta_memory-0.1.0.tar.gz
  • Upload date:
  • Size: 89.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.12

File hashes

Hashes for letta_memory-0.1.0.tar.gz
Algorithm Hash digest
SHA256 690e7ce033e44c0c155162a30991f80756864282580dfd270bc2ea4cf9519cc7
MD5 c36e34e6a1f1a9fd6a4eb4200f071fd3
BLAKE2b-256 8062832a7162e97b44f9ee88ee4641a34c24f1fe746d864be48752b0f9a2079a

See more details on using hashes here.

File details

Details for the file letta_memory-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: letta_memory-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 96.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.12

File hashes

Hashes for letta_memory-0.1.0-py3-none-any.whl
Algorithm Hash digest
SHA256 62da4fca360ef24553eecf6c7e7fe714058681a7f06dd5a19c62292681c9ed47
MD5 a59db8515cac0f709c321482a95a2ee0
BLAKE2b-256 5f130576e6da4732321fbf80a58da3737c44a6893b8e1371c7060bd1a22b7d63

See more details on using hashes here.
