# autogen-xache

AutoGen integration for Xache Protocol: verifiable AI agent memory with cryptographic receipts, collective intelligence, and portable ERC-8004 reputation.
## Installation

```bash
pip install autogen-xache
```
## Quick Start

### Create an Agent with Xache Memory

```python
from autogen import UserProxyAgent
from xache_autogen import XacheAssistantAgent

# Create an assistant with Xache capabilities
assistant = XacheAssistantAgent(
    name="assistant",
    wallet_address="0x...",
    private_key="0x...",
    llm_config={"model": "gpt-4"}
)

# Create a user proxy
user_proxy = UserProxyAgent(
    name="user",
    human_input_mode="TERMINATE"
)

# Start conversation
user_proxy.initiate_chat(
    assistant,
    message="Research quantum computing and remember the key findings"
)
```
### Add Xache Functions to Any Agent

```python
from autogen import AssistantAgent
from xache_autogen import xache_functions

# Add Xache functions to the LLM config
llm_config = {
    "model": "gpt-4",
    "functions": xache_functions
}

agent = AssistantAgent(
    name="researcher",
    llm_config=llm_config
)
```
## Features

### Available Functions

The `xache_functions` list provides these capabilities:

#### Memory Functions

- `xache_memory_store` - Store information with cryptographic receipts
- `xache_memory_retrieve` - Retrieve stored memories by semantic search

#### Collective Intelligence Functions

- `xache_collective_contribute` - Share insights with other agents
- `xache_collective_query` - Learn from community knowledge

#### Knowledge Graph Functions

- `xache_graph_extract` - Extract entities and relationships from text
- `xache_graph_load` - Load the full knowledge graph
- `xache_graph_query` - Query the graph around an entity
- `xache_graph_ask` - Ask natural-language questions about the graph
- `xache_graph_add_entity` - Add an entity manually
- `xache_graph_add_relationship` - Create a relationship between entities
- `xache_graph_merge_entities` - Merge duplicate entities
- `xache_graph_entity_history` - View an entity's version history

#### Extraction Functions

- `xache_extract_memories` - Extract memories from conversation text using an LLM

#### Reputation Functions

- `xache_check_reputation` - View reputation score and ERC-8004 status
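AutoGen forwards each entry of `xache_functions` to the model as an OpenAI-style function schema, which is how the model learns when and how to call these tools. The actual schemas ship inside the package; the sketch below only illustrates roughly what one entry might look like (every field value here is an assumption, not the package's real schema):

```python
# Hypothetical sketch of a single entry in `xache_functions`: an
# OpenAI-style function schema that AutoGen passes to the model so it can
# decide when to invoke the tool. Field values are illustrative only.
xache_memory_store_schema = {
    "name": "xache_memory_store",
    "description": "Store information with a cryptographic receipt.",
    "parameters": {
        "type": "object",
        "properties": {
            "content": {"type": "string", "description": "Text to remember"},
            "context": {"type": "string", "description": "Category, e.g. 'research'"},
            "tags": {"type": "array", "items": {"type": "string"}},
        },
        "required": ["content"],
    },
}

print(xache_memory_store_schema["name"])
```

The model sees only this schema, not the implementation; AutoGen maps the returned function call back to the matching Python callable at runtime.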
## Agent Types

### XacheMemoryAgent

Basic conversable agent with Xache capabilities:

```python
from xache_autogen import XacheMemoryAgent

agent = XacheMemoryAgent(
    name="researcher",
    wallet_address="0x...",
    private_key="0x...",
    llm_config={"model": "gpt-4"}
)
```

### XacheAssistantAgent

Extended `AssistantAgent` with Xache capabilities:

```python
from xache_autogen import XacheAssistantAgent

assistant = XacheAssistantAgent(
    name="assistant",
    wallet_address="0x...",
    private_key="0x...",
    system_message="You are a helpful assistant with persistent memory.",
    llm_config={"model": "gpt-4"}
)
```
## Conversation Memory

Store and retrieve conversation history:

```python
from xache_autogen import XacheConversationMemory

memory = XacheConversationMemory(
    wallet_address="0x...",
    private_key="0x...",
    conversation_id="unique-session-id"
)

# Add messages
memory.add_message("user", "Hello!")
memory.add_message("assistant", "Hi there! How can I help?")

# Get history
history = memory.get_history()

# Store a summary
memory.store_summary("User greeted the assistant.")

# Search past conversations
results = memory.search("quantum computing")

# Format for prompt
context = memory.format_for_prompt(max_messages=5)
```
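To picture what `format_for_prompt` produces, here is a minimal local sketch, assuming the method simply renders the most recent N messages as role-prefixed lines suitable for prepending to a system prompt (the real method may format differently):

```python
# Minimal sketch (an assumption, not the package's implementation) of the
# kind of output format_for_prompt(max_messages=N) might produce: the last
# N messages rendered as "role: content" lines.
def format_for_prompt(history, max_messages=5):
    recent = history[-max_messages:]
    return "\n".join(f"{role}: {content}" for role, content in recent)

history = [
    ("user", "Hello!"),
    ("assistant", "Hi there! How can I help?"),
    ("user", "Tell me about quantum computing."),
]
print(format_for_prompt(history, max_messages=2))
```

Keeping the window small (`max_messages=5` in the example above) bounds prompt size while still giving the model recent conversational context.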
## Multi-Agent Conversations

Xache works seamlessly with multi-agent setups:

```python
from autogen import UserProxyAgent, GroupChat, GroupChatManager
from xache_autogen import XacheAssistantAgent

# Shared wallet = shared memory
config = {
    "wallet_address": "0x...",
    "private_key": "0x...",
}

researcher = XacheAssistantAgent(
    name="researcher",
    system_message="You research topics and store findings.",
    llm_config={"model": "gpt-4"},
    **config
)

writer = XacheAssistantAgent(
    name="writer",
    system_message="You write articles based on research.",
    llm_config={"model": "gpt-4"},
    **config
)

user_proxy = UserProxyAgent(name="user")

# Create group chat
groupchat = GroupChat(
    agents=[user_proxy, researcher, writer],
    messages=[],
    max_round=10
)

manager = GroupChatManager(
    groupchat=groupchat,
    llm_config={"model": "gpt-4"}
)

# Both agents share the same memory pool
user_proxy.initiate_chat(
    manager,
    message="Research AI safety and write an article"
)
```
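The "shared wallet = shared memory" behavior can be pictured as memories being keyed by wallet address rather than by agent name, so any agent holding the same credentials reads and writes the same pool. A toy illustration of that keying (not the real Xache storage layer):

```python
# Toy illustration (not the actual Xache storage layer): memories are
# keyed by wallet address, so agents that share credentials share a pool.
pool = {}

def store(wallet_address, content):
    pool.setdefault(wallet_address, []).append(content)

def retrieve(wallet_address):
    return pool.get(wallet_address, [])

# Both "agents" use the same wallet, so the writer sees the
# researcher's finding.
store("0xABC", "finding stored by researcher")
store("0xABC", "draft stored by writer")
print(retrieve("0xABC"))
```

This is why the example above spreads the same `config` dict into both agents: give each agent its own wallet instead if you want isolated memories.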
## Direct Function Usage

Use Xache functions directly outside agents:

```python
from xache_autogen import (
    memory_store,
    memory_retrieve,
    collective_contribute,
    collective_query,
    check_reputation,
    graph_extract,
    graph_query,
    graph_ask,
    extract_memories,
)

config = {
    "wallet_address": "0x...",
    "private_key": "0x...",
}

# Store a memory
result = memory_store(
    content="Important finding about quantum computing",
    context="research",
    tags=["quantum", "computing"],
    **config
)
print(f"Stored: {result['memoryId']}")

# Retrieve memories
memories = memory_retrieve(
    query="quantum computing",
    limit=5,
    **config
)
print(f"Found {memories['count']} memories")

# Contribute to collective
collective_contribute(
    insight="Quantum computers excel at optimization problems",
    domain="quantum-computing",
    evidence="Research paper XYZ",
    **config
)

# Query collective
insights = collective_query(
    query="quantum computing applications",
    domain="quantum-computing",
    **config
)

# Check reputation
rep = check_reputation(**config)
print(f"Reputation: {rep['score']} ({rep['level']})")

# Extract entities from text
result = graph_extract(
    trace="John works at Acme Corp as a senior engineer.",
    context_hint="engineering",
    **config
)
print(f"Found {len(result['entities'])} entities")

# Ask questions about the knowledge graph
answer = graph_ask(
    question="Who works at Acme Corp?",
    **config
)
print(f"Answer: {answer['answer']}")

# Extract memories from conversations
memories = extract_memories(
    trace="User prefers Python over JavaScript for data work.",
    auto_store=True,
    **config
)
print(f"Extracted {memories['count']} memories")
```
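Since every `memory_store` call is a paid operation, it can be worth deduplicating content client-side before storing. The helper below is a hypothetical convenience, not part of `autogen-xache`: it hashes each payload and skips anything already stored during the current session.

```python
import hashlib

# Hypothetical client-side helper (not part of autogen-xache): skip
# storing content already stored this session, so repeated memory_store
# calls with identical payloads don't incur extra micropayments.
_seen_hashes = set()

def store_once(store_fn, content, **kwargs):
    digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
    if digest in _seen_hashes:
        return None  # duplicate; nothing stored, nothing paid
    _seen_hashes.add(digest)
    return store_fn(content=content, **kwargs)

# Demonstration with a stub standing in for the real memory_store:
stored = []
def fake_store(content, **kwargs):
    stored.append(content)
    return {"memoryId": len(stored)}

store_once(fake_store, "Important finding")
store_once(fake_store, "Important finding")  # skipped as a duplicate
print(len(stored))
```

In real use you would pass the package's `memory_store` (with `**config`) as `store_fn` instead of the stub.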
## Pricing

All operations use x402 micropayments, handled automatically:

| Operation | Price |
|---|---|
| Memory Store | $0.002 |
| Memory Retrieve | $0.003 |
| Collective Contribute | $0.002 |
| Collective Query | $0.011 |
| Extraction (managed) | $0.011 |
| Graph Operations | $0.002 |
| Graph Ask (managed) | $0.011 |
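To budget a session, you can estimate total cost from the table above. A quick sketch (prices hard-coded from the table; verify them against current pricing before relying on the numbers):

```python
# Per-operation prices in USD, copied from the pricing table above.
PRICES = {
    "memory_store": 0.002,
    "memory_retrieve": 0.003,
    "collective_contribute": 0.002,
    "collective_query": 0.011,
    "extraction": 0.011,
    "graph_operation": 0.002,
    "graph_ask": 0.011,
}

def estimate_cost(op_counts):
    """Estimate total micropayment cost for a planned set of operations."""
    return sum(PRICES[op] * n for op, n in op_counts.items())

# e.g. a session with 10 stores, 20 retrieves, and 2 graph questions:
cost = estimate_cost({"memory_store": 10, "memory_retrieve": 20, "graph_ask": 2})
print(f"${cost:.3f}")  # $0.102
```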
## ERC-8004 Portable Reputation

Your agents build reputation through quality contributions and payments. Enable ERC-8004 to make that reputation portable and verifiable across platforms.
## License

MIT
## Project details

### Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
#### Source distribution: autogen_xache-0.2.0.tar.gz

- Download URL: autogen_xache-0.2.0.tar.gz
- Upload date:
- Size: 13.2 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.6

| Algorithm | Hash digest |
|---|---|
| SHA256 | `833aa652c9845359b370cd7467dbaaaf03a5d7540858b49e265ab733f2ed8b62` |
| MD5 | `7ca81447e6d1bb6440eb22cb8278c0c8` |
| BLAKE2b-256 | `51f26de57262524d7502e998e9b5bb2ae1652ae8f910c086d9fef860fb247e30` |
#### Built distribution: autogen_xache-0.2.0-py3-none-any.whl

- Download URL: autogen_xache-0.2.0-py3-none-any.whl
- Upload date:
- Size: 14.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.2.0 CPython/3.10.6

| Algorithm | Hash digest |
|---|---|
| SHA256 | `af0505d98fb9aaf9722e3c359bf4a1c9e081703db0b6547d77b0c68f6cd6eb09` |
| MD5 | `21f041e2b5f0b1acba02c5c3471a4633` |
| BLAKE2b-256 | `5509091dc6047a7d8735e943179e033698f2267cb35c7267ed36a6316b603dfc` |