# uAgents Plugin
A universal plugin that connects any AI agent framework (LlamaIndex, LangChain, LangGraph, CrewAI, or custom) with the uAgents ecosystem, enabling seamless communication within the Fetch.ai agent network.
## 🌟 Features
- Framework Agnostic: Works with LlamaIndex, LangChain, LangGraph, CrewAI, or any custom agent implementation
- Simple Integration: Just write a function - that's it!
- Session Management: Built-in support for session and user context tracking
- Agentverse Compatible: Optional integration with Fetch.ai's Agentverse platform
- Flexible Configuration: Use with or without authentication tokens and context management
- Async-First: Built on modern async/await patterns for optimal performance
## 📦 Installation

```bash
pip install uagents uagents-adapter
```

Then install your preferred framework(s):

```bash
# For LlamaIndex
pip install llama-index

# For LangChain
pip install langchain

# For LangGraph
pip install langgraph

# For CrewAI
pip install crewai
```
## 🚀 Quick Start

The plugin works with any framework. Here's a minimal example:

```python
from uagents import Agent
from uagents_adapter import uAgentsPlugin

async def my_agent_function(query: str, session_id: str, user_id: str) -> str:
    # Your agent logic here (LlamaIndex, LangChain, CrewAI, custom, etc.)
    response = "Your agent's response"
    return response

plugin = uAgentsPlugin(my_agent_function)

agent = Agent(
    name="my_agent",
    seed="your_secure_seed_phrase",
    port=8000,
    mailbox=True,
)

agent.include(plugin.protocol, publish_manifest=True)
plugin.run(agent)
```
## 🎯 Framework Examples

### LlamaIndex Example

```python
from uagents import Agent
from uagents_adapter import uAgentsPlugin
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI

# Initialize your LlamaIndex query engine
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine(llm=OpenAI(model="gpt-4"))

async def llamaindex_handler(query: str, session_id: str, user_id: str) -> str:
    response = query_engine.query(query)
    return str(response)

plugin = uAgentsPlugin(
    llamaindex_handler,
    description="LlamaIndex RAG agent",
)

agent = Agent(name="llamaindex_agent", seed="seed_phrase", port=8000, mailbox=True)
agent.include(plugin.protocol, publish_manifest=True)
plugin.run(agent)
```
### LangChain Example

```python
from uagents import Agent
from uagents_adapter import uAgentsPlugin
from langchain_openai import ChatOpenAI
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferMemory

# Initialize your LangChain agent
llm = ChatOpenAI(model="gpt-4")
memory = ConversationBufferMemory()
conversation = ConversationChain(llm=llm, memory=memory)

async def langchain_handler(query: str, session_id: str, user_id: str) -> str:
    response = conversation.predict(input=query)
    return response

plugin = uAgentsPlugin(
    langchain_handler,
    description="LangChain conversational agent",
)

agent = Agent(name="langchain_agent", seed="seed_phrase", port=8000, mailbox=True)
agent.include(plugin.protocol, publish_manifest=True)
plugin.run(agent)
```
### CrewAI Example

```python
from uagents import Agent
from uagents_adapter import uAgentsPlugin
from crewai import Agent as CrewAgent, Task, Crew

# Initialize your CrewAI setup
researcher = CrewAgent(
    role='Researcher',
    goal='Research and provide accurate information',
    backstory='Expert researcher',
    verbose=True,
)

async def crewai_handler(query: str, session_id: str, user_id: str) -> str:
    task = Task(description=query, agent=researcher)
    crew = Crew(agents=[researcher], tasks=[task])
    result = crew.kickoff()
    return str(result)

plugin = uAgentsPlugin(
    crewai_handler,
    description="CrewAI research agent",
)

agent = Agent(name="crewai_agent", seed="seed_phrase", port=8000, mailbox=True)
agent.include(plugin.protocol, publish_manifest=True)
plugin.run(agent)
```
### Custom Agent Example

```python
from uagents import Agent
from uagents_adapter import uAgentsPlugin

class MyCustomAgent:
    def __init__(self):
        # Your custom initialization
        pass

    async def process(self, query: str) -> str:
        # Your custom logic
        return f"Processed: {query}"

custom_agent = MyCustomAgent()

async def custom_handler(query: str, session_id: str, user_id: str) -> str:
    response = await custom_agent.process(query)
    return response

plugin = uAgentsPlugin(
    custom_handler,
    description="My custom agent implementation",
)

agent = Agent(name="custom_agent", seed="seed_phrase", port=8000, mailbox=True)
agent.include(plugin.protocol, publish_manifest=True)
plugin.run(agent)
```
## 📖 Usage Scenarios

The plugin supports four main configuration patterns:

### 1. With Agentverse Token + Session Context

**Use Case:** Production deployment with full tracking and Agentverse integration

```python
from dataclasses import dataclass
from uagents import Agent
from uagents_adapter import uAgentsPlugin

AGENTVERSE_API_TOKEN = "av_xxxxxxxxxxxxxxxxxxxxx"

@dataclass
class SessionContext:
    session_id: str
    user_id: str

async def handle_query(query: str, session_id: str, user_id: str) -> str:
    """Process query with full session context."""
    context = SessionContext(session_id=session_id, user_id=user_id)
    # Use the context for personalized responses
    # Your framework logic here (LlamaIndex, LangChain, etc.)
    response = f"Response for user {context.user_id}: {query}"
    return response

plugin = uAgentsPlugin(
    handle_query,
    agentverse_api_token=AGENTVERSE_API_TOKEN,
    description="Production agent with session tracking",
)

agent = Agent(
    name="production_agent",
    seed="your_secure_seed_phrase_production",
    port=8000,
    mailbox=True,
)

agent.include(plugin.protocol, publish_manifest=True)
plugin.run(agent)
```
### 2. Session Context Only (No Agentverse)

**Use Case:** Local development or private deployments with session tracking

```python
from dataclasses import dataclass
from uagents import Agent
from uagents_adapter import uAgentsPlugin

@dataclass
class SessionContext:
    session_id: str
    user_id: str

async def handle_query(query: str, session_id: str, user_id: str) -> str:
    """Process query with session context, no Agentverse."""
    context = SessionContext(session_id=session_id, user_id=user_id)
    # Your framework logic here
    response = f"Response for session {context.session_id}: {query}"
    return response

plugin = uAgentsPlugin(handle_query)

agent = Agent(
    name="dev_agent",
    seed="your_secure_seed_phrase_dev",
    port=8000,
    mailbox=True,
)

agent.include(plugin.protocol, publish_manifest=True)
plugin.run(agent)
```
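If you want the handler to remember earlier turns in a session, one simple approach is to key a history buffer on `session_id`. This is a sketch assuming in-memory storage is acceptable; `session_history` and the message format are made up for illustration, and a production deployment would likely use a database or cache instead:

```python
import asyncio
from collections import defaultdict

# In-memory per-session history (illustrative only, not part of the plugin API)
session_history: dict = defaultdict(list)

async def handle_query(query: str, session_id: str, user_id: str) -> str:
    history = session_history[session_id]
    history.append(f"user: {query}")
    # Your framework logic here; `history` can be folded into the prompt
    response = f"Message {len(history)} for session {session_id}: {query}"
    history.append(f"agent: {response}")
    return response

# Quick local check without running the agent:
print(asyncio.run(handle_query("hello", "s1", "u1")))
print(asyncio.run(handle_query("again", "s1", "u1")))
```

Because every message funnels through the same function signature, this kind of state lives entirely in your code; the plugin only supplies the identifiers.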
### 3. With Agentverse Token (Stateless)

**Use Case:** Simple production deployment without session management

```python
from uagents import Agent
from uagents_adapter import uAgentsPlugin

AGENTVERSE_API_TOKEN = "av_xxxxxxxxxxxxxxxxxxxxx"

async def handle_query(query: str, session_id: str, user_id: str) -> str:
    """Process query without maintaining session state."""
    # Your framework logic here
    response = f"Stateless response: {query}"
    return response

plugin = uAgentsPlugin(
    handle_query,
    agentverse_api_token=AGENTVERSE_API_TOKEN,
    description="Stateless agent for simple queries",
)

agent = Agent(
    name="stateless_agent",
    seed="your_secure_seed_phrase_stateless",
    port=8000,
    mailbox=True,
)

agent.include(plugin.protocol, publish_manifest=True)
plugin.run(agent)
```
### 4. Minimal Configuration

**Use Case:** Quick prototyping and testing

```python
from uagents import Agent
from uagents_adapter import uAgentsPlugin

async def handle_query(query: str, session_id: str, user_id: str) -> str:
    """Minimal query handler for testing."""
    # Your framework logic here
    response = f"Test response: {query}"
    return response

plugin = uAgentsPlugin(handle_query)

agent = Agent(
    name="test_agent",
    seed="test_seed_phrase",
    port=8000,
    mailbox=True,
)

agent.include(plugin.protocol, publish_manifest=True)
plugin.run(agent)
```
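Because the handler is just an async function, you can exercise it directly while prototyping, without starting the agent at all. A minimal sketch:

```python
import asyncio

async def handle_query(query: str, session_id: str, user_id: str) -> str:
    """Minimal query handler for testing."""
    return f"Test response: {query}"

# Call the handler directly -- no agent, no mailbox, no network needed
result = asyncio.run(handle_query("ping", "test-session", "test-user"))
print(result)
```

This makes the handler easy to cover with ordinary unit tests before wiring it into a running agent.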
## 🔧 Configuration Options

### `uAgentsPlugin` Parameters

| Parameter | Type | Required | Description |
|---|---|---|---|
| `query_handler` | `Callable` | Yes | Async function that processes queries |
| `agentverse_api_token` | `str` | No | Token for Agentverse platform integration |
| `description` | `str` | No | Human-readable description of your agent |
### Query Handler Signature

Your query handler must match this signature:

```python
async def query_handler(
    query: str,       # The user's query
    session_id: str,  # Unique session identifier
    user_id: str,     # Unique user identifier
) -> str:             # Return the response as a string
    pass
```
## 💡 Key Concept

The beauty of `uAgentsPlugin` is its simplicity: you just write a function. The function signature is always the same regardless of which framework you use:

```python
async def my_function(query: str, session_id: str, user_id: str) -> str:
    # Your logic with any framework
    return response
```
Inside this function, you can use:
- ✅ LlamaIndex query engines
- ✅ LangChain chains
- ✅ LangGraph workflows
- ✅ CrewAI crews
- ✅ Custom implementations
- ✅ API calls
- ✅ Database queries
- ✅ Anything else!
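For instance, a handler that answers from a local database needs nothing framework-specific at all. The sketch below uses SQLite purely for illustration; the `facts` table, its columns, and the stored answer are all made up:

```python
import asyncio
import sqlite3

# Illustrative setup: an in-memory database with a made-up `facts` table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE facts (topic TEXT, answer TEXT)")
conn.execute(
    "INSERT INTO facts VALUES ('uagents', 'uAgents is the Fetch.ai agent framework')"
)
conn.commit()

async def db_handler(query: str, session_id: str, user_id: str) -> str:
    # Look the query up as a topic; fall back to a default answer
    row = conn.execute(
        "SELECT answer FROM facts WHERE topic = ?", (query.strip().lower(),)
    ).fetchone()
    return row[0] if row else "No answer found."

print(asyncio.run(db_handler("uagents", "s1", "u1")))
```

Swap `db_handler` into `uAgentsPlugin(...)` exactly like any of the framework handlers above.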
## 🔐 Security Best Practices

- **Never commit tokens**: Store `AGENTVERSE_API_TOKEN` in environment variables
- **Use strong seeds**: Generate secure seed phrases for production agents
- **Validate inputs**: Always sanitize user queries before processing
- **Rate limiting**: Implement rate limiting for production deployments

```python
import os

AGENTVERSE_API_TOKEN = os.getenv("AGENTVERSE_API_TOKEN")
AGENT_SEED = os.getenv("AGENT_SEED")
```
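Rate limiting can live inside the handler itself. Below is a minimal per-user sliding-window sketch; the limit values, `allow_request` helper, and in-memory store are all illustrative, and a multi-process deployment would need a shared store such as Redis instead:

```python
import time
from collections import defaultdict, deque
from typing import Optional

RATE_LIMIT = 5         # max requests per user per window (illustrative values)
WINDOW_SECONDS = 60.0

_recent: dict = defaultdict(deque)  # user_id -> recent request timestamps

def allow_request(user_id: str, now: Optional[float] = None) -> bool:
    """Return True if this user is still under the rate limit."""
    now = time.monotonic() if now is None else now
    window = _recent[user_id]
    # Drop timestamps that have fallen out of the window
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= RATE_LIMIT:
        return False
    window.append(now)
    return True

async def handle_query(query: str, session_id: str, user_id: str) -> str:
    if not allow_request(user_id):
        return "Rate limit exceeded, please try again shortly."
    # Your framework logic here
    return f"Response: {query}"
```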
## 🏗️ Advanced Example with Context Management

Here's a complete example showing a context-aware agent implementation:

```python
import os
from dataclasses import dataclass
from typing import Optional

from uagents import Agent
from uagents_adapter import uAgentsPlugin
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms.openai import OpenAI

@dataclass
class SessionContext:
    session_id: str
    user_id: str

class ContextAwareAgent:
    def __init__(self, context: Optional[SessionContext] = None):
        self.context = context
        self.llm = OpenAI(model="gpt-4")
        # Load your data
        documents = SimpleDirectoryReader("data").load_data()
        self.index = VectorStoreIndex.from_documents(documents)
        self.query_engine = self.index.as_query_engine(llm=self.llm)

    async def query(self, question: str) -> str:
        """Process a query with optional context."""
        if self.context:
            # Personalize the query with user context
            enhanced_query = f"[User: {self.context.user_id}] {question}"
            response = self.query_engine.query(enhanced_query)
        else:
            response = self.query_engine.query(question)
        return str(response)

# Handler function
async def handle_query(query: str, session_id: str, user_id: str) -> str:
    context = SessionContext(session_id=session_id, user_id=user_id)
    agent = ContextAwareAgent(context=context)
    response = await agent.query(query)
    return response

# Set up the plugin
plugin = uAgentsPlugin(
    handle_query,
    agentverse_api_token=os.getenv("AGENTVERSE_API_TOKEN"),
    description="Context-aware LlamaIndex agent",
)

agent = Agent(
    name="context_aware_agent",
    seed=os.getenv("AGENT_SEED"),
    port=8000,
    mailbox=True,
)

agent.include(plugin.protocol, publish_manifest=True)
plugin.run(agent)
```
## 💡 Support
For questions and support:
- Open an issue on GitHub
- Join the Fetch.ai Discord community
- Check the documentation
Made with ❤️ for the Fetch.ai agent ecosystem