Official Python SDK for AgentX (https://www.agentx.so/)


A fast way to build AI agents and create an agent workforce.

The official Python SDK for AgentX.

Why build AI agents with AgentX?

  • Simple Agent → Conversation → Message structure.
  • Includes chain-of-thought output.
  • Choose from most open- and closed-source LLM vendors.
  • Built-in voice (ASR, TTS), image generation, document, CSV/Excel, and OCR tools.
  • Supports any running MCP (Model Context Protocol) server.
  • Supports RAG with built-in re-ranking.
  • Multi-agent workforce orchestration: multiple agents working together with a designated manager agent.
  • Cross-LLM-vendor, multi-agent orchestration.
  • A2A (agent-to-agent) protocol (coming soon).

Installation

pip install --upgrade agentx-python

Quick Start

Get started with AgentX in just a few lines of code:

from agentx import AgentX

# Initialize the client
client = AgentX(api_key="your-api-key-here")

# Get your agents
agents = client.list_agents()
print(f"You have {len(agents)} agents")

# Start chatting with your first agent
if agents:
    agent = agents[0]
    conversation = agent.new_conversation()
    response = conversation.chat("Hello! What can you help me with?")
    print(response)

Usage

Provide an api_key inline or set the AGENTX_API_KEY environment variable. You can get an API key from https://app.agentx.so
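A minimal sketch of that fallback behavior (resolve_api_key is a hypothetical helper written for illustration, not part of the SDK's API):

```python
import os

def resolve_api_key(api_key=None):
    """Return the explicit key if given, else fall back to AGENTX_API_KEY."""
    key = api_key or os.environ.get("AGENTX_API_KEY")
    if not key:
        raise ValueError(
            "Provide api_key or set the AGENTX_API_KEY environment variable."
        )
    return key

# An explicitly passed key always wins over the environment
os.environ["AGENTX_API_KEY"] = "env-key"
print(resolve_api_key("inline-key"))  # inline-key
print(resolve_api_key())              # env-key
```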

Agent

from agentx import AgentX

client = AgentX(api_key="<your api key here>")

# Get the list of agents you have
print(client.list_agents())

Conversation

Each Conversation has agents and users tied to it.

# get agent
my_agent = client.get_agent(id="<agent id here>")

# Get the list of conversations for this agent
existing_conversations = my_agent.list_conversations()
print(existing_conversations)

# Get the message history of a conversation
last_conversation = existing_conversations[-1]
msgs = last_conversation.list_messages()
print(msgs)

Chat

A chat takes place within a conversation. You can also stream the response (streaming is off by default).

a_conversation = my_agent.get_conversation(id="<conversation id here>")

response = a_conversation.chat_stream("Hello, what is your name?")
for chunk in response:
    print(chunk)

The output looks like:

text=None cot='The user is greeting and asking for my ' botId='xxx'
text=None cot='name, which are casual, straightforward questions.' botId='xxx'
text=None cot=' I can answer these directly' botId='xxx'
text='Hello' cot=None botId='xxx'
text='!' cot=None botId='xxx'
text=' I' cot=None botId='xxx'
text=' am' cot=None botId='xxx'
text=' AgentX' cot=None botId='xxx'
text=None cot=None botId='xxx'

*cot stands for chain-of-thought
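Since each streamed chunk carries either text or cot (or neither, on the final chunk), you can separate the reasoning trace from the visible reply. A sketch over mock chunks — StreamChunk here is a stand-in dataclass for illustration, not the SDK's actual chunk type:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StreamChunk:
    text: Optional[str] = None
    cot: Optional[str] = None
    botId: str = "xxx"

def split_stream(chunks):
    """Accumulate chain-of-thought and reply text into separate strings."""
    cot_parts, text_parts = [], []
    for chunk in chunks:
        if chunk.cot:
            cot_parts.append(chunk.cot)
        if chunk.text:
            text_parts.append(chunk.text)
    return "".join(cot_parts), "".join(text_parts)

chunks = [
    StreamChunk(cot="The user is greeting "),
    StreamChunk(cot="and asking for my name."),
    StreamChunk(text="Hello"),
    StreamChunk(text="!"),
    StreamChunk(),  # final chunk: both fields None
]
cot, text = split_stream(chunks)
print(text)  # Hello!
```

The same loop works over the generator returned by chat_stream, since only the text and cot attributes are read.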

Workforce

A Workforce (team) consists of multiple agents working together with a designated manager agent.

from agentx import AgentX

client = AgentX(api_key="<your api key here>")

# Get the list of workforces/teams you have
workforces = client.list_workforces()
print(workforces)

# Get a specific workforce
workforce = workforces[0]  # or any specific workforce
print(f"Workforce: {workforce.name}")
print(f"Manager: {workforce.manager.name}")
print(f"Agents: {[agent.name for agent in workforce.agents]}")

Workforce Conversations

# Create a new conversation with the workforce
conversation = workforce.new_conversation()

# List all existing conversations for the workforce
conversations = workforce.list_conversations()
print(conversations)

Chat with Workforce

Chat with the entire workforce and receive streaming responses from all of its agents.

# Stream chat with the workforce
response = workforce.chat_stream(conversation.id, "How can you help me with this project?")
for chunk in response:
    if chunk.text:
        print(chunk.text, end="")
    if chunk.cot:
        print(f" [COT: {chunk.cot}]")

Workforce chat lets multiple specialized agents work together to provide comprehensive responses to your queries.
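Because every streamed chunk carries a botId, you can attribute output to the agent that produced it. A sketch using mock chunks — plain dicts stand in for the SDK's chunk objects here:

```python
from collections import defaultdict

def group_by_agent(chunks):
    """Collect each agent's reply text, keyed by its botId."""
    replies = defaultdict(list)
    for chunk in chunks:
        if chunk.get("text"):
            replies[chunk["botId"]].append(chunk["text"])
    return {bot_id: "".join(parts) for bot_id, parts in replies.items()}

chunks = [
    {"text": "I can plan", "cot": None, "botId": "manager"},
    {"text": " the project.", "cot": None, "botId": "manager"},
    {"text": "I handle research.", "cot": None, "botId": "researcher"},
]
print(group_by_agent(chunks))
# {'manager': 'I can plan the project.', 'researcher': 'I handle research.'}
```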
