
Task oriented AI agent framework for digital workers and vertical AI agents





What is Upsonic?

Upsonic is an enterprise-ready framework for orchestrating LLM calls, agents, and computer use to complete tasks cost-effectively. It provides the reliability, scalability, and task-oriented structure you need for real-world use cases.

Key features:

  • Production-Ready Scalability: Deploy seamlessly on AWS, GCP, or locally using Docker.
  • Task-Centric Design: Focus on practical task execution, with options for:
    • Basic tasks via LLM calls.
    • Advanced tasks with V1 agents.
    • Complex automation using V2 agents with MCP integration.
  • MCP Server Support: Connect to Model Context Protocol (MCP) servers for high-performance tool access.
  • Tool-Calling Server: Exception-safe tool management with robust server API interactions.
  • Computer Use Integration: Execute human-like tasks using Anthropic’s ‘Computer Use’ capabilities.
  • Easy tool registration: Add your custom tools and MCP tools with a single line of code.
  • Client-server architecture: A production-ready, stateless, enterprise-grade system.


🛠️ Getting Started

Prerequisites

  • Python 3.10 or higher
  • Access to OpenAI or Anthropic API keys (Azure and Bedrock are also supported)

Installation

pip install upsonic==0.36.0a1737482268


Creating a Client

Create a client to manage tools and tasks:

from upsonic import UpsonicClient, ObjectResponse, Task, AgentConfiguration
from upsonic.client.tools import Search

# Create an Upsonic client instance
client = UpsonicClient("localserver")

client.set_config("OPENAI_API_KEY", "YOUR_API_KEY")
client.default_llm_model = "openai/gpt-4o"

Other LLMs

  • claude-3-5-sonnet

client.set_config("ANTHROPIC_API_KEY", "YOUR_ANTHROPIC_API_KEY")
client.default_llm_model = "claude/claude-3-5-sonnet"

  • gpt-4o-azure

client.set_config("AZURE_OPENAI_ENDPOINT", "YOUR_AZURE_OPENAI_ENDPOINT")
client.set_config("AZURE_OPENAI_API_VERSION", "YOUR_AZURE_OPENAI_API_VERSION")
client.set_config("AZURE_OPENAI_API_KEY", "YOUR_AZURE_OPENAI_API_KEY")

client.default_llm_model = "azure/gpt-4o"

  • claude-3-5-sonnet-aws

client.set_config("AWS_ACCESS_KEY_ID", "YOUR_AWS_ACCESS_KEY_ID")
client.set_config("AWS_SECRET_ACCESS_KEY", "YOUR_AWS_SECRET_ACCESS_KEY")
client.set_config("AWS_REGION", "YOUR_AWS_REGION")

client.default_llm_model = "bedrock/claude-3-5-sonnet"


Defining a Task

1) Description

The task is based on the description. We have a mechanism to automatically generate sub-tasks from a high-level task description. For example, a task to track AI industry developments might be described as: "Research latest news in Anthropic and OpenAI." This will be turned into smaller, more manageable tasks ("Make a Google search for Anthropic and OpenAI," "Read the blogs," "Read the official descriptions of Anthropic and OpenAI").

# Define a new Task
description = "Research latest news in Anthropic and OpenAI"

2) Response Format

The output format is essential when deploying an AI agent across apps or as a service. In Upsonic, you define a task's response format with Pydantic-style classes. This lets you shape the output exactly how you want it, such as a list of news items with title, body, and URL, giving you a flexible yet robust output mechanism that improves interoperability between the agent and your app.

# Example ObjectResponse usage
class News(ObjectResponse):
    title: str
    body: str
    url: str
    tags: list[str]

class ResponseFormat(ObjectResponse):
    news_list: list[News]
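
To see the shape of the structured output an app would consume, here is a framework-free sketch using stdlib dataclasses as a stand-in for ObjectResponse. The field layout mirrors the classes above; the sample data is invented for illustration and is not produced by Upsonic.

```python
from dataclasses import dataclass, field

# Stand-in for ObjectResponse: plain dataclasses with the same fields.
@dataclass
class News:
    title: str
    body: str
    url: str
    tags: list[str] = field(default_factory=list)

@dataclass
class ResponseFormat:
    news_list: list[News] = field(default_factory=list)

# A hypothetical agent result, already parsed into the response format.
result = ResponseFormat(news_list=[
    News(
        title="Anthropic ships a model update",
        body="Summary of the announcement...",
        url="https://example.com/post",
        tags=["anthropic", "llm"],
    ),
])

# Downstream app code can rely on typed attribute access.
print(result.news_list[0].title)
```

Because the response is typed, downstream code breaks loudly on a missing field instead of silently passing malformed data along.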

3) Tool Integration

Our framework officially supports the Model Context Protocol (MCP) and custom tools. You can use hundreds of MCP servers from https://glama.ai/mcp/servers or https://smithery.ai/. We also support Python functions inside a class as tools, so you can easily build your own integrations.

@client.mcp()
class HackerNewsMCP:
    command = "uvx"
    args = ["mcp-hn"]

@client.tool()
class MyTools:
    def our_server_status():
        return True

tools = [Search, MyTools]  # add HackerNewsMCP here to use the MCP server

4) Task Definition

After defining these pieces, you are ready to create your first task. This structure is a key component of Upsonic's task-oriented design. Once you define a task, you can run it with agents or directly via an LLM call and retrieve the result from the Task object. The automatic sub-task mechanism also enhances quality and precision.

This simplicity is a hallmark of Upsonic.

task1 = Task(description=description, response_format=ResponseFormat, tools=tools)


Defining an Agent

Agents are the standard way to configure an LLM to work on your requests, taking the goals and context of your tasks into account. Upsonic includes an automatic characterization mechanism that enriches the information you provide. For example, a Product Manager agent can be configured with a job title, company URL, and company objective. Representing agents as roles like this produces practical agents aligned with their unique objectives.

product_manager_agent = AgentConfiguration(
    job_title="Product Manager",
    company_url="https://upsonic.ai",
    company_objective="To build AI Agent framework that helps people get things done",
)


Running Tasks

Define the task and the agent, then combine them and run. The Upsonic Server will prepare and run the task. This standard method simplifies the use of agents in your SaaS applications or your new vertical AI agents. 🤖 You are now completely ready to run your first agent.

client.agent(product_manager_agent, task1)

result = task1.response

for i in result.news_list:
    print()
    print("News")
    print("Title: ", i.title)
    print("Body: ", i.body)
    print("URL: ", i.url)
    print("Tags: ", i.tags)


Other Features (Beta)

Only One LLM Call

Not every task needs a full agent. The call method runs a task with a single direct LLM call, optimizing cost and latency for your requirements, and makes it easy to move between plain LLM calls and agent systems as your needs change. Focus on the task; don't waste time on complex architectures.

client.call(task1)

Memory

Humans draw on broad context awareness, which consistently produces superior results. In Upsonic, the memory system remembers prior tasks and preferences across runs, delivering personalized outcomes in complex workflows. Enable memory in AgentConfiguration via the agent_id system: each agent, with its distinct personality, is uniquely identified by its ID, so its memory stays precise and separate.

agent_id_ = "product_manager_agent"

product_manager_agent = AgentConfiguration(
    agent_id_=agent_id_,
    ...
    memory=True
)

Knowledge Base

The Knowledge Base provides private or public content to your agent to ensure accurate, context-aware task execution. For example, you can provide a PDF and a URL to the agent. The Knowledge Base integrates with the Task System by passing these sources as task context.

from upsonic import KnowledgeBase

my_knowledge_base = KnowledgeBase(files=["sample.pdf", "https://upsonic.ai"])

task1 = Task(
    ...
    context=[my_knowledge_base]
)

Connecting Task Outputs

Chaining tasks is essential for complex workflows where one task's output informs the next. You can pass one task to another as context; the response of task1 is then made available to task2 when it runs.

task1 = Task(
    ...
)

task2 = Task(
    ...
    context=[task1]
)

Be a Human

Agent characterization is based on the LLM itself; we characterize roles such as developer, PM, and marketing. Sometimes an agent also needs a human name and contact details, for tasks like sending personalized messages or outreach. For these requirements, AgentConfiguration provides name and contact settings, and the agent will present itself as the human you specify.

product_manager_agent = AgentConfiguration(
    ...
    name="John Walk",
    contact="john@upsonic.ai"
)

Multi Agent

Distribute tasks effectively across agents with our automated task distribution mechanism. It matches each task to the most suitable agent based on the relationship between agent and task, enabling collaborative problem-solving across agents and tasks.

client.multi_agent([agent1, agent2], [task1, task2])

Reliable Computer Use

Computer Use can perform tasks the way a human does: moving the mouse, clicking, typing, and scrolling. This lets you build tasks over systems that have no API, such as LinkedIn workflows or internal tools. Computer Use is currently supported only with Claude.

from upsonic.client.tools import ComputerUse

...

tools = [ComputerUse]
...

Reflection

LLMs are, by nature, oriented toward finishing your request, which sometimes means you get an empty result. That can break your business logic and application flow. We provide a reflection mechanism that checks whether the result is satisfactory and, if not, feeds that back to the agent. Use reflection to prevent blank responses and similar failures.

product_manager_agent = AgentConfiguration(
    ...
    reflection=True
)

Compress Context

Context windows can be small, as in some OpenAI models. For these situations we provide a mechanism that compresses the message, the system_message, and the contexts. If you are doing deep research or writing long content and passing it as context to another task, compress_context is a good fit. The mechanism only activates on context overflow; otherwise nothing changes.

product_manager_agent = AgentConfiguration(
    ...
    compress_context=True
)


Telemetry

We use anonymous telemetry to collect usage data, which helps us focus development where it matters. You can disable it by setting the UPSONIC_TELEMETRY environment variable to false.

import os
os.environ["UPSONIC_TELEMETRY"] = "False"
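
If you prefer not to set it in code, the same switch can be exported in the shell before starting your process; this is standard environment-variable behavior, nothing Upsonic-specific.

```shell
# Disable anonymous telemetry for every process launched from this shell
export UPSONIC_TELEMETRY=False
```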


Coming Soon

  • Dockerized server deployment
  • Verifiers for Computer Use
