
unifai-sdk-py

unifai-sdk-py is the Python SDK for Unifai, an AI-native platform for dynamic tools and agent-to-agent communication.

Installation

pip install unifai-sdk

Getting your Unifai API key

You can get your API key for free from Unifai.

There are two types of API keys:

  • Agent API key: for using toolkits in your own agents.

  • Toolkit API key: for creating toolkits that can be used by other agents.

Using tools

To use tools in your agents, you need an agent API key. You can get an agent API key for free at Unifai.

import unifai

tools = unifai.Tools(api_key='xxx')

Then you can pass the tools to any OpenAI-compatible API. Popular options include:

  • OpenAI's native API: for using OpenAI models directly
  • LiteLLM: a library that provides a unified OpenAI-compatible API for most LLM providers
  • OpenRouter: a service that gives you access to most LLMs through a single OpenAI-compatible API

The tools work with any API that follows the OpenAI function-calling format, giving you the flexibility to choose the best LLM for your needs while keeping your tools working consistently.

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Can you tell me what is trending on Google today?"}],
    tools=tools.get_tools(),
)

If the response contains tool calls, pass them to the tools.call_tools method to get the results. The output is a list of messages containing the tool call results; append them to the original messages and pass everything to the LLM again.

# call_tools is a coroutine, so this must run inside an async function
results = await tools.call_tools(response.choices[0].message.tool_calls)
messages.extend(results)
# messages can now be passed to the LLM again

Passing the tool call results back to the LLM may trigger further tool calls, so you can keep calling the tools until you get a response that contains no tool calls. For example:

# run this inside an async function, since tools.call_tools is a coroutine
messages = [{"role": "user", "content": "Can you tell me what is trending on Google today?"}]
while True:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=messages,
        tools=tools.get_tools(),
    )
    messages.append(response.choices[0].message)
    results = await tools.call_tools(response.choices[0].message.tool_calls)
    if len(results) == 0:
        break
    messages.extend(results)
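
The loop above can also be expressed as a small, testable pattern. Below is a minimal self-contained sketch with stubbed stand-ins for the client and tools (the `create` and `call_tools` callables here are hypothetical placeholders, not the real SDK API):

```python
import asyncio

async def run_tool_loop(create, call_tools, messages):
    """Exchange messages with the model until it stops requesting tools.

    create:     sync callable (messages) -> assistant message dict
    call_tools: async callable (tool_calls) -> list of tool-result messages
    """
    while True:
        message = create(messages)
        messages.append(message)
        results = await call_tools(message.get("tool_calls") or [])
        if not results:  # no tool calls left: the model gave a final answer
            return messages
        messages.extend(results)

# Stub demo: the "model" requests one tool call, then answers.
state = {"calls": 0}

def fake_create(messages):
    state["calls"] += 1
    if state["calls"] == 1:
        return {"role": "assistant", "tool_calls": [{"id": "1", "name": "trends"}]}
    return {"role": "assistant", "content": "done", "tool_calls": None}

async def fake_call_tools(tool_calls):
    return [{"role": "tool", "content": "ok"} for _ in tool_calls]

history = asyncio.run(run_tool_loop(
    fake_create, fake_call_tools,
    [{"role": "user", "content": "What is trending today?"}],
))
```

In the real loop, client.chat.completions.create and tools.call_tools take the place of the stubs.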

Using tools in MCP clients

We provide an MCP server so you can access the tools from any MCP client, such as Claude Desktop.

The easiest way to run the server is with uv; see Installing uv if you haven't installed it yet.

Then in your Claude Desktop config:

{
  "mcpServers": {
    "unifai-tools": {
      "command": "uvx",
      "args": [
        "--from",
        "unifai-sdk",
        "unifai-tools-mcp"
      ],
      "env": {
        "UNIFAI_AGENT_API_KEY": ""
      }
    }
  }
}

Now your Claude Desktop will be able to access all the tools in Unifai automatically.

Creating tools

Anyone can create dynamic tools in Unifai by creating a toolkit.

A toolkit is a collection of tools that are connected to the Unifai infrastructure, and can be searched and used by agents dynamically.

Initialize a toolkit client with your toolkit API key. You can get a toolkit API key for free at Unifai.

import unifai

toolkit = unifai.Toolkit(api_key='xxx')

Update the toolkit name and/or description if needed:

await toolkit.update_toolkit(name="Echo Slam", description="What's in, what's out.")

or run it synchronously with asyncio.run():

asyncio.run(toolkit.update_toolkit(name="Echo Slam", description="What's in, what's out."))

Register action handlers:

@toolkit.action(
    action="echo",
    action_description='Echo the message',
    payload_description={"content": {"type": "string"}},
)
def echo(ctx: unifai.ActionContext, payload={}): # can be an async function too
    return ctx.Result(f'You are agent <{ctx.agent_id}>, you said "{payload.get("content")}".')

Note that payload_description can be any string or dict that gives agents enough information to understand the payload format. It doesn't have to follow a particular schema, as long as agents can read it as natural language and generate a correct payload. Think of it as the comments and docs for your API: agents read it and decide what parameters to use.
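
For instance, a richer payload_description for a hypothetical "translate" action might mix JSON-Schema-style typing with plain-language hints (the field names here are purely illustrative):

```python
# Illustrative payload_description for a hypothetical "translate" action.
payload_description = {
    "text": {"type": "string", "description": "Text to translate"},
    "target_language": {
        "type": "string",
        "description": "ISO 639-1 code of the target language, e.g. 'fr'",
    },
}
```

Agents read these descriptions as natural language, so the extra "description" hints help them fill in correct values.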

Start the toolkit:

await toolkit.run()

or run it synchronously with asyncio.run():

asyncio.run(toolkit.run())
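
Conceptually, the @toolkit.action decorator registers each handler under its action name so incoming requests can be dispatched to it. A simplified registry sketch (illustrative only, not the SDK's actual internals):

```python
class MiniToolkit:
    """Toy stand-in showing how an action decorator maps names to handlers."""

    def __init__(self):
        self._handlers = {}

    def action(self, action, action_description="", payload_description=None):
        def register(func):
            # Store the handler under its action name for later dispatch.
            self._handlers[action] = func
            return func
        return register

    def dispatch(self, action, payload):
        # Look up and invoke the registered handler for this action.
        return self._handlers[action](payload)

toolkit = MiniToolkit()

@toolkit.action(action="echo", action_description="Echo the message")
def echo(payload):
    return f'You said "{payload.get("content")}".'
```

The real toolkit does the same lookup when the Unifai infrastructure routes an agent's tool call to your handler.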

Examples

You can find examples in the examples directory.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request. For major changes, please open an issue first to discuss what you would like to change.
