
The official Python SDK for Toolhouse API


Toolhouse Python SDK

This is the Python SDK for Toolhouse.

Toolhouse allows you to unlock the best LLM knowledge and actions. It works across a wide range of LLMs and providers.

With Toolhouse, you can install tools from the Tool Store and execute them in the cloud, without having to handle their execution locally.

For more details, you can check out our documentation.

Installation

With pip:

pip install toolhouse

With poetry:

poetry add toolhouse

Getting started

In order to use the SDK, you will need a Toolhouse API key. To get the API key:

  1. Sign up for Toolhouse or log in if you are an existing Toolhouse user.
  2. Go to your user ➡️ API Keys (direct link)
  3. Give your API key a name and click Generate.

Copy the API Key and save it where you save your secrets. We'll assume you have a .env file.

We suggest saving your API Key as TOOLHOUSE_API_KEY in your environment file. This allows Toolhouse to pick up its value directly in your code.

TOOLHOUSE_API_KEY=<Your API Key value>
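As a quick sanity check, you can confirm the variable is visible to your process before initializing the SDK (the stand-in value below is hypothetical; if you use python-dotenv, call `load_dotenv()` first so the `.env` file is read):

```python
import os

# Hypothetical stand-in value; in practice the key comes from your .env file.
os.environ.setdefault("TOOLHOUSE_API_KEY", "th-example-key")

api_key = os.environ["TOOLHOUSE_API_KEY"]
assert api_key, "TOOLHOUSE_API_KEY is not set"
print("Key loaded")
```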

Alternatively, you can set the API key when you initialize the SDK. You can do this in the constructor:

tools = Toolhouse('YOUR_API_KEY')

You can also use the set_access_token method:

tools = Toolhouse()
tools.set_access_token('YOUR_API_KEY')

Our Quick start guide has everything you need to get set up quickly.

Providers

Toolhouse works with the widest possible range of LLMs across different providers. By default, the Toolhouse API will work with any LLM that is compatible with the OpenAI chat completions API.

You can switch providers when initializing the SDK through the constructor:

from toolhouse import Toolhouse, Provider
tools = Toolhouse(provider=Provider.ANTHROPIC)

If you are passing your API key:

from toolhouse import Toolhouse, Provider
tools = Toolhouse('YOUR_API_KEY', Provider.ANTHROPIC)

Sample usage

In this example, we'll use the OpenAI SDK as well as python-dotenv.

pip install openai python-dotenv

Create a .env and add your API keys there.

TOOLHOUSE_API_KEY=
OPENAI_API_KEY=

Head over to Toolhouse and install the Current time tool.

import os
from dotenv import load_dotenv
from toolhouse import Toolhouse
from openai import OpenAI
from typing import List

load_dotenv()

client = OpenAI()
tools = Toolhouse()

# Metadata to convert UTC time to your local time
tools.set_metadata("timezone", -7)

messages: List = [{
    "role": "user",
    "content": "What's the current time?"
}]

response = client.chat.completions.create(
    model='gpt-4o',
    messages=messages,
    tools=tools.get_tools(),
    tool_choice="auto"
)

messages += tools.run_tools(response)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    tools=tools.get_tools(),
    tool_choice="auto"
)
print(response.choices[0].message.content)

Use Local Tools

To utilize a local tool, you need to define the tool, its JSON schema, and register it with the Toolhouse SDK. Here's a step-by-step guide:

  1. Create the local tool function.
  2. Define the JSON schema for the local tool function.
  3. Register the local tool with the Toolhouse SDK.
  4. Add the local tool to the messages.
  5. Utilize the local tool within the Toolhouse SDK.
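The registration step above is essentially a name-to-function mapping: when the model emits a tool call, the SDK can look the function up by name and invoke it with the call's arguments. A toy sketch of that dispatch idea (not the SDK's actual implementation):

```python
import json

# Toy registry mimicking the idea behind register_local_tool: a name -> function map.
local_registry = {}

def register_local_tool(name):
    """Register a function under the name the LLM will call it by."""
    def decorator(fn):
        local_registry[name] = fn
        return fn
    return decorator

@register_local_tool("hello")
def hello_tool(city: str):
    return f"Hello from {city}!!!"

# A tool call roughly as a model emits it: a name plus JSON-encoded arguments.
tool_call = {"name": "hello", "arguments": json.dumps({"city": "Rome"})}

result = local_registry[tool_call["name"]](**json.loads(tool_call["arguments"]))
print(result)  # Hello from Rome!!!
```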

Here is a sample code:

"""OpenAI Sample"""
import os
from typing import List
from dotenv import load_dotenv
from openai import OpenAI
from toolhouse import Toolhouse
load_dotenv()

TOKEN = os.getenv("OPENAI_KEY")
TH_TOKEN = os.getenv("TOOLHOUSE_BEARER_TOKEN")


local_tools = [{
    "type": "function",
    "function": {
        "name": "hello",
        "description": "Returns a customized hello message from a given city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {
                    "type": "string",
                    "description": "The city where you are from"
                }
            },
            "required": ["city"]
        }
    }
}]

th = Toolhouse(access_token=TH_TOKEN, provider="openai")
th.set_metadata("id", "fabio")
th.set_metadata("timezone", 5)


@th.register_local_tool("hello")  # must match the "name" in the tool's JSON schema
def hello_tool(city: str):
    """Return a Hello message from a specific city."""
    return f"Hello from {city}!!!"


client = OpenAI(api_key=TOKEN)

messages: List = [{
    "role": "user",
    "content": "Can I get a hello from Rome?"
}]

response = client.chat.completions.create(
    model='gpt-4o',
    messages=messages,
    tools=th.get_tools() + local_tools
)

messages += th.run_tools(response)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    tools=th.get_tools() + local_tools
)

print(response.choices[0].message.content)

Use Bundles

Bundles help you define groups of tools you want to pass to the LLM based on specific contextual needs. For example, if you want to enhance your LLM's knowledge with live stock market data, you can create a Bundle with a stock price API call, a RAG for stock news, and summarization of SEC filings. To create a bundle, you need to:

  1. Go to the Tool Store ➡️ Bundles and create a new bundle, e.g. "stock_bundle".
  2. Add tools to the bundle.
  3. Use the bundle in the Toolhouse SDK.
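Conceptually, a bundle is just a named subset of your installed tools, and `get_tools(bundle=...)` returns only that subset's definitions. A toy sketch of the idea (the tool names below are made up for illustration):

```python
# Toy model of bundles: a named grouping over installed tool definitions.
# Tool names ("get_stock_price", "current_time") are hypothetical.
installed_tools = {
    "get_stock_price": {"type": "function", "function": {"name": "get_stock_price"}},
    "current_time": {"type": "function", "function": {"name": "current_time"}},
}
bundles = {"stock_bundle": ["get_stock_price"]}

def get_tools(bundle=None):
    """Return all tool definitions, or only those in the named bundle."""
    if bundle is None:
        return list(installed_tools.values())
    return [installed_tools[name] for name in bundles[bundle]]

print([t["function"]["name"] for t in get_tools(bundle="stock_bundle")])  # ['get_stock_price']
```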

Here is a sample code:

import os
from typing import List
from dotenv import load_dotenv
from openai import OpenAI
from toolhouse import Toolhouse
load_dotenv()

TOKEN = os.getenv("OPENAI_KEY")
TH_TOKEN = os.getenv("TOOLHOUSE_BEARER_TOKEN")

th = Toolhouse(access_token=TH_TOKEN, provider="openai")
th.set_metadata("id", "fabio")  # metadata is optional based on the tools you are using
th.set_metadata("timezone", 5)  # metadata is optional based on the tools you are using

client = OpenAI(api_key=TOKEN)

messages: List = [{
    "role": "user",
    "content": "What is the stock price of Apple?"
}]

response = client.chat.completions.create(
    model='gpt-4o',
    messages=messages,
    tools=th.get_tools(bundle="stock_bundle")
)

Contributing

We welcome pull requests that add meaningful additions to these code samples, particularly for issues that can expand compatibility.

You can submit issues (for example for feature requests or improvements) by using the Issues tab.

Publishing tools

Developers can also contribute to Toolhouse by publishing tools for the Tool Store. The Tool Store allows developers to submit their tools and monetize them every time they're executed. Developers and tools must go through a review and approval process, which includes adhering to the Toolhouse Privacy and Data Protection policy. If you're interested in becoming a publisher, submit your application.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

toolhouse-1.2.0.tar.gz (28.3 kB, Source)

Built Distribution

toolhouse-1.2.0-py3-none-any.whl (37.5 kB, Python 3)

File details

Details for the file toolhouse-1.2.0.tar.gz.

File metadata

  • Download URL: toolhouse-1.2.0.tar.gz
  • Size: 28.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.8.19

File hashes

Hashes for toolhouse-1.2.0.tar.gz:

  • SHA256: 381b18418df43cc49622fb1d2b28bf7b8e0d5e47ffd771aa92de650570a487e2
  • MD5: 5d784ff4ee4067ba448b3152095cab18
  • BLAKE2b-256: f44b9237012048950c02006d8f8e3904d60b648ee0177fd123e6f812f72050e4


File details

Details for the file toolhouse-1.2.0-py3-none-any.whl.

File metadata

  • Download URL: toolhouse-1.2.0-py3-none-any.whl
  • Size: 37.5 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.8.19

File hashes

Hashes for toolhouse-1.2.0-py3-none-any.whl:

  • SHA256: 3ba21ef09708896cfd9c87de0f0ff46b8816ba513382044483a63a6bdaf103fd
  • MD5: 3a8febcfd5a1c5e0c867bd826cd1d589
  • BLAKE2b-256: 7c07754139792cfe3cb34b56a3f59d1577c0d6c73d045e5ee6c6a7efcd7dc301

