🐙 interfaces.to - Add a little Action to your LLM Adventure

interfaces.to (aka into) is the quickest way to make Large Language Models do things. It's a simple, powerful and flexible way to build more useful, more engaging and more valuable applications with LLMs.

✨ Key Features

⭐️ Built-in tools for common tasks and platforms
⭐️ Start building with just 4(!) lines of code
⭐️ Developer-friendly Python library
⭐️ Extensible design for custom tools
⭐️ Simple and secure configuration
⭐️ Fully compatible with the OpenAI API SDK
⭐️ Works with gpt-4o, gpt-4o-mini and other OpenAI models
⭐️ Works with llama3.1, mistral-large and more via ollama tools
⭐️ Supports (and thrives on) parallel_tool_calls
⭐️ Works with any LLM that supports the OpenAI API (see the sketch after this list)
⭐️ Runs on your local machine, in the cloud, or on the edge
⭐️ Open-source, MIT licensed, and community-driven
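
Because into works with any LLM that exposes the OpenAI chat completions API, pointing it at an OpenAI-compatible server such as ollama only changes how the client is constructed. A minimal sketch, assuming ollama is running locally with its OpenAI-compatible endpoint on the default port:

from openai import OpenAI

# a sketch: point the OpenAI SDK at a local ollama server instead of OpenAI
# (assumes ollama's OpenAI-compatible API is listening on the default port)
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# the tools and loop from the Quick Start below work unchanged;
# just pass model="llama3.1" (or another local model) when creating completions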

🚀 Quick Start

Installation

Install with pip:

pip install interfaces-to

or

Install with poetry:

poetry add interfaces-to

Usage

Add your favourite tools to your existing Python project with 4 lines of code:

from openai import OpenAI
client = OpenAI()

# 1️⃣ import `into`
import interfaces_to as into

# 2️⃣ add your favourite tools
tools = [*into.Slack(functions=["send_slack_message"])]

# 3️⃣ provide some input and start the loop
messages = [{"role": "user", "content": "Introduce yourself in #general and make a joke in #random"}]
while into.running(messages):

  # 4️⃣ create a completion as normal, and run your tools! 🪄
  completion = client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    tools=tools,
    tool_choice="auto"
  )
  messages = into.run(messages, completion, tools)

# 5️⃣ stand back and watch the magic happen! 🎩✨
print(messages)

This prints output like the following (tool call IDs will vary between runs):

[
    {
        'role': 'user', 
        'content': 'Introduce yourself in #general and make a joke in #random'
    }, 
    {
        'role': 'assistant', 
        'content': None, 
        'tool_calls': [
            {
                'id': 'call_135kkxcrN4OxAQXZinlXGYb', 
                'type': 'function', 
                'function': {
                    'name': 'send_slack_message', 
                    'arguments': '{"channel": "#general", "message": "Hi everyone! I\'m an assistant here to help with tasks and answer questions. Excited to work with you all!"}'
                }
            }, 
            {
                'id': 'call_08ApGj5q8ZyRLuHo10y4FKb', 
                'type': 'function', 
                'function': {
                    'name': 'send_slack_message', 
                    'arguments': '{"channel": "#random", "message": "Why don\'t scientists trust atoms? Because they make up everything!"}'
                }
            }
        ]
    }, 
    {
        'role': 'tool', 
        'content': "Posted message to #general: Hi everyone! I'm an assistant here to help with tasks and answer questions. Excited to work with you all!", 
        'tool_call_id': 'call_135kkxcrN4OxAQXZinlXGYb'
    }, 
    {
        'role': 'tool', 
        'content': "Posted message to #random: Why don't scientists trust atoms? Because they make up everything!", 
        'tool_call_id': 'call_08ApGj5q8ZyRLuHo10y4FKb'
    }, 
    {
        'role': 'assistant', 
        'content': "I've introduced myself in #general and shared a joke in #random! Let me know if you need anything else."
    }
]

Configuring tools

Tools usually require a token. Tokens can always be configured by setting the relevant environment variable; for Slack, for example, you can set the SLACK_BOT_TOKEN environment variable.

If you prefer to set the token directly in your code, you can do so by passing it as an argument to the tool. Tokens provided in code will override any environment variables.

You can optionally restrict each tool to only the functions you need.

Here's an example of configuring the Slack tool:

import interfaces_to as into

slack = into.Slack(
    token="xoxb-12345678-xxxxxxxxxx",
    functions=["send_slack_message"]
)
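
If you rely on the environment variable instead, the token argument can be left out entirely. A minimal sketch, assuming SLACK_BOT_TOKEN has already been exported in your shell:

import interfaces_to as into

# assumes the token was exported beforehand, e.g.
#   export SLACK_BOT_TOKEN="xoxb-12345678-xxxxxxxxxx"
slack = into.Slack(functions=["send_slack_message"])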

📦 Available tools

into comes with a growing set of pre-built tools to help you get started quickly. These tools are designed to be simple, powerful and flexible, and can be used in any combination to create a wide range of applications (see the sketch after this list).

  • Slack: send_slack_message (Send a message to a Slack channel)
  • OpenAI: create_chat_completion (Create a completion with the OpenAI API)
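
Tools can be mixed freely: unpack the functions you need from each tool into a single list and pass it to the completion call exactly as in the Quick Start. This is a sketch only, assuming the OpenAI tool follows the same constructor pattern as Slack and that both tools are configured via their environment variables:

import interfaces_to as into

# combine functions from several tools into one list
# (the into.OpenAI constructor pattern is assumed from the Slack example)
tools = [
    *into.Slack(functions=["send_slack_message"]),
    *into.OpenAI(functions=["create_chat_completion"]),
]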

Possibly coming soon:

  • GitHub:
    • create_issue: Create an issue on GitHub
    • create_pull_request: Create a pull request on GitHub
    • create_gist: Create a gist on GitHub
    • create_repository: Create a repository on GitHub
    • commit_to_repository: Commit changes to a repository on GitHub
  • Discord: send_discord_message (Send a message to a Discord channel)
  • Telegram: send_telegram_message (Send a message to a Telegram chat)
  • Facebook Messenger: send_facebook_message (Send a message to a Facebook Messenger chat)
  • WhatsApp: send_whatsapp_message (Send a message to a WhatsApp chat)
  • X: post_to_x (Post on X)
  • Reddit: post_to_reddit (Post on Reddit)
  • Twilio: send_sms (Send an SMS message)
  • SendGrid: send_email (Send an email)
  • Mailgun: send_email (Send an email)
  • Google Sheets: write_to_sheet (Write data to a Google Sheet)
  • Google Drive: upload_file (Upload a file to Google Drive)
  • Google Calendar: create_event (Create an event on Google Calendar)
  • Google Maps: get_directions (Get directions between two locations)
  • Google Search: search (Search the web)
  • Wikipedia: search (Search Wikipedia)
  • Weather: get_weather (Get the current weather)

📚 Documentation (coming soon!)

Detailed documentation is on the way; check back soon.

💬 Community

Join the interfaces.to Slack to chat with other LLM adventurers, ask questions, and share your projects.

🤝 Contributors (coming soon!)

We welcome contributions from the community! Please see our contributing guide (coming soon) for more information.

Notable contributors and acknowledgements:

@blairhudson • 🐙

🫶 License

This project is licensed under the MIT License - see the LICENSE file for details.
