
Hype 🆙

[!WARNING]
This project is in early active development. Expect frequent updates and potential breaking changes.

Hype gives your Python functions super powers.

import hype
from pydantic import Field


@hype.up
def divide(
    x: int,
    y: int = Field(gt=0),
) -> int:
    """
    Divides one number by another.
    :param x: The numerator
    :param y: The denominator
    :return: The quotient
    """
    return x // y


divide(6, 2)  # => 3
divide.description  # => 'Divides one number by another.'
divide.input.model_fields["x"].description  # => 'The numerator'
divide.json_schema  # => "{'$defs': {'Input': { ... } } }"
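
Since divide.input appears to be a Pydantic model class (the model_fields access above suggests as much), the Field constraints can be exercised directly. The snippet below is an illustrative sketch, not documented Hype behavior:

from pydantic import ValidationError

try:
    divide.input(x=6, y=0)  # y violates the gt=0 constraint
except ValidationError as error:
    print(error.errors()[0]["type"])  # => 'greater_than'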

Installation

$ pip install git+https://github.com/mattt/hype.git

Call Python functions from AI assistants

Hyped up functions have tool definitions that you can pass to LLMs like Claude and ChatGPT.

For example, let's define a pair of functions to help answer a math problem.

import ast
import operator
import re
from typing import Callable

import hype

@hype.up
def calculate(expression: str) -> int | float:
    """
    Performs basic arithmetic operations.
    Supports addition, subtraction, multiplication, division, and exponentiation.

    :param expression: The mathematical expression to evaluate (e.g., '2 + 3 * 4').
                       This expression uses Python syntax.
    """
    ...

@hype.up
def prime_factors(n: int) -> set[int]:
    """
    Determines whether a number is prime.

    :param n: The number to check.
    """
    ...
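
The function bodies are elided above. For reference, here is one way they might be filled in, evaluating the expression with ast rather than eval and factoring by trial division. The helper names below are our own; this is an illustrative sketch, not part of Hype.

import ast
import operator

# Map supported AST operator nodes to plain arithmetic functions.
_OPERATORS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def evaluate_expression(expression: str) -> int | float:
    """Evaluate basic arithmetic without calling eval()."""
    def visit(node: ast.AST) -> int | float:
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPERATORS:
            return _OPERATORS[type(node.op)](visit(node.left), visit(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPERATORS:
            return _OPERATORS[type(node.op)](visit(node.operand))
        raise ValueError(f"Unsupported expression: {expression!r}")

    return visit(ast.parse(expression, mode="eval").body)

def factorize(n: int) -> set[int]:
    """Return the set of prime factors of n by trial division."""
    factors: set[int] = set()
    divisor = 2
    while divisor * divisor <= n:
        while n % divisor == 0:
            factors.add(divisor)
            n //= divisor
        divisor += 1
    if n > 1:
        factors.add(n)
    return factors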

Hyped up functions can be passed to hype.create_anthropic_tools to make them available as tools to Claude. You can set an optional result_type to shape the language model's output.

In this example, we pass the calculate and prime_factors functions and set result_type to bool to indicate that we want a final True or False answer.

import anthropic

client = anthropic.Anthropic()
tools = hype.create_anthropic_tools([calculate, prime_factors], result_type=bool)

messages: list[anthropic.types.MessageParam] = [
    {
        "role": "user",
        "content": "Is two to the power of 57 minus one prime?",
    }
]

for message in messages:
    print(message["content"])

while not tools.future.done():
    response = client.messages.create(
        model="claude-3-5-sonnet-20240620",
        max_tokens=2046,
        messages=messages,
        tools=tools,
    )

    for block in response.content:
        print(block)

    if response.stop_reason == "tool_use":
        messages.append({"role": "assistant", "content": response.content})
        for block in response.content:
            if block.type == "tool_use":
                messages.append(
                    {
                        "role": "user",
                        "content": [tools(block)],
                    }
                )
    else:
        break

result = tools.future.result()
print(result, type(result))  # => False <class 'bool'>

Here's a transcript of the exchange:

👤 Human
Is two to the power of 57 minus one prime?
🤖 AI

To answer this question, we need to perform a couple of steps:

  1. Calculate 2^57 - 1
  2. Check if the resulting number is prime

Let's start with the calculation:

[tool_use name="calculate" input={'expression': '2**57 - 1'}]
👤 Human
[tool_use result=144115188075855871]
🤖 AI

Now that we have the result of 2^57 - 1, which is 144115188075855871, we need to check if this number is prime.

We can do this by finding its prime factors. If the number has only one prime factor (itself), then it is prime.

Let's use the prime_factors function to check:

[tool_use name="prime_factors" input={'n': 144115188075855871}]
👤 Human
[tool_use result=[7, 32377, 635879915089]]
🤖 AI

The prime_factors function returned [7, 32377, 635879915089], which means that 144115188075855871 (2^57 - 1) is not a prime number.

[tool_use name="__return__" input={'result': False}]

[!TIP]
There are other Python libraries out there for working with chat assistants, but many wrap client libraries in ways that take control away from you.

Hype is designed with composability in mind, and doesn't get in your way.

Return typed outputs from AI assistants

Hype makes it easy to shape outputs from a chat assistant.

In the previous example, we showed how to use tools to answer a math question, but language models can do a lot on their own. Sometimes all you need is a way to get a particular kind of answer.

For instance, models like GPT-4o and Claude 3.5 Sonnet excel at extracting structured information from natural language text, like flight details from the body of an email.

First, define a FlightDetails class. Really go to town with Pydantic. The more precise and pedantic, the better your results will be.

from datetime import datetime
from typing import Annotated

from pydantic import BaseModel, Field, StringConstraints

AirportCode = Annotated[str, StringConstraints(min_length=3, max_length=3, pattern=r'^[A-Z]+$')]

class FlightDetails(BaseModel):
    origin: AirportCode = Field(
        description="Three-letter IATA airport code for the departure airport."
    )

    destination: AirportCode = Field(
        description="Three-letter IATA airport code for the arrival airport."
    )

    departure_time: datetime = Field(
        description="When the flight is scheduled to depart from its origin"
    )

    arrival_time: datetime = Field(
        description="When the flight is scheduled to arrive at its destination"
    )

    model_config = {
        "json_schema_extra": {
            "examples": [
                {
                    "origin": "LAX",
                    "destination": "JFK",
                    "departure_time": "2023-06-15T08:00:00Z",
                    "arrival_time": "2023-06-15T16:30:00Z"
                }
            ]
        }
    }
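
As a quick sanity check (illustrative, not from the Hype docs), the constraints above can be exercised directly with Pydantic:

from pydantic import ValidationError

try:
    FlightDetails(
        origin="sfo",  # lowercase, so it fails the ^[A-Z]+$ pattern
        destination="PDX",
        departure_time="2023-09-20T17:00:00Z",
        arrival_time="2023-09-20T18:30:00Z",
    )
except ValidationError as error:
    print(error.error_count())  # => 1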

From there, the process is much the same as what we did before.

from anthropic import Anthropic
from anthropic.types import MessageParam

import hype

client = Anthropic()
tools = hype.create_anthropic_tools(result_type=FlightDetails)

messages: list[MessageParam] = [
    {
        "role": "user",
        "content": """
            Extract the flight details from the following email:


            It's time to check in for your flight.
            Use the app for smooth sailing and we'll see you soon!
            Confirmation code: ABCDEF

            Your trip details
            Flight 420
            Seat  10D
            5:00 PM            6:30 PM
            SFO                PDX
            San Francisco      Portland, OR

            Departure          Arrival
            9/20/2023          9/20/2023
            """,
    }
]

response = client.messages.create(
    model="claude-3-5-sonnet-20240620",
    max_tokens=2046,
    messages=messages,
    tools=tools,
)

if response.stop_reason == "tool_use":
    for block in response.content:
        if block.type == "tool_use":
            result = tools(block)

result = tools.future.result()
print(result.model_dump_json(indent=2))
{
  "origin": "SFO",
  "destination": "PDX",
  "departure_time": "2023-09-20T17:00:00Z",
  "arrival_time": "2023-09-20T18:30:00Z"
}
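
Because result_type is FlightDetails, the result is a typed model instance rather than raw text, so its fields come back as Python values (illustrative):

print(result.origin)          # => SFO
print(result.departure_time)  # => 2023-09-20 17:00:00+00:00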

Roadmap

Examples

  • Basic tool use (calculator)
  • Output shaping (extracting flight information)
  • Web scraping
  • Generating an image with DALL-E / Replicate / FAL
  • Interacting with the local filesystem (using Ollama)

Integrations

  • AI Chat
    • Anthropic
    • OpenAI
    • Ollama
    • Gemini [^1]
    • Mistral
  • HTTP
  • WebSockets
  • CLI

[^1]: Gemini has an enable_automatic_function_calling option, which provides similar functionality with different ergonomics.

Features

  • Protections
    • Budgets (time, money, etc.)
    • Resource entitlements (access to API keys)
    • Guardrails (safety checker, approval queues)
  • Concurrency (e.g. distributed task queue / worker pool)
  • Monitoring (telemetry, alerts, etc.)

Implementation details

  • Custom concurrent.futures.Executor subclass for chat models (use Future correctly, record calls, enforce limits, reduce boilerplate); see the sketch below
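
That executor is still on the roadmap; the following is a hypothetical sketch of what it could look like. The class name and behavior are our own assumptions, not Hype's API.

import concurrent.futures
from typing import Any, Callable

class RecordingExecutor(concurrent.futures.ThreadPoolExecutor):
    """Hypothetical: records each submitted call and enforces a simple call budget."""

    def __init__(self, *args: Any, max_calls: int | None = None, **kwargs: Any) -> None:
        super().__init__(*args, **kwargs)
        self.calls: list[tuple[Callable[..., Any], tuple[Any, ...], dict[str, Any]]] = []
        self.max_calls = max_calls

    def submit(self, fn: Callable[..., Any], /, *args: Any, **kwargs: Any) -> concurrent.futures.Future:
        # Enforce the limit before dispatching, and keep a record of every call.
        if self.max_calls is not None and len(self.calls) >= self.max_calls:
            raise RuntimeError("Call budget exceeded")
        self.calls.append((fn, args, kwargs))
        return super().submit(fn, *args, **kwargs)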

