
VinehooLLM

A Python package for interacting with OpenAI-compatible Large Language Models (LLMs), with function/tool calling support.

Overview

VinehooLLM is a Python client library that provides a simple interface for interacting with OpenAI-compatible large language models. It supports modern features such as function/tool calling and is designed to be easy to use while remaining flexible.

Features

  • OpenAI-compatible API support
  • Function/tool calling capabilities
  • Type-safe implementation using Pydantic models
  • Automatic function execution handling
  • Customizable API endpoints
  • Comprehensive error handling

Installation

pip install vinehoollm

Publishing to PyPI

To publish the package to PyPI, follow these steps:

  1. Install build and twine:

pip install build twine

  2. Update the version in setup.py or pyproject.toml.

  3. Build the package:

python -m build  # This modern command replaces the legacy "python setup.py sdist bdist_wheel"

  4. Upload to PyPI:

# Test PyPI (recommended for testing)
python -m twine upload --repository testpypi dist/*

# Official PyPI
python -m twine upload dist/*

Quick Start

from vinehoollm.client import VinehooLLM, ChatMessage

# Initialize the client
client = VinehooLLM(
    api_key="your-api-key",
    model="gpt-4o-mini"  # or any other compatible model
)

# Simple chat completion
messages = [
    ChatMessage(role="user", content="Hello, how are you?")
]
response = client.chat(messages)
print(response.text)

# Using function/tool calling
def calculator(a: int, b: int, operation: str) -> float:
    if operation == "add":
        return a + b
    elif operation == "subtract":
        return a - b
    elif operation == "multiply":
        return a * b
    elif operation == "divide":
        return a / b
    raise ValueError(f"Unknown operation: {operation}")

# Define the tool
calculator_tool = {
    "type": "function",
    "function": {
        "name": "calculator",
        "description": "Perform basic arithmetic operations",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "integer"},
                "b": {"type": "integer"},
                "operation": {
                    "type": "string",
                    "enum": ["add", "subtract", "multiply", "divide"]
                }
            },
            "required": ["a", "b", "operation"]
        }
    }
}

# Initialize client with tool
client_with_tools = VinehooLLM(
    api_key="your-api-key",
    model="gpt-4o-mini",
    tools=[calculator_tool],
    tool_handlers={"calculator": calculator}
)


