
VinehooLLM

A Python package for interacting with OpenAI-compatible large language models (LLMs), with function/tool calling support.


Overview

VinehooLLM is a Python client library that provides a simple interface for interacting with OpenAI-compatible Language Models. It supports modern features like function/tool calling and is designed to be easy to use while maintaining flexibility.

Features

  • OpenAI-compatible API support
  • Function/tool calling capabilities
  • Type-safe implementation using Pydantic models
  • Automatic function execution handling
  • Customizable API endpoints
  • Comprehensive error handling

Installation

pip install vinehoollm

Publishing to PyPI

To publish the package to PyPI, follow these steps:

  1. Install build and twine:

     pip install build twine

  2. Update the version in setup.py or pyproject.toml

  3. Build the package:

     python -m build  # This modern command replaces the legacy "python setup.py sdist bdist_wheel"

  4. Upload to PyPI:

     # Test PyPI (recommended for testing)
     python -m twine upload --repository testpypi dist/*

     # Official PyPI
     python -m twine upload dist/*
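
Step 2 above refers to the version field. In a pyproject.toml it might look like the following sketch (the metadata values shown are illustrative, not copied from this project's actual configuration):

```toml
[project]
name = "vinehoollm"
version = "0.2.0"   # bump this before each release
description = "A Python package for interacting with OpenAI-compatible LLMs"
requires-python = ">=3.8"
```

Both Test PyPI and the official index reject re-uploads of an existing version, which is why bumping this field must happen before `python -m build`.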

Quick Start

from vinehoollm.client import VinehooLLM, ChatMessage

# Initialize the client
client = VinehooLLM(
    api_key="your-api-key",
    model="gpt-4o-mini"  # or any other compatible model
)

# Simple chat completion
messages = [
    ChatMessage(role="user", content="Hello, how are you?")
]
response = client.chat(messages)
print(response.text)

# Access the complete message history
print("\nComplete conversation history:")
print(VinehooLLM.format_messages_to_json(response.messages))

# Using function/tool calling
def calculator(a: int, b: int, operation: str) -> float:
    if operation == "add":
        return a + b
    elif operation == "subtract":
        return a - b
    elif operation == "multiply":
        return a * b
    elif operation == "divide":
        return a / b
    raise ValueError(f"Unsupported operation: {operation}")

# Define the tool
calculator_tool = {
    "type": "function",
    "function": {
        "name": "calculator",
        "description": "Perform basic arithmetic operations",
        "parameters": {
            "type": "object",
            "properties": {
                "a": {"type": "integer"},
                "b": {"type": "integer"},
                "operation": {
                    "type": "string",
                    "enum": ["add", "subtract", "multiply", "divide"]
                }
            },
            "required": ["a", "b", "operation"]
        }
    }
}

# Initialize client with tool
client_with_tools = VinehooLLM(
    api_key="your-api-key",
    model="gpt-4o-mini",
    tools=[calculator_tool],
    tool_handlers={"calculator": calculator}
)

# Chat with tool/function calling
messages = [
    ChatMessage(role="user", content="What is 5 plus 3?")
]
response = client_with_tools.chat(messages)
print(response.text)

# Print the complete conversation including function calls
print("\nDetailed conversation with function calls:")
messages_json = VinehooLLM.format_messages_to_json(response.messages)
print(messages_json)

# Convert JSON back to ChatMessage objects
restored_messages = VinehooLLM.parse_messages_from_json(messages_json)
print("\nRestored messages:")
for msg in restored_messages:
    print(f"{msg.role}: {msg.content or '[No content]'}")
    if msg.tool_calls:
        print(f"Tool calls: {len(msg.tool_calls)}")
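
Under the hood, the automatic function execution shown above amounts to parsing the model's JSON-encoded arguments and routing them to the registered handler. A minimal stand-alone sketch of that dispatch step, using only the standard library (the `dispatch_tool_call` helper and the hard-coded payload are illustrative, not part of the VinehooLLM API):

```python
import json

def calculator(a: int, b: int, operation: str) -> float:
    ops = {
        "add": lambda: a + b,
        "subtract": lambda: a - b,
        "multiply": lambda: a * b,
        "divide": lambda: a / b,
    }
    return ops[operation]()

tool_handlers = {"calculator": calculator}

def dispatch_tool_call(name, arguments_json, handlers):
    # Models return tool arguments as a JSON string, not a dict,
    # so they must be decoded before calling the handler.
    args = json.loads(arguments_json)
    return handlers[name](**args)

result = dispatch_tool_call(
    "calculator", '{"a": 5, "b": 3, "operation": "add"}', tool_handlers
)
print(result)  # 8
```

The handler's result is then sent back to the model as a `tool`-role message, which is why the restored conversation above can contain messages with tool calls but no text content.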


