
A Python library that simplifies and unifies tool definition and calling for large language models (LLMs). Compatible with Ollama, LangChain, OpenAI, and other frameworks.

Project description

llm-tool-fusion





📖 Description

llm-tool-fusion is a Python library that simplifies and unifies tool definition and calling for Large Language Models (LLMs). Compatible with popular frameworks that support tool calling, such as Ollama, LangChain, and OpenAI, it allows easy integration of new functions and modules, making advanced AI application development more agile and modular.

✨ Key Features

  • 🔧 API Unification: Single interface for different LLM frameworks
  • 🚀 Simplified Integration: Add new tools with ease
  • 🔗 Wide Compatibility: Support for Ollama, LangChain, OpenAI, and others
  • 📦 Modularity: Modular architecture for scalable development
  • ⚡ Performance: Optimized for production applications

🚀 Installation

pip install llm-tool-fusion

📋 Basic Usage (Example with OpenAI)

from llm_tool_fusion import ToolManager
from openai import OpenAI

client = OpenAI()

# Initialize the tool manager
manager = ToolManager()

# Add your custom tools
@manager.tool
def multiply(number1: int, number2: int) -> int:
    """
    Multiplies two numbers
    Args:
        number1: int
        number2: int
    Returns:
        int
    """

    return number1 * number2

response = client.responses.create(
    model="gpt-4.1",
    input=[{"role": "user", "content": "what is 25 * 557 ?"}],
    tools=manager.get_tools()
)

print(response.output)

# Use with your preferred framework
# Example with OpenAI, LangChain, Ollama, etc.
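The example above stops at printing the model output; when the model decides to call a tool, the returned call still has to be routed to the decorated function. Whether llm-tool-fusion ships its own dispatch helper is not shown in this README, so the sketch below wires that step up by hand. The `registry` dict and the `tool_call` payload are illustrative assumptions, shaped after the OpenAI Responses API's function-call format (a tool name plus JSON-encoded arguments):

```python
import json

# Stand-in for a function registered with @manager.tool; in a real
# program you would reuse the decorated functions themselves.
def multiply(number1: int, number2: int) -> int:
    return number1 * number2

# Hypothetical local registry mapping tool names to callables.
registry = {"multiply": multiply}

# Illustrative function-call payload: a tool name plus JSON-encoded
# arguments, as returned by the model.
tool_call = {"name": "multiply", "arguments": '{"number1": 25, "number2": 557}'}

# Look up the callable by name and invoke it with the decoded arguments.
func = registry[tool_call["name"]]
result = func(**json.loads(tool_call["arguments"]))
print(result)  # 13925
```

In a full loop you would append this result back into the conversation as a tool-result message so the model can produce its final answer.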

🔧 Supported Frameworks

  • OpenAI - Official API and GPT models
  • LangChain - Complete framework for LLM applications
  • Ollama - Local model execution
  • Anthropic Claude - Anthropic's API
  • And many more...
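These frameworks converge on an OpenAI-style function-tool schema, which is presumably what get_tools() produces for each decorated function; the exact output format is not documented in this README, so treat the shape below as an assumption. A hand-written schema for the multiply tool, in the flat form the OpenAI Responses API accepts, might look like:

```python
import json

# Hand-written example of an OpenAI Responses-style function tool
# schema. Whether get_tools() emits exactly this shape is an
# assumption, not documented behavior of llm-tool-fusion.
multiply_schema = {
    "type": "function",
    "name": "multiply",
    "description": "Multiply two numbers.",
    "parameters": {
        "type": "object",
        "properties": {
            "number1": {"type": "integer"},
            "number2": {"type": "integer"},
        },
        "required": ["number1", "number2"],
    },
}

print(json.dumps(multiply_schema, indent=2))
```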

🤝 Contributing

Contributions are welcome! Please:

  1. Fork the project
  2. Create a feature branch (git checkout -b feature/AmazingFeature)
  3. Commit your changes (git commit -m 'Add some AmazingFeature')
  4. Push to the branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.


🛠️ Development

Prerequisites

  • Python >= 3.12
  • pip or poetry for dependency management

Setup Development Environment

# Clone the repository
git clone https://github.com/caua1503/llm-tool-fusion.git
cd llm-tool-fusion

# Install dependencies
pip install -e .

# Run tests
python -m pytest

Project Structure

llm-tool-fusion/
├── llm_tool_fusion/
│   ├── __init__.py
│   ├── _core.py
│   └── _utils.py
├── tests/
├── examples/
├── pyproject.toml
└── README.md


⭐ If this project was helpful to you, consider starring it on GitHub!



Download files


Source Distribution

llm_tool_fusion-0.1.0.tar.gz (8.4 kB)

Built Distribution


llm_tool_fusion-0.1.0-py3-none-any.whl (9.2 kB)

File details

Details for the file llm_tool_fusion-0.1.0.tar.gz.

File metadata

  • Download URL: llm_tool_fusion-0.1.0.tar.gz
  • Size: 8.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.7

File hashes

Hashes for llm_tool_fusion-0.1.0.tar.gz:

  • SHA256: 84fecd53deab90373328d83a03f684428bcb788871c556dbedc8c42e0c81bfab
  • MD5: b55c2738442294a51c413c39870d0052
  • BLAKE2b-256: 04e341ec5672844add43bc629d3181e4d0e1e9ec127fd149a5db763345402548


File details

Details for the file llm_tool_fusion-0.1.0-py3-none-any.whl.

File metadata

File hashes

Hashes for llm_tool_fusion-0.1.0-py3-none-any.whl:

  • SHA256: 3dbe2c46267d8fbf1dfef133a6546441f8cbf5b52945587b6700a5492da378fe
  • MD5: 688725501a7df35de50b6f42ff32e8ba
  • BLAKE2b-256: 245358a0a3714ac2b0b88577d25003e080c64872652c4771bc1da2215bd00284

