
Convert any LangChain Chat Model into a Tool Calling LLM

Project description

Tool Calling LLM

Tool Calling LLM is a Python mixin that lets you effortlessly add tool calling capabilities to LangChain Chat Models that don't yet support tool/function calling natively. Simply create a new chat model class with ToolCallingLLM and your favorite chat model to get started.

With ToolCallingLLM you also get access to the following functions:

  1. .bind_tools() lets you bind tool definitions to an LLM.
  2. .with_structured_output() lets you return structured data from your model. This is provided by LangChain's BaseChatModel class (a short example follows the usage example below).

At this time, ToolCallingLLM has been tested to work with ChatOllama, ChatNVIDIA, and ChatLiteLLM with the Ollama provider.
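
For instance, the same mixin pattern can wrap ChatLiteLLM pointed at Ollama. The sketch below is illustrative only: the LiteLLMWithTools class name and model string are not part of the package, and the full pattern is explained under Usage.

from tool_calling_llm import ToolCallingLLM
from langchain_community.chat_models import ChatLiteLLM


class LiteLLMWithTools(ToolCallingLLM, ChatLiteLLM):
    @property
    def _llm_type(self):
        return "litellm_with_tools"


# Route requests through LiteLLM to a locally served Ollama model.
llm = LiteLLMWithTools(model="ollama/llama3.1")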

OllamaFunctions was the original inspiration for this effort. The code for ToolCallingLLM was abstracted out of OllamaFunctions so that it can be reused with other non-tool-calling Chat Models.

Installation

pip install --upgrade tool_calling_llm

Usage

Creating a Tool Calling LLM is as simple as creating a new subclass of the original ChatModel you wish to add tool calling features to.

The sample code below demonstrates how you might enhance the ChatOllama chat model from the langchain-ollama package with tool calling capabilities.

from tool_calling_llm import ToolCallingLLM
from langchain_ollama import ChatOllama
from langchain_community.tools import DuckDuckGoSearchRun


class OllamaWithTools(ToolCallingLLM, ChatOllama):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)

    @property
    def _llm_type(self):
        return "ollama_with_tools"


llm = OllamaWithTools(model="llama3.1", format="json")
tools = [DuckDuckGoSearchRun()]
llm_tools = llm.bind_tools(tools=tools)

llm_tools.invoke("Who won the silver medal in shooting in the Paris Olympics in 2024?")

This yields output as follows:

AIMessage(content='', id='run-9c3c7a78-97af-4d06-835e-aa81174fd7e8-0', tool_calls=[{'name': 'duckduckgo_search', 'args': {'query': 'Paris Olympics 2024 shooting silver medal winner'}, 'id': 'call_67b06088e208482497f6f8314e0f1a0e', 'type': 'tool_call'}])
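
Because .with_structured_output() is also available, the same OllamaWithTools model can return data conforming to a schema. The sketch below is a minimal illustration; the MedalResult schema is hypothetical, and the quality of the parsed output depends on the underlying model.

from pydantic import BaseModel, Field


class MedalResult(BaseModel):
    """Hypothetical schema describing a single medalist."""
    medalist: str = Field(description="Name of the athlete")
    country: str = Field(description="Country the athlete represents")


structured_llm = llm.with_structured_output(MedalResult)
structured_llm.invoke("Who won the silver medal in shooting in the Paris Olympics in 2024?")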

For more comprehensive examples, refer to the ToolCallingLLM-Tutorial.ipynb Jupyter notebook.

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

tool_calling_llm-0.1.2.tar.gz (6.7 kB, Source)

Built Distribution

tool_calling_llm-0.1.2-py3-none-any.whl (7.5 kB, Python 3)

File details

Details for the file tool_calling_llm-0.1.2.tar.gz.

File metadata

  • Download URL: tool_calling_llm-0.1.2.tar.gz
  • Upload date:
  • Size: 6.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.9.6

File hashes

Hashes for tool_calling_llm-0.1.2.tar.gz

  • SHA256: b558d0229b6cee840ca3b123673067c2c5fa9323590ed1dcbb5e2c06432c5d8a
  • MD5: cf55aaf5b7dce14c802a0d901ff3f509
  • BLAKE2b-256: 2abe0e7d3f4d75c49cfe672f408b5ca8e0bb24a4415570cd14c46e968958b5fb

See more details on using hashes here.
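
As a quick check, a downloaded archive can be compared against the SHA256 digest above with Python's hashlib; the sketch assumes the file was saved in the current directory.

import hashlib

EXPECTED_SHA256 = "b558d0229b6cee840ca3b123673067c2c5fa9323590ed1dcbb5e2c06432c5d8a"

with open("tool_calling_llm-0.1.2.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

# Prints True when the download matches the published hash.
print(digest == EXPECTED_SHA256)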

File details

Details for the file tool_calling_llm-0.1.2-py3-none-any.whl.

File metadata

File hashes

Hashes for tool_calling_llm-0.1.2-py3-none-any.whl

  • SHA256: bc0aa3a1f9f522a0ca9d131e8241be6861233d5ebfb649a04ccf2a00f2ade50c
  • MD5: 39e507b2d862eed7a62c4b432488a008
  • BLAKE2b-256: 75bb21d50be4cb02e64e2700bc2dbe26a0646bf74534f0b11c858d5bbc2fefad

See more details on using hashes here.
