
A library for augmenting large language models

Project description

augllm

augllm is a wrapper library for working with Augmented Large Language Models (LLMs) through Ollama.
It provides an interface for invoking external tools via Function Calling.
Note that the actual tool implementations are not included; users are expected to plug in their own external implementations as needed.

Repository: https://github.com/ToPo-ToPo-ToPo/augllm


Table of Contents

  1. Features / Overview
  2. Requirements
  3. Installation
  4. Usage
    • Sample Programs
    • Integration with Function Calling
  5. License

1. Features / Overview

  • Interact with LLMs (either local or cloud-based) through Ollama
  • Support for tool integration using Function Calling
  • Tools are defined as abstract interfaces; concrete implementations (e.g., API calls, local script execution) can be freely developed by the user
  • Designed with extensibility in mind: easy integration with custom tools, chaining, and prompt engineering
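The library's actual tool abstraction is not documented here, so the following is only a minimal sketch of what an abstract tool interface of this kind could look like. The class names (`Tool`, `WeatherTool`) and attributes are hypothetical, not augllm's real API:

```python
from abc import ABC, abstractmethod


class Tool(ABC):
    """Hypothetical abstract tool interface (not augllm's actual base class)."""

    name: str         # identifier the model uses in a function call
    description: str  # short description exposed to the model

    @abstractmethod
    def run(self, **kwargs) -> str:
        """Execute the tool with model-supplied arguments and return a result string."""
        ...


class WeatherTool(Tool):
    """Example concrete implementation supplied by the user."""

    name = "get_weather"
    description = "Return a short weather report for a city."

    def run(self, city: str) -> str:
        # Stand-in for a real API call or local script execution
        return f"Sunny in {city}"


print(WeatherTool().run(city="Tokyo"))
```

Concrete subclasses like `WeatherTool` are where user code does the real work (HTTP requests, shell commands, database queries), keeping the LLM-facing interface uniform.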

2. Requirements

  • Python 3.11 or higher
  • An environment where the Ollama CLI or API client is available

3. Installation

  1. Create and activate a virtual environment

python -m venv env

On macOS or Linux, activate the virtual environment:

source env/bin/activate

On Windows, use env\Scripts\activate instead.

  2. Install the library

pip install augllm

4. Usage

Sample Programs

The repository includes a test/ directory.
Refer to the two files inside as usage examples.

Integration with Function Calling

  1. Provide function signatures in the prompt that represent expected tool calls
  2. Receive the function call request returned by the model (tool name + arguments)
  3. Invoke the corresponding tool interface’s run(...) method and obtain the result
  4. Pass the result back to the model to obtain the final response
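The four steps above can be sketched as a simple dispatch loop. Everything here is illustrative: the tool registry, the shape of the tool-call dictionary, and the role names are assumptions for the sketch, not augllm's actual API (the model response is simulated rather than fetched from Ollama):

```python
# Step 1 (implicit): tools the model was told about, keyed by name.
tools = {"get_weather": lambda city: f"Sunny in {city}"}

# Step 2: a function-call request as the model might return it
# (simulated here; a real response would come from the Ollama client).
tool_call = {"name": "get_weather", "arguments": {"city": "Tokyo"}}

# Step 3: look up the requested tool and invoke it with the
# model-supplied arguments.
result = tools[tool_call["name"]](**tool_call["arguments"])

# Step 4: package the result as a message to send back to the model
# so it can produce the final response.
followup = {"role": "tool", "content": result}
print(followup)
```

In a real run, step 4 would append `followup` to the conversation history and call the model again to obtain the final answer.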

5. License

This project is licensed under the Apache-2.0 License.
See the LICENSE file for details.
