
⚡ agenticli

Expose tools as lightweight CLI commands for LLM agents 🔧✨



✨ What is this?

llmcli converts functions, classes, and schema-based tools into a stable CLI semantic layer. Instead of flooding prompts with large schemas, LLMs just output a single command string.

💡 Philosophy: bash is everything. The command string is the most stable, constrained, and observable intermediate representation between LLMs and tool systems.

🎯 When to use llmcli?

| Scenario | llmcli helps? |
|----------|---------------|
| You have many tools/functions and need a unified interface for LLMs | ✅ |
| You don't want massive schemas injected into prompts | ✅ |
| You want models to see minimal hints, expanding via `--help` | ✅ |
| You want validation, help, errors, and lifecycle handling in one place | ✅ |

🚀 Quick Start

pip install agenticli
from typing import Annotated
from llmcli import CommandRegistry, Option, command, command_group

@command_group(name="calc", description="Calculator commands")
class Calc:
    @command(name="add", description="Add numbers")
    def add(self,
        values: Annotated[list[float], Option(positional=True, value_name="n")]
    ) -> dict:
        return {"result": sum(values)}

registry = CommandRegistry()
registry.register(Calc)

# LLM sees this minimal prompt:
print(registry.get_llm_prompt())
# -> You can use the following CLI commands:
#     calc: Calculator commands

# Execute:
registry.parse_and_execute("calc add 10 20 30")
# -> {"result": 60.0}

๐Ÿ—๏ธ Core Architecture

┌────────────────────────────────────────────────────────────┐
│                         LLM Output                         │
│                    "calc add 10 20 30"                     │
└─────────────────────────────┬──────────────────────────────┘
                              │
                              ▼
┌────────────────────────────────────────────────────────────┐
│                      CommandRegistry                       │
│  ┌──────────┐   ┌──────────┐   ┌──────────┐   ┌─────────┐  │
│  │  Parse   │──▶│ Validate │──▶│ Execute  │──▶│ Result  │  │
│  └──────────┘   └──────────┘   └──────────┘   └─────────┘  │
│                                                            │
│  • Command matching          • Lifecycle callbacks         │
│  • Help generation           • Errors with suggestions     │
│  • Argument injection        • Chain execution (&&, ||, ;) │
└─────────────────────────────┬──────────────────────────────┘
                              │
                              ▼
┌────────────────────────────────────────────────────────────┐
│                   Your Functions / Tools                   │
└────────────────────────────────────────────────────────────┘
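The pipeline above can be sketched in plain Python. This is a simplified stand-in, not llmcli's actual implementation: `shlex` tokenizes the command string, a dict plays the role of the registry, and the handler coerces arguments before execution.

```python
import shlex

# Toy registry mapping (group, subcommand) to a handler -- a simplified
# stand-in for CommandRegistry's parse -> validate -> execute pipeline.
HANDLERS = {
    ("calc", "add"): lambda args: {"result": sum(float(a) for a in args)},
}

def parse_and_execute(command_str: str) -> dict:
    tokens = shlex.split(command_str)              # Parse
    group, sub, *args = tokens
    handler = HANDLERS.get((group, sub))
    if handler is None:                            # Validate
        raise KeyError(f"unknown command: {group} {sub}")
    return handler(args)                           # Execute -> Result

print(parse_and_execute("calc add 10 20 30"))      # {'result': 60.0}
```

The key property is that the LLM-facing surface is just a string; everything after the string is ordinary, testable Python.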

📋 Key Features

| Feature | Description |
|---------|-------------|
| 🔌 Multiple Registrations | Decorators, class inheritance, dataclasses, Pydantic models, schema wrapping |
| ⚡ CLI Parsing | Positional args, `--option value`, `-o value`, `--opt=val`, flags |
| 📖 Smart Help | Auto-generated usage, help text, LLM prompts |
| ✅ Validation | Type coercion, requires/excludes, enums |
| 💡 Suggestions | "Did you mean X?" for unknown commands/options/enum values |
| 🔄 Lifecycle Hooks | `before_execute`, `after_execute`, `on_error` |
| 🏃 Internal Injection | Hide callbacks/state from the CLI; inject at runtime |
| 🔗 Chain Execution | `cmd1 && cmd2`, `cmd1 \|\| cmd2`, `cmd1 ; cmd2` |
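The "Did you mean X?" suggestions can be approximated with the standard library's `difflib`. This is an illustrative sketch, not llmcli's actual matching code:

```python
import difflib

# Hypothetical set of registered command names.
KNOWN_COMMANDS = ["calc add", "db create", "db drop", "ls"]

def suggest(unknown: str) -> list[str]:
    # Fuzzy-match the unknown command against registered names;
    # cutoff filters out matches that are too dissimilar.
    return difflib.get_close_matches(unknown, KNOWN_COMMANDS, n=1, cutoff=0.6)

print(suggest("db creat"))  # ['db create']
```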

๐Ÿ“ Registration Patterns

1๏ธโƒฃ Decorator (Most Common)

from typing import Annotated
from llmcli import command, Option

@command(name="ls", description="List directory")
def ls(
    path: Annotated[str, Option(short='p', description="Directory path")],
    verbose: Annotated[bool, Option(short='v')] = False,
) -> list[str]:
    import os
    return os.listdir(path)
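This decorator pattern works because `Annotated` metadata survives at runtime: a registry can read each parameter's `Option` via `typing.get_type_hints(..., include_extras=True)`. A minimal sketch with a stand-in `Option` class (not llmcli's real one):

```python
from dataclasses import dataclass
from typing import Annotated, get_type_hints

@dataclass
class Option:  # stand-in for llmcli's Option
    short: str = ""
    description: str = ""

def ls(path: Annotated[str, Option(short="p", description="Directory path")],
       verbose: Annotated[bool, Option(short="v")] = False) -> list:
    ...

# include_extras=True keeps the Annotated metadata instead of stripping it.
hints = get_type_hints(ls, include_extras=True)
opt = hints["path"].__metadata__[0]
print(opt.short, opt.description)  # p Directory path
```

This is the standard mechanism that lets a library build a CLI surface from plain type hints without a separate schema file.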

2๏ธโƒฃ Command Group

from llmcli import command_group, command

@command_group(name="db", description="Database operations")
class Database:
    @command(description="Create database")
    def create(self, name: str) -> None: ...

    @command(description="Drop database")
    def drop(self, name: str) -> None: ...

3๏ธโƒฃ Wrap Existing Tools

from llmcli import wrap_tool

class MyTool:
    name = "my_tool"
    description = "Does something"
    parameters = {"type": "object", "properties": {"x": {"type": "integer"}}}
    async def execute(self, **kwargs): return kwargs

registry.register_spec(wrap_tool(MyTool()))
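Conceptually, wrapping a schema-based tool means translating its JSON-schema parameters into CLI options. A hedged sketch of that translation using `argparse` (not `wrap_tool`'s real logic):

```python
import argparse

# JSON-schema-style parameters, as in the MyTool example above.
parameters = {"type": "object", "properties": {"x": {"type": "integer"}}}

# Map JSON Schema types to Python coercion functions.
TYPE_MAP = {"integer": int, "number": float, "string": str}

def build_parser(name: str, params: dict) -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog=name)
    for prop, spec in params["properties"].items():
        # Each schema property becomes a --option with a coerced type.
        parser.add_argument(f"--{prop}", type=TYPE_MAP[spec["type"]])
    return parser

args = build_parser("my_tool", parameters).parse_args(["--x", "42"])
print(vars(args))  # {'x': 42}
```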

4๏ธโƒฃ Class Inheritance

from llmcli import CliCommand, CommandRegistry

class AddCommand(CliCommand):
    name = "add"
    description = "Add numbers"
    args_model = AddArgs  # argument model defined elsewhere (e.g. a model with a "values" field)

    async def run(self, **kwargs) -> dict:
        return {"result": sum(kwargs["values"])}

registry.register(AddCommand())

🤖 LLM Integration

Minimal Tool Schema

Expose only one exec tool to the LLM:

from llmcli import ExecTool

exec_tool = ExecTool(callback=registry.parse_and_execute)
# Tool schema: {name: "exec", params: {command: string, timeout?: int}}
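With a single exec tool, the schema sent to the model stays constant no matter how many commands are registered. A sketch of what such a function-tool definition might look like (a hypothetical OpenAI-style shape; the exact schema llmcli emits may differ):

```python
# Hypothetical function-tool schema for the single "exec" entry point.
exec_tool_schema = {
    "name": "exec",
    "description": "Execute a CLI command string",
    "parameters": {
        "type": "object",
        "properties": {
            "command": {"type": "string", "description": "e.g. 'calc add 10 20'"},
            "timeout": {"type": "integer", "description": "optional timeout in seconds"},
        },
        "required": ["command"],
    },
}
print(sorted(exec_tool_schema["parameters"]["properties"]))  # ['command', 'timeout']
```

Prompt size is now independent of tool count: new commands show up via `--help`, not via a bigger schema.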

Lifecycle Callbacks

from llmcli import ExecutionCallbacks

def on_error(ctx):
    print(f"Error: {ctx.error.code} - {ctx.error.message}")

registry = CommandRegistry(
    callbacks=ExecutionCallbacks(on_error=on_error)
)

Internal Parameter Injection

from typing import Annotated
from llmcli import Callback, State, command

@command(name="process")
def process(
    data: list[str],
    cache: Annotated[object, State(factory=lambda ctx: load_cache())] = None,
):
    # cache is injected automatically, hidden from CLI
    return cached_transform(data, cache)

📦 Stable API

# Core
CommandRegistry
CommandRegistry.register(target)
CommandRegistry.register_spec(spec)
CommandRegistry.execute(command_str)
CommandRegistry.parse_and_execute(command_str)
CommandRegistry.render_help(command)
CommandRegistry.get_llm_prompt(detailed=False)

# Decorators
command(name=None, description="", aliases=None, hidden=False, deprecated=None)
command_group(name, description)

# Helpers
CliCommand
wrap_tool(tool)
command_from_model(name, model, handler)
command_from_method(name, target, method_name)
Option
Injected / Callback / State
ExecutionCallbacks
ExecTool
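Chain execution (`&&`, `||`, `;`) can be sketched by splitting the command line on shell-style connectors and applying their short-circuit rules. A simplified stand-in; llmcli's real chain handling may differ:

```python
import re

def run(cmd: str) -> bool:
    # Toy runner: succeeds unless the command is literally "fail".
    print("running:", cmd)
    return cmd != "fail"

def execute_chain(line: str) -> list[str]:
    # Split into commands with captured connectors: &&, ||, ;
    tokens = re.split(r"\s*(&&|\|\||;)\s*", line)
    executed = [tokens[0]]
    ok = run(tokens[0])
    for connector, cmd in zip(tokens[1::2], tokens[2::2]):
        # && runs only after success, || only after failure, ; always.
        if connector == ";" or (connector == "&&" and ok) or (connector == "||" and not ok):
            ok = run(cmd)
            executed.append(cmd)
    return executed

print(execute_chain("calc add 1 2 && fail || echo recovered"))
```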

💡 Examples

See example/demo.py for a complete calc system with OpenAI/Anthropic integration:

pip install "agenticli[examples]"
python -m example.demo --provider openai
python -m example.demo --provider anthropic

📚 Documentation

| Language | Link |
|----------|------|
| 🇺🇸 English | docs/index_en.md |
| 🇨🇳 中文 | docs/index_zh.md |

📄 License

MIT
