
Local-first CLI agent for generating small scripts using local LLMs


scripy

Generate small, single-file scripts using locally hosted LLMs. One command, no cloud.

scripy -p "rename all my jpegs by date taken"

scripy runs an agentic loop — generate, validate, self-correct, write — entirely on your machine via Ollama or LM Studio.
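
The loop is, in outline, something like the following sketch. This is a hypothetical illustration, not scripy's actual internals; `generate` stands in for the model call and `run_sandboxed` for the sandbox step:

```python
import ast

def agentic_loop(generate, run_sandboxed, max_iterations=3):
    """Sketch of a generate -> validate -> self-correct loop.

    generate(feedback) stands in for the model call; run_sandboxed(src)
    returns (ok, output) from executing the script in a sandbox.
    """
    feedback = None
    script = ""
    for _ in range(max_iterations):
        script = generate(feedback)
        try:
            ast.parse(script)            # cheap syntax check before running
        except SyntaxError as e:
            feedback = f"syntax error: {e}"
            continue
        ok, output = run_sandboxed(script)
        if ok:
            return script                # validated: ready to write to disk
        feedback = f"runtime error: {output}"
    return script                        # best effort after max_iterations
```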


Install

Requires Python 3.11+ and a running Ollama or LM Studio instance.

pipx install .

Or for development:

pip install -e .

Quick start

Note: This assumes you're running Ollama on the same machine with qwen2.5-coder:7b installed. See Configuration below for more detailed configuration steps.

# Generate a Python script
scripy -p "find duplicate files in a directory"

# Specify output file
scripy -p "find duplicate files in a directory" -o dedup.py

# Generate Bash
scripy -p "backup my home directory to /tmp" --lang bash

# Modify an existing script
scripy -p "add a --dry-run flag" --input dedup.py

# Skip all confirmation prompts (for scripting)
scripy -p "..." -y

Confirmation gates

scripy prompts before any side-effectful action.

Before sandboxed execution:

  ? run script to validate? [y/n/e/v/a] ›
Key  Action
y    Run it
n    Skip — model continues without sandbox feedback
e    Open in $EDITOR before running
v    Print script to terminal, then re-prompt
a    Always yes — skip gate for all remaining iterations
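
The sandboxed run behind the `y` answer might look roughly like this. This is a hedged sketch using stdlib `subprocess` with a hard timeout (matching the `sandbox_timeout` config below); scripy's actual sandbox may work differently:

```python
import os
import subprocess
import sys
import tempfile

def run_sandboxed(script: str, timeout: int = 10) -> tuple[bool, str]:
    """Write the script to a temp file and run it with a hard timeout."""
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(script)
        path = f.name
    try:
        proc = subprocess.run(
            [sys.executable, path],
            capture_output=True, text=True, timeout=timeout,
        )
        ok = proc.returncode == 0
        return ok, proc.stdout if ok else proc.stderr
    except subprocess.TimeoutExpired:
        return False, f"timed out after {timeout}s"
    finally:
        os.unlink(path)  # clean up the temp file either way
```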

Before writing to disk:

  ? write to disk as dedup.py? [y/n] ›

Use -y / --yes to bypass all gates non-interactively.


Configuration

Default config works with a local Ollama instance. Override via ~/.config/scripy/config.toml:

[model]
base_url    = "http://192.168.1.10:11434/v1"  # remote Ollama
model       = "llama3.1:8b"
api_key     = "ollama" # use "lm-studio" for LM Studio
temperature = 0.2
max_tokens  = 4096

[agent]
force_tools      = true
max_iterations   = 3
default_lang     = "python"
sandbox_timeout  = 10

CLI flags override config for a single run:

scripy --model qwen2.5-coder:7b -p "..."

Model recommendations

scripy targets models that run on small consumer hardware (~4–8GB RAM), though it will naturally do better on more powerful machines. With that in mind, here are some recommendations for small machines.

Model                Size    Tool calling  Code quality  Notes
llama3.1:8b          ~4.7GB  Native        Good          Recommended default
llama3.2:3b          ~2.0GB  Native        Fair          Best ultra-low-resource option
qwen2.5-coder:7b     ~4.4GB  Inline only   Excellent     Best code quality; see note below
deepseek-coder:6.7b  ~4.0GB  Inline only   Excellent    Same trade-off as qwen-coder

Tool calling — what this means in practice

scripy uses the OpenAI function-calling API to let the model invoke tools (write_file, run_script, read_file, list_directory). Models fall into two categories:
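
In the OpenAI function-calling API, each tool is declared as a JSON schema in the `tools` array of the request. A sketch of what a `write_file` declaration might look like (the exact schemas scripy sends are assumptions here):

```python
# One entry in the `tools` array sent with each chat completion request.
write_file_tool = {
    "type": "function",
    "function": {
        "name": "write_file",
        "description": "Write the finished script to disk.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "Output file path"},
                "content": {"type": "string", "description": "Full script source"},
            },
            "required": ["path", "content"],
        },
    },
}
```

Setting `tool_choice="required"` in the same request (the `--force-tools` behaviour) tells the server the model must answer with one of these tool calls rather than free text.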

Native tool calling (llama3.1, llama3.2) — the model returns structured tool_calls in the API response. The agentic loop runs cleanly; run_script validation and multi-turn self-correction work as intended.

Inline tool calling (qwen2.5-coder, deepseek-coder) — these models ignore the tool-calling API and instead append a JSON blob to their text response. scripy detects and handles this automatically, but the behaviour is less reliable: the self-correction loop may not fire, and you will see:

  ~ inline tool call detected — model is not using structured tool calling

--force-tools (tool_choice=required) is on by default. This works well with native tool-calling models and is generally the right choice. If you see errors or garbled output — which can happen with certain qwen-coder or deepseek-coder builds on older Ollama versions — disable it via config:

# ~/.config/scripy/config.toml
[agent]
force_tools = false

With force_tools off, scripy falls back to inline tool-call detection automatically. In my experience, LM Studio has given better results than Ollama for smaller models on modest hardware (think M2 Mac mini with 8GB RAM); with Ollama, the model sometimes kept appending tool calls to the script output.


CLI reference

Usage: scripy [OPTIONS]

  scripy — generate scripts with local LLMs.

Options:
  -p, --prompt TEXT   What script to generate.
  -o, --output TEXT   Output file path.
  -l, --lang TEXT     Language override (python, bash, etc.).
  --model TEXT        Model name override.
  --input TEXT        Existing script to modify.
  --tui               Launch Textual TUI.
  -y, --yes           Skip all confirmation gates.
  --force-tools       Override config and set tool_choice=required for this run.
  --version           Show version and exit.
  --help              Show this message and exit.

Current state

Phase 1 — skeleton & config

Phase 2 — headless agent

  • Generate → syntax validate → sandbox run → self-correct → write loop
  • Confirmation gates (run / write) with keyboard shortcuts
  • Inline tool call detection and fallback for models that don't use the tool-calling API
  • ~/.config/scripy/config.toml config with CLI overrides

Phase 3 — TUI

  • Textual TUI with live script preview, diff view on revision, and confirmation gates
  • Language picker (Ctrl+L), inline prompt compose, and refinement loop

Development

pip install -e ".[dev]"
pytest
