General automation driver

Project description

ZAMM

This is an informal automation tool: you show GPT how to do something once, and then have it do that kind of task for you afterwards. It is good for boring but straightforward tasks that you haven't gotten around to automating with a proper script.

We are entering a time when our target audiences may include machines as well as humans. As such, this tool generates tutorials that you can edit so that they are pleasant for both humans and LLMs to read.

This is an experimental tool, and has only been run on WSL Ubuntu so far. It seems to work ok on the specific examples below. YMMV. Please feel free to add issues or PRs.

Quickstart

pipx is recommended over pip for installation because it lets you run this tool with a different version of langchain than the one you may already have installed:

pipx install zamm
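If you don't already have pipx, it can be bootstrapped with pip first. A minimal sketch, assuming a user-level Python 3 install (see the pipx docs for other setups):

# Bootstrap pipx itself, then make sure its bin directory is on your PATH
python3 -m pip install --user pipx
python3 -m pipx ensurepath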

Teach GPT to do something:

zamm teach

You will be roleplaying the LLM. The results of your interaction will be output as a Markdown tutorial file, which you can then edit to be more human-readable. See this example of teaching the LLM how to create a "Hello world" script.

Afterwards, you can tell the LLM to do a slightly different task using that same tutorial:

zamm execute --task 'Write a script goodbye.sh that prints out "Goodbye world". Execute it.' --documentation zamm/resources/tutorials/hello.md

This results in this example transcript of LLM interactions. Note that GPT successfully generalizes from the tutorial to code in a completely different language based just on the difference in filenames. Imagine having to manually add that feature to a script!
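For reference, the goodbye.sh the task asks for is essentially the obvious one. A sketch of the expected result (not copied from the transcript):

#!/bin/sh
# goodbye.sh -- the script the task asks the LLM to write and then execute
echo "Goodbye world"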

Using internal tutorials

Select any of the prepackaged tutorials as documentation by prefacing their filename with @internal. The .md extension is optional.

For example:

zamm execute --task 'Protect the `main` branch' --documentation @internal/branch-protection

to protect the main branch of the project in the current directory on GitHub. (Note that this tutorial was written with ZAMM-built projects in mind, so YMMV when using it on custom projects.)
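Because the .md extension is optional, the command above is equivalent to spelling the filename out in full:

zamm execute --task 'Protect the `main` branch' --documentation @internal/branch-protection.md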

Sessions

Sessions are recorded in case of a crash, or in case you want to change something up partway through. On Linux, sessions are saved to ~/.local/share/zamm/sessions/. To continue from the most recent session, run

zamm teach --last-session
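To see what has been recorded so far, you can simply list that directory (assuming the default Linux location mentioned above):

# List saved sessions, most recently modified first
ls -lt ~/.local/share/zamm/sessions/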

Free-styling

You can also simply tell the LLM to do something without teaching it to do so beforehand. However, this is a lot more brittle. An example of a free-style command that works:

zamm execute --task 'Write a script hello.py that prints out "Hello world". Execute it.'

The resulting transcript can be found here.
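Under the hood, the LLM ends up driving the terminal much as a human would for this task. A hedged sketch of the kind of commands involved, not the literal transcript:

# Write the requested script...
cat > hello.py << 'EOF'
print("Hello world")
EOF

# ...then execute it
python3 hello.py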

Prompting

When a step is failing and you want to iterate quickly by repeatedly testing a single prompt, you can do so with the prompt command. First, write your prompt out to a file on disk. Then run this command:

zamm prompt --stop '\n' --raw <path-to-prompt>
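For example, with the prompt saved to prompt.txt (a hypothetical filename; the --stop and --raw flags are taken verbatim from the usage above):

# Save the prompt under test to disk, then replay it against the model
echo 'Write a one-line shell command that prints "Hello world".' > prompt.txt
zamm prompt --stop '\n' --raw prompt.txt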

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

zamm-0.0.5.tar.gz (60.1 kB)

Built Distribution

zamm-0.0.5-py3-none-any.whl (89.7 kB)

File details

Details for the file zamm-0.0.5.tar.gz.

File metadata

  • Download URL: zamm-0.0.5.tar.gz
  • Upload date:
  • Size: 60.1 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.1 CPython/3.10.9 Linux/5.10.74.3-microsoft-standard-WSL2

File hashes

Hashes for zamm-0.0.5.tar.gz

  • SHA256: c0fc8d7804199ddc4a48e0a863170918302b362cc865c5701f5b91115f0b9cdd
  • MD5: 1b4c374f49ece3a2d5c12d93b046b805
  • BLAKE2b-256: 42ace0cd95d6ee73ba94de6bd9a906142e4fd1ae5ccd3241cc958f59f97ccf6b

See more details on using hashes here.

File details

Details for the file zamm-0.0.5-py3-none-any.whl.

File metadata

  • Download URL: zamm-0.0.5-py3-none-any.whl
  • Upload date:
  • Size: 89.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.3.1 CPython/3.10.9 Linux/5.10.74.3-microsoft-standard-WSL2

File hashes

Hashes for zamm-0.0.5-py3-none-any.whl

  • SHA256: bedecfc93849ea08f5c47cee1ea5d8f03429e27c9d57e9223d482056d3cbc0b2
  • MD5: afdd224b868c1924d96b97c0240f6488
  • BLAKE2b-256: aeed46eb084b0bbfce68de5907b403c0d24d6cdad601059d49f79200e6ad7ba8

See more details on using hashes here.
