A human-readable tar format for text files, based on grep . -r output

Project description

human-tar

@readwithai - X - blog - machine-aided reading - 📖⚡️🖋️

A human-readable tar format for text files. Easy for AIs to write and read.

Motivation

I've been doing a little vibe-coding with online LLMs and occasionally find myself producing a number of files to share. While some online LLMs can produce tar files, that process is often buggy and slow compared to asking for output in this format directly (using prompts).

Alternatives and prior work

This format is based on the output of grep . -r, which (with the appropriate ignore flags) can be used to produce output to give to an LLM.

You could use Cursor, Windsurf, or another AI coding tool to avoid the need for this sort of tool. There are various tools to wrap up a codebase ready to be sent to an AI, but not necessarily for the other direction.

Installation

Install human-tar from PyPI using pipx

pipx install human-tar

Usage

human-tar outputs the current git repository in this format.

human-untar unpacks the output in the form of grep . -r into the original file structure. It reads input from a file or stdin and writes files to the current directory by default.

Examples

As a demonstration, this command produces human-tar input using grep and feeds it into human-untar.

grep . -r /path/to/dir | human-untar

You can also provide a path to a file in the current directory, or read from the clipboard using xclip on Linux or pbpaste on macOS:

human-untar file.txt
human-untar <(xclip -o -selection CLIPBOARD)

For testing purposes, you might want to write the output into a different directory using the -o option:

human-untar file.txt -o file

Input Format

The input should be in the format of grep . -r output, e.g.:

src/main.c:int main() {
src/main.c:    printf("Hello, world!\n");
src/utils/helper.c:void help() {

This will create (in the current directory by default):

./
├── src/
│   ├── main.c
│   └── utils/
│       └── helper.c
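
To make the format concrete, here is a minimal POSIX shell sketch that unpacks lines like the above by splitting each one at the first colon (an illustration of the format only, not human-untar's actual implementation):

```shell
# Unpack grep-style "path:content" lines into files (illustration only).
cd "$(mktemp -d)"
printf '%s\n' \
  'src/main.c:int main() {' \
  'src/main.c:    printf("Hello, world!\n");' \
  'src/utils/helper.c:void help() {' |
while IFS= read -r line; do
  path=${line%%:*}       # text before the first colon is the path
  content=${line#*:}     # everything after it is the line's content
  mkdir -p "$(dirname "$path")"
  printf '%s\n' "$content" >> "$path"
done
```

Consecutive lines with the same path are appended in order, which is why line order in the input matters.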

Options

  • -o, --output-dir: Specify the output directory (default: current directory).
  • Example: human-untar -o my_output_dir grep_output.txt

About me

I am @readwithai. I create tools for reading, research, and agency, sometimes using the markdown editor Obsidian.

I also create a stream of tools like this related to carrying out my work. As users of this tool are likely interested in AI, you might like to read my blog about tools for reading with AI.

I write about lots of things, including tools like this, on X. My blog is more about reading, research, and agency.
