Automates logfile and self-describing output file generation; provides Make-like functionality to re-run a script.

Project description

argrecord

An extension to argparse to automate the generation of logfiles and self-describing output files, and to provide Make-like functionality to re-run a script based on those automatically generated logfiles.

Introduction

This library can be used in place of argparse. It provides additional functionality to create and read logfiles or datafile headers that document the command that created them, and to re-run that command.

Additional decorators such as Gooey can still be used.

It works with Python 3.

The source code can be found at argrecord.

Usage

Recording script arguments

Simply replace argparse with argrecord and the class ArgumentParser with ArgumentRecorder.
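
A minimal sketch of the swap (the script description and the --verbose argument are purely illustrative):

  import argrecord

  # Previously: parser = argparse.ArgumentParser(description='Example script')
  parser = argrecord.ArgumentRecorder(description='Example script')
  parser.add_argument('--verbose', action='store_true')
  args = parser.parse_args()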

The ArgumentRecorder class provides three new methods:

build_comments returns a multi-line string that represents the current script invocation; that is, the name of the script and the list of arguments to it. We call these comments because they are designed to be included as the header of an output file and treated as a comment by whichever program subsequently processes that file.

write_comments writes the comments to a file. The file can be specified either as a filename or as a file object of an output file. Additional arguments specify whether additional comments (for example from an input file) or comments in the already existing file should be appended to the comments generated from the argument parser, and whether an already existing file should be backed up by appending a suffix to its name.

Appending multiple sets of comments in a single logfile or output file allows the entire chain of commands that produced that file to be recorded.

replay_required returns True or False indicating whether the script needs to be re-run. This is calculated by determining whether any of the input files to the script are newer than any of the currently existing output files.

The method add_argument takes three additional arguments. input and output indicate whether the argument represents the name of a file that is an input or output of the script. private indicates that the argument should not be included in the comments.
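
Putting these together, a hedged sketch (the argument names are illustrative, and the exact method signatures are assumptions; this sketch assumes each method accepts the parsed argument namespace):

  import argrecord

  parser = argrecord.ArgumentRecorder()
  parser.add_argument('infile', input=True)      # names an input file of the script
  parser.add_argument('outfile', output=True)    # names an output file of the script
  parser.add_argument('--token', private=True)   # excluded from the comments
  args = parser.parse_args()

  if parser.replay_required(args):               # an input is newer than an existing output
      # build_comments(args) would return the header as a string;
      # write_comments writes it to the (hypothetical) output file
      parser.write_comments(args, args.outfile)
      # ... regenerate the contents of args.outfile below the header ...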

Replaying script arguments

Default behaviour

Run the script argreplay to re-run the commands that produced a logfile (or the initial section of an output file). The default behaviour is to read a series of recipes from a logfile, detecting the name of the script and the input and output file(s) of each. Once it has read all the recipes, it processes them in reverse order. When it finds a command that needs to be re-run (because one or more of its input files is newer than one or more of its output files), it re-runs that command, then proceeds to the previous recipe. When a command is re-run, it typically creates a new output file that is an input file for the previous recipe, so that recipe will in turn need to be re-run. The process continues until all the recipes have been processed, or a command returns an error.
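
In outline, the replay loop behaves something like the following sketch (illustrative only, not argreplay's actual source; the Recipe type is invented for the example):

  import os
  import subprocess
  from collections import namedtuple

  # A recipe parsed from the logfile: the command line plus its input/output files.
  Recipe = namedtuple('Recipe', ['command', 'inputs', 'outputs'])

  def replay(recipes):
      # Recipes are processed in reverse order of their appearance in the logfile.
      for recipe in reversed(recipes):
          inputs = [f for f in recipe.inputs if os.path.exists(f)]
          outputs = [f for f in recipe.outputs if os.path.exists(f)]
          # Re-run if any input file is newer than any currently existing output file.
          if any(os.path.getmtime(i) > os.path.getmtime(o)
                 for i in inputs for o in outputs):
              if subprocess.run(recipe.command).returncode != 0:
                  break  # stop when a command returns an error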

Pipes

argreplay has some special features that allow it to replay sequences of commands in which the output of one is piped to the input of the next. When recording script arguments, if it encounters an input argument (one that was flagged with input) but no argument value, it assumes that the input came from standard input. Likewise for output arguments and standard output. When replaying a sequence in which a command writing to standard output is followed by one reading from standard input, a pipe is established between those two commands. Such a sequence may be arbitrarily long.
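
For example, suppose the following chain was recorded (the script names are hypothetical). Because filter.py was run with its output argument unvalued and aggregate.py with its input argument unvalued, the comments record standard output and standard input respectively, and argreplay re-establishes the pipe between the two commands when replaying:

  python filter.py --infile raw.csv | python aggregate.py --outfile totals.csv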

Variable substitution

A recipe may include variables in its command arguments. For example,

--year ${year}

In this case, argreplay must be given an argument --substitute that contains the variables to be substituted and the values with which to substitute them, with a colon as separator. For example, argreplay --substitute year:2019 ....

Other options

--dry-run simply prints the commands that would be run. Note that since the commands are not actually run, output files are not touched, so subsequent commands that would then need re-running will not be listed.

--force means that the commands are run regardless of the timestamps on input and output file(s).

--depth indicates how many recipes to read from a logfile. The default is to read all the recipes.

--remove means that the logfile should be removed after it has been read but before any commands are run. This can help prevent the logfile from growing too long when the re-run commands would extend it.

--gui causes Gooey to be invoked if it is available.
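
These flags can be combined; for example, the following would presumably print every recipe's command without running anything, eliding the remaining arguments as above:

  argreplay --dry-run --force ....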

Download files

Download the file for your platform.

Source Distribution

argrecord-0.1.3.tar.gz (26.8 kB)

Built Distribution

argrecord-0.1.3-py3-none-any.whl (21.9 kB)

File details

Details for the file argrecord-0.1.3.tar.gz.

File metadata

  • Download URL: argrecord-0.1.3.tar.gz
  • Size: 26.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.10

File hashes

Hashes for argrecord-0.1.3.tar.gz

  • SHA256: 00972c97119d0cef1f8bec00422526d469a94b9092e5ebf77cba584dbcea3647
  • MD5: ddd6abd0eddc16e337cf3309c17567c9
  • BLAKE2b-256: 20ca0a0b2ac965ae677b4b1eced696ee8ecd0c6ec87612fd49cab4f73b88c28b

File details

Details for the file argrecord-0.1.3-py3-none-any.whl.

File metadata

  • Download URL: argrecord-0.1.3-py3-none-any.whl
  • Size: 21.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.9.10

File hashes

Hashes for argrecord-0.1.3-py3-none-any.whl

  • SHA256: d0766431051519e9af2b6f10185fc993114204c3b44686743a746e29246f4e9f
  • MD5: 4b33189cad84e3151dec97d297c7fec0
  • BLAKE2b-256: 17874bacccbe3fb2342ca457c77c2cde224b3d1f70e28891e2053b68dddf68f2
