ds: run dev scripts
Dev scripts are the short names we give to common tasks and long commands in a software project. ds finds and runs dev scripts in your project's configuration file (e.g., Cargo.toml, package.json, pyproject.toml, etc.):
pip install ds-run # or: uv tool install ds-run
ds --list # list the tasks
ds clean lint test # run multiple tasks
ds format:* # run tasks that match a glob
ds test -vv # pass arguments to tasks
ds -e PORT=8080 run # set environment variables
ds +cspell test # suppress errors
ds -w* build # supports monorepo/workspaces
Read more:
- Installing ds
- Example configuration files
- When should I make a dev script?
- Where should I put my config?
- How does ds find my config?
- Where do tasks run?
Example
Suppose you want to use pytest with coverage to run unit tests, doctests, and to generate a branch-level coverage report:
coverage run --branch --source=src -m \
pytest \
--doctest-modules \
--doctest-ignore-import-errors \
src test;
coverage report -m
Instead of typing that, we just add a script to our pyproject.toml file:
[tool.ds.scripts]
test = """
coverage run --branch --source=src -m \
pytest \
--doctest-modules \
--doctest-ignore-import-errors \
src test;
coverage report -m
"""
And now you can run it with a quick shorthand:
ds test
Benefits
♻️ Works with existing projects
Almost a drop-in replacement for:
- Node (package.json): npm run, yarn run, pnpm run, bun run
- Python (pyproject.toml): pdm run, rye run
- PHP (composer.json): composer run-script
- Rust (Cargo.toml): cargo run-script
Experimental: We also support an extremely small subset of the Makefile format (see #68).
See: Inspirations
🗂️ Add monorepo/workspace support anywhere
Easily manage monorepos and sub-projects, even if they use different tooling.
🏃 Run multiple tasks with custom arguments for each task
Provide command-line arguments for multiple tasks as well as simple argument interpolation.
🪄 Minimal magic
Tries to use familiar syntax and a few clear rules. Checks for basic cycles and raises helpful error messages if things go wrong.
🚀 Minimal dependencies
Currently working on removing all of these (see #46):
- python (3.8+)
- tomli (for python < 3.11)
- graphlib_backport (for python < 3.9)
Limitations
ds does not strive to be an all-in-one tool for every project and is not a replacement for package management tools or make. Here are some things that are not supported or not yet implemented.
- Not Supported: Lifecycle Events
- Not Supported: call Tasks
- Partial Support: Makefile format (see #68)
- In Progress: Shell Completions (see #44)
- In Progress: Remove Python Dependency (see #46)
Install
ds is typically installed at the system level to make it available across all your projects.
python -m pip install ds-run
# or, if you use uv:
uv tool install ds-run
You can also download a Cosmopolitan binary which runs on Windows, macOS, and Linux:
export LOCAL_BIN=$HOME/.local/bin # or anywhere on your PATH
mkdir -p $LOCAL_BIN
wget -O $LOCAL_BIN/ds -N https://github.com/metaist/ds/releases/latest/download/ds
chmod +x $LOCAL_BIN/ds
ds --version
# NOTE: it takes a few seconds the first time you run it
If you just want to try ds:
uvx --from ds-run ds --version
# or
pipx run ds-run --version
Usage
Usage: ds [--help | --version] [--debug]
[--dry-run]
[--no-config]
[--no-project]
[--list]
[--cwd PATH]
[--file PATH]
[--env-file PATH]
[(--env NAME=VALUE)...]
[--workspace GLOB]...
[--pre][--post]
[<task>...]
Options:
-h, --help
Show this message and exit.
--version
Show program version and exit.
--debug
Show debug messages.
--cwd PATH
Set the starting working directory (default: --file parent).
PATH is resolved relative to the current working directory.
--dry-run
Show which tasks would be run, but don't actually run them.
--env-file PATH
File with environment variables. This file is read before --env
values are applied.
-e NAME=VALUE, --env NAME=VALUE
Set one or more environment variables. Supersedes any values set in
an `--env-file`.
-f PATH, --file PATH
File with task and workspace definitions (default: search in parents).
Read more about the configuration file:
https://github.com/metaist/ds
-l, --list
List available tasks and exit.
--no-config
Do not search for or load a configuration file. Supersedes `--file`.
--no-project
Do not search for project dependencies, e.g., `.venv`, `node_modules`
-w GLOB, --workspace GLOB
Patterns which indicate in which workspaces to run tasks.
GLOB filters the list of workspaces defined in `--file`.
The special pattern '*' matches all of the workspaces.
Read more about configuring workspaces:
https://github.com/metaist/ds#workspaces
--pre, --post
EXPERIMENTAL: Run tasks with pre- and post- names.
<task>
One or more tasks to run with task-specific arguments.
The simplest way to pass arguments to tasks is to put them in quotes:
$ ds 'echo "Hello world"'
For more complex cases you can use a colon (`:`) to indicate start of arguments and double-dash (`--`) to indicate the end:
$ ds echo: "Hello from" -- echo: "the world"
If the first <option> starts with a hyphen (`-`), you may omit the
colon (`:`). If there are no more tasks after the last option, you
may omit the double-dash (`--`).
Tasks are executed in order across any relevant workspaces. If any
task returns a non-zero code, task execution stops unless the
<task> was prefixed with a (`+`) in which case execution continues.
Read more about error suppression:
https://github.com/metaist/ds#error-suppression
When should I make a dev script?
Typically, you should make a dev script for important steps in your development process. For example, most projects will need a way to run linters and unit tests (see the test example above). Some projects also need a way to start up a server, fetch configuration files, or clean up generated files.
Dev scripts act as another form of documentation that helps developers understand how to build and work on your project.
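As a rough sketch of what that might look like in a pyproject.toml (the specific tools and commands below are assumptions for illustration, not part of ds):

[tool.ds.scripts]
lint = "ruff check src test"            # hypothetical linter invocation
test = "pytest src test"                # hypothetical test invocation
serve = "python -m http.server 8000"    # hypothetical dev server
clean = "rm -rf build dist .coverage"   # hypothetical cleanup
check = ["lint", "test"]                # composite task: lint, then test

Each short name documents one step of the workflow, so a newcomer can run ds --list and see how the project is built and tested.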
Where should I put my config?
ds supports .json and .toml configuration files (see examples) which typically go in the top level of your project. To avoid making lots of top-level files, ds can use common project configuration files.
- Node: package.json under scripts (see the sketch after this list)
- PHP: composer.json under scripts
- Python: pyproject.toml under [tool.ds.scripts] ([tool.pdm.scripts] and [tool.rye.scripts] also supported)
- Rust: Cargo.toml under [package.metadata.scripts] or [workspace.metadata.scripts]
- Other: ds.toml under [scripts]
Experimental: We support an extremely small subset of the Makefile format (see #68).
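For instance, a Node project would keep its dev scripts in the scripts field of package.json; the task names and commands below are placeholder assumptions:

{
  "name": "my-app",
  "scripts": {
    "lint": "eslint .",
    "test": "node --test"
  }
}

With that in place, ds lint should behave much like npm run lint.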
Read more:
How does ds find my config?
If you don't provide a config file using the --file option, ds will search the current directory and all of its parents for files with these name patterns in the following order:
- ds.toml
- pyproject.toml
- uv.toml
- package.json
- Cargo.toml
- composer.json
- [Mm]akefile
If you provide one or more --workspace options, the file must contain a workspace key. Otherwise, the file must contain a task key.
If the appropriate key cannot be found, searching continues up the directory tree. The first file that has the appropriate key is used.
One exception to the search process is when using the --workspace option: if a workspace member contains a file with the same name as the configuration file, that file is used within the workspace (e.g., a workspace defined in Cargo.toml will try to find a Cargo.toml in each workspace). Otherwise, the usual search process is used.
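If the search would pick up the wrong file, you can always point ds at a specific one with --file; the path below is purely hypothetical:

ds --file backend/pyproject.toml test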
Where do tasks run?
Typically, tasks run in the same directory as the configuration file.
If you provide a --cwd option (but not a --workspace option), tasks will run in the directory provided by the --cwd option.
If you provide one or more --workspace options, --cwd is ignored and tasks are run in each of the selected workspace members.
NOTE: In configuration files, you can use the cwd or working_dir option to specify a working directory for a specific task, and that option will be respected even when using --workspace or --cwd from the command line.
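As a minimal sketch of a per-task working directory (the task name and directory are assumptions, and the inline-table shape follows the pdm-style cmd syntax shown under "Basic Task" below):

[scripts]
# serve the hypothetical `docs/` folder from inside that directory
docs = { cmd = "python -m http.server 8000", cwd = "docs" }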
Task Keys
ds searches configuration files for tool-specific keys to find task definitions, which should contain a mapping from task names to basic tasks or composite tasks.
Task Names
- Task names are strings that are usually short, lowercase ASCII letters.
- They can have a colon (:) in them, like py:build.
- All leading and trailing whitespace in a task name is trimmed.
- If the name is empty or starts with a hash (#), it is ignored. This allows formats like package.json to "comment out" tasks.
- Don't start a name with a plus (+) because that indicates error suppression.
- Don't start a name with a hyphen (-) because that can make the task look like a command-line argument.
- Don't end a task name with a colon (:) because we use that to pass command-line arguments (see the sketch after this list).
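A minimal sketch of these rules in a TOML [scripts] table; the task names and commands are hypothetical:

[scripts]
"py:build" = "python -m build"  # colons are allowed, but the TOML key must be quoted
"#wip" = "echo 'not ready'"     # names starting with '#' are ignored ("commented out")
lint = "ruff check ."           # short, lowercase ASCII names are typical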
Basic Task
A basic task is just a string of what should be executed in a shell using subprocess.run.
- Supports most pdm-style and rye-style commands (except call)
- Supports argument interpolation
- Supports error suppression
# Example: Basic tasks become strings.
[scripts]
ls = "ls -lah"
no_error = "+exit 1" # See "Error Suppression"
# We also support `pdm`-style and `rye`-style commands.
# The following are all equivalent to `ls` above.
ls2 = { cmd = "ls -lah" }
ls3 = { cmd = ["ls", "-lah"] }
ls4 = { shell = "ls -lah" }
Composite Task
A composite task consists of a series of steps where each step is the name of another task or a shell command.
- Supports pdm-style composite and rye-style chain
- Supports argument interpolation
- Supports error suppression
# Example: Composite tasks call other tasks or shell commands.
[scripts]
build = "touch build/$1"
clean = "rm -rf build"
# We also support `pdm`-style and `rye`-style composite commands.
# The following are all equivalent.
all = ["clean", "+mkdir build", "build foo", "build bar", "echo 'Done'"]
pdm-style = { composite = [
"clean",
"+mkdir build", # See: Error Suppression
"build foo",
"build bar",
"echo 'Done'", # Composite tasks can call shell commands.
] }
rye-style = { chain = [
"clean",
"+mkdir build", # See: Error Suppression
"build foo",
"build bar",
"echo 'Done'", # Composite tasks can call shell commands.
] }
Argument Interpolation
Tasks can include parameters like $1 and $2 to indicate that the task accepts arguments.
You can also use $@ for the "remaining" arguments (i.e., those you haven't interpolated yet).
You can also specify a default value for any argument using a bash-like syntax: ${1:-default value}.
Arguments from a composite task precede those from the command-line.
# Example: Argument interpolation lets you pass arguments to tasks.
[scripts]
# pass arguments, but supply defaults
test = "pytest ${@:-src test}"
# interpolate the first argument (required)
# and then interpolate the remaining arguments, if any
lint = "ruff check $1 ${@:-}"
# we also support the pdm-style {args} placeholder
test2 = "pytest {args:src test}"
lint2 = "ruff check {args}"
# pass an argument and re-use it
release = """\
git commit -am "release: $1";\
git tag $1;\
git push;\
git push --tags;\
git checkout main;\
git merge --no-ff --no-edit prod;\
git push
"""
Command-line Arguments
When calling ds you can specify additional arguments to pass to commands.
ds build: foo -- build: bar
This would run the build task first with the argument foo and next with the argument bar.
A few things to note:
- the colon (:) after the task name indicates the start of arguments
- the double dash (--) indicates the end of arguments
If the first argument to the task starts with a hyphen, the colon can be omitted. If there are no more arguments, you can omit the double dash.
ds test -v
If you're not passing arguments, you can put task names next to each other:
ds clean test
Error Suppression
If a task starts with a plus sign (+), the plus sign is removed before the command is executed and the command will always produce a return code of 0 (i.e., it will always be considered to have completed successfully).
This is particularly useful in composite commands where you want subsequent steps to continue even if a particular step fails. For example:
# Example: Error suppression lets subsequent tasks continue after failure.
[scripts]
cspell = "cspell --gitignore '**/*.{py,txt,md,markdown}'"
format = "ruff format ."
die = "+exit 1" # returns error code of 0
die_hard = "exit 2" # returns an error code of 2 unless suppressed elsewhere
lint = ["+cspell", "format"] # format runs even if cspell finds misspellings
Error suppression works both in configuration files and on the command-line:
ds die_hard format
# => error after `die_hard`
ds +die_hard format
# => no error
Environment Variables
You can set environment variables on a per-task basis:
# Example: Environment variables can be set on tasks.
[scripts]
# set an environment variable
run = { cmd = "python -m src.server", env = { FLASK_PORT = "8080" } }
# use a file relative to the configuration file
run2 = { cmd = "python -m src.server", env-file = ".env" }
# composite tasks override environment variables
run3 = { composite = ["run"], env = { FLASK_PORT = "8081" } }
You can also set environment variables on the command-line, but they apply to all of the tasks:
ds -e FLASK_PORT=8080 run
ds --env-file .env run
Workspaces
Workspaces are a way of managing multiple sub-projects from a top level. ds supports npm-, rye-, uv-, and Cargo-style workspaces.
When ds is called with the --workspace option, the configuration file must have one of the tool-specific workspace keys.
If no configuration file was provided with the --file option, the search continues up the directory tree.
NOTE: pnpm has its own pnpm-workspace.yaml format which is not currently supported.
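As one sketch of what a workspace key looks like, a Cargo.toml (or a plain ds.toml) can declare its members like this; the directory layout is hypothetical:

[workspace]
members = ["members/*"]  # every directory under members/ becomes a workspace member

Other ecosystems use their own keys (for example, package.json declares a top-level "workspaces" array); the glob patterns that are accepted are described in the next section.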
Workspace Members
The value corresponding to the workspace key should be a list of patterns that indicate which directories (relative to the configuration file) should be included as members. The following glob-like patterns are supported:
- ?: matches a single character (e.g., ca? matches car, cab, and cat)
- []: matches specific characters (e.g., ca[rb] matches car and cab)
- *: matches multiple characters, but not / (e.g., members/* matches all the files in members, but not further down the tree)
- **: matches multiple characters, including / (e.g., members/** matches all files in members and all sub-directories and all of their contents)
If you prefix any pattern with an exclamation point (!), then the rest of the pattern describes which files should not be matched.
Patterns are applied in order, so subsequent patterns can include or exclude sub-directories as needed. We also support the excludes key (for uv and Cargo) which is applied after all the members.
# Example: workspace includes everything in `members` except `members/x`.
[workspace]
members = ["members/*", "!members/x"]
Workspace Tasks
To run a task across multiple workspaces, use the --workspace or -w option one or more times with a pattern that indicates where the tasks should run.
For example, consider a workspace with directories members/a, members/b, and members/x. The configuration above would match the first two directories and exclude the third.
The following are all equivalent and run test in both members/a and members/b:
ds --workspace '*' test # special match that means "all workspaces"
ds -w '*' test # short option
ds -w* test # even shorter option
ds -w '*/a' -w '*/b' test # manually select multiple workspaces
Not Supported: Lifecycle Events
Some task runners (all the node ones, pdm, composer) support running additional pre- and post- tasks when you run a task. However, this obscures the relationship between tasks and can create surprises if you happen to have two tasks with unfortunate names (e.g., pend and prepend). ds does not plan to support this behavior (see #24).
A more explicit alternative is to use composite commands to clearly describe the relationship between a task and its pre- and post- tasks.
# Bad example: hidden assumption that `build` calls `prebuild` first.
[scripts]
prebuild = "echo 'prebuild'"
build = "echo 'build'"
# Good example: clear relationship between tasks.
[scripts]
prebuild = "echo 'prebuild'"
build = ["prebuild", "echo 'build'"]
Not Supported: call Tasks
Some task runners support special call tasks which get converted into language-specific calls. For example, both pdm and rye can call into python packages and composer can call into PHP code.
These types of tasks introduce a significant difference between what you write in the configuration file and what gets executed, so in the interest of reducing magic, ds does not currently support this behavior (see #32).
A more explicit alternative is to write out the call you intend:
# {"call": "pkg"} becomes:
python -m pkg
# {"call": "pkg:main('test')"} becomes:
python -c "import sys; from pkg import main as _1; sys.exit(main('test'))"
Inspirations
I've used several task runners, usually as part of build tools. Below is a list of tools used or read about when building ds.
- 1976: make (C) - Together with its descendants, make is one of the most popular build & task running tools. It is fairly easy to make syntax errors and the tab-based indent drives me up the wall.
- 2000: ant (Java) - an XML-based replacement for make. I actually liked using ant quite a bit until I stopped writing Java and didn't want to have java as a dependency for my python projects.
- 2008: gradle (Groovy/Kotlin) - Written for the jvm, I pretty much only use this for Android development. Can't say I love it.
- 2010: npm (JavaScript) - Being able to add a simple scripts field to package.json made it very easy to run dev scripts. Supports pre and post lifecycle tasks.
- 2010: pdm (Python) - Supports 4 different types of tasks including cmd, shell, call, and composite.
- 2012: composer (PHP) - Uses composer.json, similar to package.json. Supports pre- and post- task lifecycle for special tasks, command-line arguments, composite tasks, and other options.
- 2016: yarn (JavaScript) - An alternative to npm which also supports command-line arguments.
- 2016: pnpm (JavaScript) - Another alternative to npm which supports many more options including running tasks in parallel.
- 2016: just (Rust) - Defines tasks in a justfile, similar to make. Supports detecting cycles, running in parallel, and many other options.
- 2016: cargo-run-script (Rust) - Uses Cargo.toml to configure scripts and supports argument substitution ($1, $2, etc.).
- 2017: cargo-make (Rust) - Very extensive port of make to Rust defining tasks in Makefile.toml.
- 2022: hatch (Python) - Defines environment-specific scripts with the ability to suppress errors, like make.
- 2023: bun (Zig) - An alternative to node and npm.
- 2023: rye (Rust) - Up-and-coming replacement for managing python projects.
License