saritasa-invocations
Collection of invoke commands used by Saritasa
Table of contents
- Installation
- Configuration
- Modules
- printing
- system
- git
- pre-commit
- docker
- github-actions
- python
- django
- django.manage
- django.makemigrations
- django.migrate
- django.resetdb
- django.createsuperuser
- django.run
- django.shell
- django.dbshell
- django.recompile-messages
- django.show-urls
- django.load-db-dump
- django.backup-local-db
- django.backup-remote-db
- django.load-remote-db
- django.startapp
- django.wait-for-database
- fastapi
- alembic
- celery
- open-api
- db
- k8s
- db-k8s
- cruft
- poetry
- uv
- pip
- mypy
- pytest
- secrets
Installation
pip install saritasa-invocations
or if you are using poetry
poetry add saritasa-invocations
or if you are using uv
uv add saritasa-invocations
Global installation
You can use uvx to use packages globally without installing them or activating virtualenvs.
uvx saritasa-invocations pre-commit.run-hooks
Or if you need extras
uvx --from="saritasa-invocations[env_settings]" pre-commit.run-hooks
Or simply create an alias for simpler usage
alias saritasa-inv="uvx saritasa-invocations"
saritasa-inv pre-commit.run-hooks
Configuration
Configuration can be set in the tasks.py file.
Below is an example config:
import invoke

import saritasa_invocations

ns = invoke.Collection(
    saritasa_invocations.docker,
    saritasa_invocations.git,
    saritasa_invocations.github_actions,
    saritasa_invocations.pre_commit,
    saritasa_invocations.system,
)

# Configurations for run command
ns.configure(
    {
        "run": {
            "pty": True,
            "echo": True,
        },
        "saritasa_invocations": saritasa_invocations.Config(
            pre_commit=saritasa_invocations.PreCommitSettings(
                hooks=(
                    "pre-commit",
                    "pre-push",
                    "commit-msg",
                )
            ),
            git=saritasa_invocations.GitSettings(
                merge_ff="true",
                pull_ff="only",
            ),
            docker=saritasa_invocations.DockerSettings(
                main_containers=(
                    "opensearch",
                    "redis",
                ),
            ),
            system=saritasa_invocations.SystemSettings(
                vs_code_settings_template=".vscode/recommended_settings.json",
                settings_template="config/.env.local",
                save_settings_from_template_to="config/.env",
            ),
            # Default K8S settings shared between envs
            k8s_defaults=saritasa_invocations.K8SDefaultSettings(
                proxy="teleport.company.com",
                db_config=saritasa_invocations.K8SDBSettings(
                    namespace="db",
                    pod_selector="app=pod-selector-db",
                ),
            )
        ),
    },
)

# For K8S settings you just need to create an instance of K8SSettings for each
# environment. They will all be collected automatically.
saritasa_invocations.K8SSettings(
    name="dev",
    cluster="teleport.company.somewhere.com",
    namespace="project_name",
)
saritasa_invocations.K8SSettings(
    name="prod",
    cluster="teleport.client.somewhere.com",
    namespace="project_name",
    proxy="teleport.client.com",
)
Modules
printing
While this module doesn't contain any invocations, it's used to print messages
via rich.panel.Panel. There are three helpers:
- `print_success` - print a message in a green panel
- `print_warning` - print a message in a yellow panel
- `print_error` - print a message in a red panel
system
system.copy-local-settings
Copies local template for settings into specified file
Settings:
- `settings_template` - path to settings template (Default: `config/settings/local.template.py`)
- `save_settings_from_template_to` - path where settings are saved (Default: `config/settings/local.py`)
system.copy-vscode-settings
Copies local template for vscode settings into .vscode folder
Settings:
- `vs_code_settings_template` - path to settings template (Default: `.vscode/recommended_settings.json`)
system.chown
Change ownership of files to a user (the current user by default).
Shortcut for taking ownership of the apps dir after files were generated via docker-compose (migrations, new apps, etc.).
system.create-tmp-folder
Create a folder for temporary files (.tmp).
git
git.set-git-setting
Set a git setting in config.
git.setup
Perform setup of git:
- Install pre-commit hooks
- Set merge.ff
- Set pull.ff
Settings:
- `merge_ff` - setting value for `merge.ff` (Default: `false`)
- `pull_ff` - setting value for `pull.ff` (Default: `only`)
git.clone-repo
Clone a repo, or pull the latest changes into the specified repo.
git.blame-copy
Create copies of a file while preserving git blame history.
The original script was written in bash.
Usage:
inv git.blame-copy <path to original file> <path to copy>,<path to copy>...
If <path to copy> is file, then data will be copied in it.
If <path to copy> is directory, then data will be copied in provided
directory with original name.
Algorithm:
- Remember current HEAD state
- For each copy path: move the file to the copy path, restore the file using checkout, and remember the resulting commits
- Restore the state of the branch
- Move the file to a temp file
- Merge the copy commits into the branch
- Move the file back to its original path from the temp file
Settings:
- `copy_commit_template` - template for commits created during the command workflow
- `copy_init_message_template` - template for the init message printed at command start
Template variables:
- `action` - the copy algorithm consists of several intermediate actions (creating temporary files, merging commits, etc.); this variable stores the header of the current intermediate action
- `original_path` - value of the command's first argument (path of the original file to be copied)
- `destination_paths` - sequence of paths to which the original file will be copied
- `project_task` - project task parsed from the current git branch; empty if no task is found in the branch
Default values for templates:
copy_commit_template:
"[automated-commit]: {action}\n\n"
"copy: {original_path}\n"
"to:\n* {destination_paths}\n\n"
"{project_task}"
copy_init_message_template:
"Copy {original_path} to:\n"
"* {destination_paths}\n\n"
"Count of created commits: {commits_count}"
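These templates are ordinary Python format strings. A quick sketch of how the default commit template renders (the values below are purely illustrative):

```python
# Default commit template from the settings above.
copy_commit_template = (
    "[automated-commit]: {action}\n\n"
    "copy: {original_path}\n"
    "to:\n* {destination_paths}\n\n"
    "{project_task}"
)

# Illustrative values; in practice these are filled in by the command.
message = copy_commit_template.format(
    action="move file to temp file",
    original_path="apps/users/models.py",
    destination_paths="apps/clients/models.py",
    project_task="PROJ-123",
)
print(message)
```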
pre-commit
pre-commit.install
Install git hooks via pre-commit.
pre-commit.uninstall
Uninstall git hooks via pre-commit.
pre-commit.run-hooks
Run all hooks against all files.
pre-commit.update
Update pre-commit dependencies.
docker
docker.build-service
Build service image from docker compose
docker.buildpack
Build project via pack-cli
Settings:
- `buildpack_builder` - image tag of builder (Default: `paketobuildpacks/builder:base`)
- `buildpack_runner` - image tag of runner (Default: `paketobuildpacks/run:base`)
- `build_image_tag` - tag for the built image (Default: name of project from `project_name`)
- `buildpack_requirements_path` - path to folder with requirements (Default: `requirements`)
docker.stop-all-containers
Shortcut for stopping ALL running docker containers
docker.up
Bring up main containers and start them.
Settings:
- `main_containers` - names of the main containers (Default: `["postgres", "redis"]`)
docker.stop
Stop main containers.
Settings:
- `main_containers` - names of the main containers (Default: `["postgres", "redis"]`)
docker.clear
Stop and remove all containers defined in docker-compose. Also remove images.
github-actions
github-actions.set-up-hosts
Add hosts to /etc/hosts.
Settings:
- `hosts` - hosts to add (Default: see docker `main_containers`)
python
As of now we support two python environments: local and docker.
- `local` - python located in your current virtualenv
- `docker` - python located inside the docker image of a service (`python_docker_service`)
This makes it possible to run code against an environment close to the deployed one, or simply to test it out.
Example of usage
PYTHON_ENV=docker inv python.run --command="--version"
python.run
Run a python command depending on the PYTHON_ENV variable (docker or local).
Settings:
- `entry` - python entry command (Default: `python`)
- `docker_service` - python service name (Default: `web`)
- `docker_service_params` - params for docker (Default: `--rm`)
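Conceptually, the environment switch works like the sketch below. This is only an illustration of the idea based on the defaults above (`web`, `--rm`), not the library's actual code:

```python
import os

def build_python_command(
    command: str,
    entry: str = "python",
    docker_service: str = "web",
    docker_service_params: str = "--rm",
) -> str:
    """Illustrative sketch: pick local or docker python based on PYTHON_ENV."""
    if os.environ.get("PYTHON_ENV", "local") == "docker":
        # Run the command inside the docker service's python.
        return (
            f"docker compose run {docker_service_params} "
            f"{docker_service} {entry} {command}"
        )
    # Run the command with the python from the current virtualenv.
    return f"{entry} {command}"

os.environ["PYTHON_ENV"] = "docker"
print(build_python_command("--version"))
# → docker compose run --rm web python --version
```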
django
django.manage
Run manage.py with the specified command.
This command also handles starting the required services and waiting for the DB to be ready.
Requires django_probes
Settings:
- `manage_file_path` - path to `manage.py` file (Default: `./manage.py`)
django.makemigrations
Run the makemigrations command and chown the created migrations (docker env only).
django.check_new_migrations
Check whether there are new migrations or not. The result should be checked via exit code.
django.migrate
Run migrate command.
Settings:
- `migrate_command` - migrate command (Default: `migrate`)
django.resetdb
Reset database to initial state (including test DB).
Requires django-extensions
Settings:
- `settings_path` - default django settings (Default: `config.settings.local`)
django.createsuperuser
Create superuser.
Settings:
- `default_superuser_email` - default email of superuser; if empty, will try to grab it from git config before resorting to the default (Default: `root@localhost`)
- `default_superuser_username` - default username of superuser; if empty, will try to grab it from git config before resorting to the default (Default: `root`)
- `default_superuser_password` - default password of superuser (Default: `root`)
- `verbose_email_name` - verbose name for the `email` field (Default: `Email address`)
- `verbose_username_name` - verbose name for the `username` field (Default: `Username`)
- `verbose_password_name` - verbose name for the `password` field (Default: `Password`)
Note: the values for `verbose_email_name`, `verbose_username_name` and `verbose_password_name` should match the verbose names of the model that uses these settings.
django.run
Run development web-server.
Settings:
- `runserver_docker_params` - params for docker (Default: `--rm --service-ports`)
- `runserver_command` - runserver command (Default: `runserver_plus`)
- `runserver_host` - host of server (Default: `0.0.0.0`)
- `runserver_port` - port of server (Default: `8000`)
- `runserver_params` - params for runserver command (Default: `""`)
django.shell
Shortcut for manage.py shell command.
Settings:
- `shell_command` - command to start a python shell (Default: `shell_plus --ipython`)
django.dbshell
Open database shell with credentials from current django settings.
django.recompile-messages
Generate and recompile translation messages.
Requires gettext
Settings:
- `makemessages_params` - params for makemessages command (Default: `--all --ignore venv`)
- `compilemessages_params` - params for compilemessages command (Default: `""`)
django.show-urls
Show project urls, which can be filtered via the search parameter.
django.load-db-dump
Reset db and load db dump.
Uses resetdb and load-db-dump
Settings:
- `django_settings_path` - default django settings (Default: `config.settings.local`)
django.backup-local-db
Back up local db.
Uses backup_local_db
Settings:
- `settings_path` - default django settings (Default: `config.settings.local`)
django.backup-remote-db
Make dump of remote db and download it.
Uses create_dump and get-dump
It can use the usual django config, where every setting is stored in a separate variable, or a single variable with the full db url.
Settings:
- `settings_path` - default django settings (Default: `config.settings.local`)
- `remote_db_url_config_name` - name of config for db url (Default: `DATABASE_URL`)
- `remote_db_config_mapping` - mapping of db config (Default: `{"dbname": "RDS_DB_NAME", "host": "RDS_DB_HOST", "port": "RDS_DB_PORT", "username": "RDS_DB_USER", "password": "RDS_DB_PASSWORD"}`)
django.load-remote-db
Make a dump of the remote db, download it, and apply it to the local db.
Uses create-dump, get-dump and load-db-dump
Settings:
- `settings_path` - default django settings (Default: `config.settings.local`)
django.startapp
Create django app from a template using copier.
Requires uv: installation docs
Settings:
- `app_boilerplate_link` - link to app template
- `apps_path` - path to apps folder in project (Default: `apps`)
django.wait-for-database
Launch docker compose and wait for database connection.
fastapi
fastapi.run
Run development web-server.
Settings:
- `docker_params` - params for docker (Default: `--rm --service-ports`)
- `uvicorn_command` - uvicorn command (Default: `-m uvicorn`)
- `app` - path to fastapi app (Default: `config:fastapi_app`)
- `host` - host of server (Default: `0.0.0.0`)
- `port` - port of server (Default: `8000`)
- `params` - params for uvicorn (Default: `--reload`)
alembic
alembic.run
Run alembic command
Settings:
- `command` - alembic command (Default: `-m alembic`)
- `connect_attempts` - number of attempts to connect to the database (Default: `10`)
alembic.autogenerate
Generate migrations
Settings:
- `migrations_folder` - migration files location (Default: `db/migrations/versions`)
alembic.upgrade
Upgrade database
alembic.downgrade
Downgrade database
alembic.check-for-migrations
Check if there are any missing migrations to be generated.
alembic.check-for-adjust-messages
Check migration files for adjust messages
Settings:
- `migrations_folder` - migration files location (Default: `db/migrations/versions`)
- `adjust_messages` - list of alembic adjust messages (Default: `# ### commands auto generated by Alembic - please adjust! ###`, `# ### end Alembic commands ###`)
alembic.load-db-dump
Reset db and load db dump.
Uses downgrade and load-db-dump
Requires python-decouple
Installed with [env_settings]
Settings:
- `db_config_mapping` - mapping of db config (Default: `{"dbname": "rds_db_name", "host": "rds_db_host", "port": "rds_db_port", "username": "rds_db_user", "password": "rds_db_password"}`)
alembic.backup-local-db
Back up local db.
Uses backup_local_db
Requires python-decouple
Installed with [env_settings]
Settings:
- `db_config_mapping` - mapping of db config (Default: `{"dbname": "rds_db_name", "host": "rds_db_host", "port": "rds_db_port", "username": "rds_db_user", "password": "rds_db_password"}`)
alembic.backup-remote-db
Make dump of remote db and download it.
Uses create_dump and get-dump
Requires python-decouple
Installed with [env_settings]
Settings:
- `db_config_mapping` - mapping of db config (Default: `{"dbname": "rds_db_name", "host": "rds_db_host", "port": "rds_db_port", "username": "rds_db_user", "password": "rds_db_password"}`)
alembic.load-remote-db
Make a dump of the remote db, download it, and apply it to the local db.
Uses create-dump and get-dump and load-db-dump
Requires python-decouple
Installed with [env_settings]
Settings:
- `db_config_mapping` - mapping of db config (Default: `{"dbname": "rds_db_name", "host": "rds_db_host", "port": "rds_db_port", "username": "rds_db_user", "password": "rds_db_password"}`)
alembic.wait-for-database
Launch docker compose and wait for database connection.
celery
celery.run
Start celery worker.
Settings:
- `app` - path to app (Default: `config.celery.app`)
- `scheduler` - scheduler (Default: `django`)
- `loglevel` - log level for celery (Default: `info`)
- `extra_params` - extra params for worker (Default: `("--beat",)`)
- `local_cmd` - command for celery (Default: `celery --app {app} worker --scheduler={scheduler} --loglevel={loglevel} {extra_params}`)
- `service_name` - name of celery service (Default: `celery`)
celery.send-task
Send task to celery worker.
Settings:
- `app` - path to app (Default: `config.celery.app`)
open-api
open-api.validate-swagger
Check that the generated open_api spec is valid. This command uses drf-spectacular and its default validator. It creates a spec file in the ./tmp folder and then validates it.
db
db.load-db-dump
Load db dump to local db.
Settings:
- `load_dump_command` - template for load command (default located in `_config.py` > `DBSettings`)
- `dump_filename` - filename for dump (Default: `local_db_dump`)
- `load_additional_params` - additional params for load command (Default: `--quiet`)
db.backup-local-db
Back up local db.
Settings:
- `dump_command` - template for dump command (default located in `_config.py` > `DBSettings`)
- `dump_filename` - filename for dump (Default: `local_db_dump`)
- `dump_additional_params` - additional params for dump command (Default: empty)
- `dump_no_owner` - add `--no-owner` to dump command (Default: `True`)
- `dump_include_table` - add `--table={dump_include_table}` to dump command (Default: empty)
- `dump_exclude_table` - add `--exclude-table={dump_exclude_table}` to dump command (Default: empty)
- `dump_exclude_table_data` - add `--exclude-table-data={dump_exclude_table_data}` to dump command (Default: empty)
- `dump_exclude_extension` - add `--exclude-extension={dump_exclude_extension}` to dump command (Default: empty)
k8s
For K8S settings you just need to create an instance of K8SSettings for each
environment. They will all be collected automatically.
k8s.login
Login into k8s via teleport.
Settings:
- `proxy` - teleport proxy (REQUIRED)
- `cluster` - kube cluster (Default: uses value from `proxy`)
- `port` - teleport port (Default: `443`)
- `auth` - teleport auth method (Default: `github`)
k8s.set-context
Set the k8s context to the current project. By default uses the dev environment.
Settings:
- `namespace` - namespace for k8s (Default: name of project from `project_name`)
- `context` - name of context (REQUIRED)
k8s.logs
Get logs for k8s pod
Settings:
- `default_component` - default component (Default: `backend`)
k8s.pods
Get pods from k8s.
k8s.execute
Execute command inside k8s pod.
How to use the env-params arg:
Say you have a Procfile with this entry:
celery_start_task: celery --app config.celery:app call ${task}
Then you can make this invocation:
inv k8s.execute --entry="celery_start_task" --env-params="task=apps.project.tasks.do_the_thing"
Or create your own invocation, which could lead to something like this:
inv project.k8s_start_celery_task --task=apps.project.tasks.do_the_thing
Settings:
- `default_component` - default component (Default: `backend`)
- `default_entry` - default entry cmd (Default: `/cnb/lifecycle/launcher`)
- `default_command` - default cmd for the entry cmd (Default: `bash`); only used for `default_entry`
k8s.python-shell
Enter python shell inside k8s pod.
Settings:
- `default_component` - default component (Default: `backend`)
- `python_shell` - shell cmd (Default: `shell_plus`)
k8s.health-check
Check health of component.
Settings:
- `default_component` - default component (Default: `backend`)
- `health_check` - health check cmd (Default: `health_check`)
k8s.download-file
Download file from pod.
Settings:
- `default_component` - default component (Default: `backend`)
db-k8s
While you probably won't use this module directly, some other modules' commands use it (e.g. getting a remote db dump).
Make sure to set up these configs:
- `pod_namespace` - db namespace (REQUIRED)
- `pod_selector` - pod selector for db (REQUIRED)
db-k8s.create-dump
Execute dump command in db pod.
Settings:
- `pod_namespace` - db namespace (REQUIRED)
- `pod_selector` - pod selector for db (REQUIRED)
- `get_pod_name_command` - template for fetching the db pod (default located in `_config.py` > `K8SDBSettings`)
- `dump_filename_template` - template for dump filename (Default: `{project_name}-{env}-{timestamp:%Y-%m-%d}-db-dump.{extension}`)
- `dump_command` - dump command template (default located in `_config.py` > `K8SDBSettings`)
- `dump_dir` - folder where to put the dump file (Default: `tmp`)
- `dump_additional_params` - additional params for dump command (Default: empty)
- `dump_no_owner` - add `--no-owner` to dump command (Default: `True`)
- `dump_include_table` - add `--table={dump_include_table}` to dump command (Default: empty)
- `dump_exclude_table` - add `--exclude-table={dump_exclude_table}` to dump command (Default: empty)
- `dump_exclude_table_data` - add `--exclude-table-data={dump_exclude_table_data}` to dump command (Default: empty)
- `dump_exclude_extension` - add `--exclude-extension={dump_exclude_extension}` to dump command (Default: empty)
db-k8s.get-dump
Download db data from the db pod if it is present.
Settings:
- `pod_namespace` - db namespace (REQUIRED)
- `pod_selector` - pod selector for db (REQUIRED)
- `get_pod_name_command` - template for fetching the db pod (default located in `_config.py` > `K8SDBSettings`)
- `dump_filename_template` - template for dump filename (Default: `{project_name}-{env}-{timestamp:%Y-%m-%d}-db-dump.{extension}`)
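The `dump_filename_template` is a standard Python format string, including a strftime spec for `timestamp`. A quick illustration with made-up values:

```python
from datetime import datetime

# Default template from the settings above.
template = "{project_name}-{env}-{timestamp:%Y-%m-%d}-db-dump.{extension}"

# Illustrative values; in practice these come from the invocation config.
filename = template.format(
    project_name="my-project",
    env="dev",
    timestamp=datetime(2024, 1, 15),
    extension="sql",
)
print(filename)
# → my-project-dev-2024-01-15-db-dump.sql
```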
cruft
Cruft is a tool used to synchronize changes with cookiecutter based boilerplates.
cruft.check-for-cruft-files
Check that there are no cruft files (*.rej).
cruft.create_project
Not an invocation, but a shortcut for creating cruft projects for testing boilerplates.
poetry
poetry.install
Install dependencies via poetry.
poetry.update
Update dependencies with respect to version constraints using the poetry up plugin.
Falls back to poetry update in case of an error.
poetry.update-to-latest
Update dependencies to the latest versions using the poetry up plugin.
By default falls back to the update task in case of an error.
Use --no-fallback to stop on error.
uv
uv.install
Install dependencies via uv.
uv.update
Update dependencies via uv.
pip
pip.install
Install dependencies via pip.
Settings:
- `dependencies_folder` - path to folder with dependencies files (Default: `requirements`)
pip.compile
Compile dependencies via pip-compile.
Settings:
- `dependencies_folder` - path to folder with dependencies files (Default: `requirements`)
- `in_files` - sequence of `.in` files (Default: `("production.in", "development.in")`)
mypy
mypy.run
Run mypy in path with params.
Settings:
- `mypy_entry` - python entry command (Default: `-m mypy`)
pytest
pytest.run
Run pytest in path with params.
Settings:
- `pytest_entry` - python entry command (Default: `-m pytest`)
secrets
secrets.setup-env-credentials
Fill the specified credentials in your file from k8s.
This invocation downloads a .env file from a pod in k8s.
It will replace the specified credentials (`--credentials`) in
the specified .env file (`--env_file_path`, `.env` by default).
Requires python-decouple
Settings for k8s:
- `secret_file_path_in_pod` - path to secret in pod (REQUIRED)
- `temp_secret_file_path` - path for temporary file (Default: `.env.to_delete`)