Collection of invoke commands used by Saritasa
- Installation
- Configuration
- Modules
- printing
- system
- git
- pre-commit
- docker
- github-actions
- python
- django
- django.manage
- django.makemigrations
- django.migrate
- django.resetdb
- django.createsuperuser
- django.run
- django.shell
- django.dbshell
- django.recompile-messages
- django.show-urls
- django.load-db-dump
- django.backup-local-db
- django.backup-remote-db
- django.load-remote-db
- django.startapp
- django.wait-for-database
- fastapi
- alembic
- celery
- open-api
- db
- k8s
- db-k8s
- cruft
- poetry
- uv
- pip
- mypy
- pytest
- secrets
```shell
pip install saritasa-invocations
```

or if you are using poetry

```shell
poetry add saritasa-invocations
```

or if you are using uv

```shell
uv add saritasa-invocations
```

You can use uvx to use packages globally without installing them or activating virtualenvs.

```shell
uvx saritasa-invocations pre-commit.run-hooks
```

Or if you need extras:

```shell
uvx --from="saritasa-invocations[env_settings]" pre-commit.run-hooks
```

Or simply create an alias for simpler usage:

```shell
alias saritasa-inv="uvx saritasa-invocations"
saritasa-inv pre-commit.run-hooks
```

Configuration can be set in the `tasks.py` file.
Below is an example of config:
```python
import invoke

import saritasa_invocations

ns = invoke.Collection(
    saritasa_invocations.docker,
    saritasa_invocations.git,
    saritasa_invocations.github_actions,
    saritasa_invocations.pre_commit,
    saritasa_invocations.system,
)

# Configurations for run command
ns.configure(
    {
        "run": {
            "pty": True,
            "echo": True,
        },
        "saritasa_invocations": saritasa_invocations.Config(
            pre_commit=saritasa_invocations.PreCommitSettings(
                hooks=(
                    "pre-commit",
                    "pre-push",
                    "commit-msg",
                )
            ),
            git=saritasa_invocations.GitSettings(
                merge_ff="true",
                pull_ff="only",
            ),
            docker=saritasa_invocations.DockerSettings(
                main_containers=(
                    "opensearch",
                    "redis",
                ),
            ),
            system=saritasa_invocations.SystemSettings(
                vs_code_settings_template=".vscode/recommended_settings.json",
                settings_template="config/.env.local",
                save_settings_from_template_to="config/.env",
            ),
            # Default K8S settings shared between envs
            k8s_defaults=saritasa_invocations.K8SDefaultSettings(
                proxy="teleport.company.com",
                db_config=saritasa_invocations.K8SDBSettings(
                    namespace="db",
                    pod_selector="app=pod-selector-db",
                ),
            ),
        ),
    },
)

# For K8S settings you just need to create instances of K8SSettings for each
# environment. They will all be collected automatically.
saritasa_invocations.K8SSettings(
    name="dev",
    cluster="teleport.company.somewhere.com",
    namespace="project_name",
)
saritasa_invocations.K8SSettings(
    name="prod",
    cluster="teleport.client.somewhere.com",
    namespace="project_name",
    proxy="teleport.client.com",
)
```

While this module doesn't contain any invocations, it's used to print messages via `rich.panel.Panel`. There are three types:
- `print_success` - print message in a green panel
- `print_warning` - print message in a yellow panel
- `print_error` - print message in a red panel
Copies local template for settings into specified file
Settings:
- `settings_template` - path to settings template (Default: `config/settings/local.template.py`)
- `save_settings_from_template_to` - path where settings are saved (Default: `config/settings/local.py`)
Copies local template for vscode settings into .vscode folder
Settings:
- `vs_code_settings_template` - path to settings template (Default: `.vscode/recommended_settings.json`)
Change ownership of files to a user (the current user by default).
Shortcut for owning the apps dir by the specified user after files were generated using docker-compose (migrations, new app, etc.).
Create a folder for temporary files (`.tmp`).
Set a git setting in the config.
Perform git setup:
- Install pre-commit hooks
- Set merge.ff
- Set pull.ff
Settings:
- `merge_ff` - setting value for `merge.ff` (Default: `false`)
- `pull_ff` - setting value for `pull.ff` (Default: `only`)
Clone a repo or pull the latest changes into the specified repo.
Command for creating copies of a file while preserving git blame history.
Original script written in bash
Usage:
```shell
inv git.blame-copy <path to original file> <path to copy>,<path to copy>...
```

If `<path to copy>` is a file, the data will be copied into it.
If `<path to copy>` is a directory, the data will be copied into the provided directory under the original name.
Algorithm:
- Remember current HEAD state
- For each copy path: move the file to the copy path, restore the file using checkout, and remember the resulting commits
- Restore the state of the branch
- Move the file to a temp file
- Merge the copy commits into the branch
- Move the file back to its original path from the temp file
Settings:
- `copy_commit_template` - template for commits created during the command workflow
- `copy_init_message_template` - template for the init message printed at command start
Template variables:
- `action` - the copy algorithm consists of several intermediate actions (creating temporary files, merging commits, etc.); the `action` variable stores the header of the intermediate action
- `original_path` - contains the value of the first argument of the command (path of the original file that will be copied)
- `destination_paths` - sequence of paths to which the original file will be copied
- `project_task` - project task parsed from the current git branch; empty if no task is found in the branch
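These variables are filled in with Python's `str.format`. A quick sketch of rendering the default `copy_commit_template` (the paths and task ID below are made up):

```python
# Default commit-message template from the settings above.
copy_commit_template = (
    "[automated-commit]: {action}\n\n"
    "copy: {original_path}\n"
    "to:\n* {destination_paths}\n\n"
    "{project_task}"
)

# Hypothetical values: real ones come from the command arguments and branch name.
message = copy_commit_template.format(
    action="copy files",
    original_path="apps/users/models.py",
    destination_paths="apps/users/models_v2.py",
    project_task="TASK-123",
)
print(message)
```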
Default values for templates:
`copy_commit_template`:

```python
"[automated-commit]: {action}\n\n"
"copy: {original_path}\n"
"to:\n* {destination_paths}\n\n"
"{project_task}"
```

`copy_init_message_template`:

```python
"Copy {original_path} to:\n"
"* {destination_paths}\n\n"
"Count of created commits: {commits_count}"
```

Settings to run `.pre-commit-config.yaml`:

- `entry` - command used to run `pre-commit` or alternatives (Default: `"pre-commit"`)
- `default_hook_stage` - default hook stage to run hooks (Default: `"push"`)
Install git hooks via pre-commit.
Uninstall git hooks via pre-commit.
Run all hooks against all files.
Update pre-commit dependencies.
Build service image from docker compose
Build project via pack-cli
Settings:
- `buildpack_builder` - image tag of builder (Default: `paketobuildpacks/builder:base`)
- `buildpack_runner` - image tag of runner (Default: `paketobuildpacks/run:base`)
- `build_image_tag` - image tag for the built image (Default: name of project from `project_name`)
- `buildpack_requirements_path` - path to folder with requirements (Default: `requirements`)
Shortcut for stopping ALL running docker containers
Bring up main containers and start them.
Settings:
- `main_containers` - main containers of the project (Default: `["postgres", "redis"]`)
Stop main containers.
Settings:
- `main_containers` - main containers of the project (Default: `["postgres", "redis"]`)
Stop and remove all containers defined in docker-compose. Also remove images.
Add hosts to /etc/hosts.
Settings:
- `hosts` - hosts to add to `/etc/hosts` (Default: see `docker-main-containers`)
As of now we support two environments for python: local and docker.

- `local` is a python located in your current virtualenv
- `docker` is a python located inside the docker image of your service (`python_docker_service`)

This was done to have the ability to run code against an environment close to the deployed one, or simply to test it out.
Example of usage:

```shell
PYTHON_ENV=docker inv python.run --command="--version"
```

Run a python command depending on the `PYTHON_ENV` variable (`docker` or `local`).
Settings:
- `entry` - python entry command (Default: `python`)
- `docker_service` - python service name (Default: `web`)
- `docker_service_params` - params for docker (Default: `--rm`)
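These defaults can be overridden in `tasks.py` like the other settings. A sketch, assuming a `PythonSettings` class named by analogy with the settings classes shown in the Configuration example (check the package source for the exact name; the values below are made up):

```python
import saritasa_invocations

# Hypothetical override: use a specific interpreter and docker service.
saritasa_invocations.Config(
    python=saritasa_invocations.PythonSettings(
        entry="python3.12",
        docker_service="backend",
    ),
)
```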
Run `manage.py` with the specified command.
This command also handles starting required services and waiting for the DB to be ready.
Requires django_probes
Settings:
- `manage_file_path` - path to `manage.py` file (Default: `./manage.py`)
Run makemigrations command and chown created migrations (only for docker env).
Check whether there are new migrations. The result should be checked via the exit code.
Run migrate command.
Settings:
- `migrate_command` - migrate command (Default: `migrate`)
Reset database to initial state (including test DB).
Requires django-extensions
Settings:
- `settings_path` - default django settings (Default: `config.settings.local`)
Create superuser.
Settings:
- `default_superuser_email` - default email of superuser; if empty, will try to grab it from git config before resorting to the default (Default: `root@localhost`)
- `default_superuser_username` - default username of superuser; if empty, will try to grab it from git config before resorting to the default (Default: `root`)
- `default_superuser_password` - default password of superuser (Default: `root`)
- `verbose_email_name` - verbose name for `email` field (Default: `Email address`)
- `verbose_username_name` - verbose name for `username` field (Default: `Username`)
- `verbose_password_name` - verbose name for `password` field (Default: `Password`)
Note:

- Values for `verbose_email_name`, `verbose_username_name`, `verbose_password_name` should match the verbose names of the fields of the model that uses this setting.
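These defaults can be overridden in `tasks.py` like the other settings. A sketch, assuming a `DjangoSettings` class named by analogy with the settings classes in the Configuration example (check the package source for the exact name; the credentials below are made up):

```python
import saritasa_invocations

# Hypothetical override of the superuser defaults.
saritasa_invocations.Config(
    django=saritasa_invocations.DjangoSettings(
        default_superuser_email="admin@localhost",
        default_superuser_username="admin",
        default_superuser_password="admin",
    ),
)
```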
Run development web-server.
Settings:
- `runserver_docker_params` - params for docker (Default: `--rm --service-ports`)
- `runserver_command` - runserver command (Default: `runserver_plus`)
- `runserver_host` - host of server (Default: `0.0.0.0`)
- `runserver_port` - port of server (Default: `8000`)
- `runserver_params` - params for runserver command (Default: `""`)
Shortcut for manage.py shell command.
Settings:
- `shell_command` - command to start python shell (Default: `shell_plus --ipython`)
Open database shell with credentials from current django settings.
Generate and recompile translation messages.
Requires gettext
Settings:
- `makemessages_params` - params for makemessages command (Default: `--all --ignore venv`)
- `compilemessages_params` - params for compilemessages command (Default: `""`)
Show project urls, which can be filtered via a search parameter.
Reset db and load db dump.
Uses resetdb and load-db-dump
Settings:
- `django_settings_path` - default django settings (Default: `config.settings.local`)
Back up local db.
Uses backup_local_db
Settings:
- `settings_path` - default django settings (Default: `config.settings.local`)
Make dump of remote db and download it.
Uses create_dump and get-dump
It can use the usual django config, where every setting is stored in a separate variable, or a single variable with the full db url.

Settings:

- `settings_path` - default django settings (Default: `config.settings.local`)
- `remote_db_url_config_name` - name of config for db url (Default: `DATABASE_URL`)
- `remote_db_config_mapping` - mapping of db config (Default:)

```python
{
    "dbname": "RDS_DB_NAME",
    "host": "RDS_DB_HOST",
    "port": "RDS_DB_PORT",
    "username": "RDS_DB_USER",
    "password": "RDS_DB_PASSWORD",
}
```
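The mapping ties each logical connection field to the name of the setting it is read from. A rough sketch of how such a mapping resolves against environment-style settings (the variable values below are made up, not the package's actual code):

```python
import os

# Hypothetical environment, using the default variable names above.
os.environ.update({
    "RDS_DB_NAME": "app_db",
    "RDS_DB_HOST": "db.example.com",
    "RDS_DB_PORT": "5432",
    "RDS_DB_USER": "reader",
    "RDS_DB_PASSWORD": "secret",
})

remote_db_config_mapping = {
    "dbname": "RDS_DB_NAME",
    "host": "RDS_DB_HOST",
    "port": "RDS_DB_PORT",
    "username": "RDS_DB_USER",
    "password": "RDS_DB_PASSWORD",
}

# Resolve each logical field to the value of its mapped setting.
db_config = {field: os.environ[name] for field, name in remote_db_config_mapping.items()}
print(db_config["host"])  # db.example.com
```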
Make a dump of the remote db, download it, and apply it to the local db.
Uses create_dump, get-dump and load-db-dump.
Settings:
- `settings_path` - default django settings (Default: `config.settings.local`)
Create django app from a template using copier.
Requires uv: installation docs
Settings:
- `app_boilerplate_link` - link to app template
- `apps_path` - path to apps folder in project (Default: `apps`)
Launch docker compose and wait for database connection.
Run development web-server.
Settings:
- `docker_params` - params for docker (Default: `--rm --service-ports`)
- `uvicorn_command` - uvicorn command (Default: `-m uvicorn`)
- `app` - path to fastapi app (Default: `config:fastapi_app`)
- `host` - host of server (Default: `0.0.0.0`)
- `port` - port of server (Default: `8000`)
- `params` - params for uvicorn (Default: `--reload`)
Run alembic command
Settings:
- `command` - alembic command (Default: `-m alembic`)
- `connect_attempts` - number of attempts to connect to database (Default: `10`)
Generate migrations
Settings:
- `migrations_folder` - migration files location (Default: `db/migrations/versions`)
Upgrade database
Downgrade database
Check if there are any missing migrations to be generated.
Check migration files for adjust messages
Settings:
- `migrations_folder` - migration files location (Default: `db/migrations/versions`)
- `adjust_messages` - list of alembic adjust messages (Default: `# ### commands auto generated by Alembic - please adjust! ###`, `# ### end Alembic commands ###`)
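The check is essentially a scan of the migration files for the adjust markers. A minimal sketch of that idea (not the package's actual implementation):

```python
from pathlib import Path

# Default adjust markers from the settings above.
ADJUST_MESSAGES = (
    "# ### commands auto generated by Alembic - please adjust! ###",
    "# ### end Alembic commands ###",
)

def find_unadjusted_migrations(migrations_folder: str) -> list[str]:
    """Return migration files that still contain Alembic's auto-generated markers."""
    return sorted(
        path.name
        for path in Path(migrations_folder).glob("*.py")
        if any(marker in path.read_text() for marker in ADJUST_MESSAGES)
    )
```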
Reset db and load db dump.
Uses downgrade and load-db-dump
Requires python-decouple
Installed with [env_settings]
Settings:
- `db_config_mapping` - mapping of db config (Default:)

```python
{
    "dbname": "rds_db_name",
    "host": "rds_db_host",
    "port": "rds_db_port",
    "username": "rds_db_user",
    "password": "rds_db_password",
}
```
Back up local db.
Uses backup_local_db
Requires python-decouple
Installed with [env_settings]
Settings:
- `db_config_mapping` - mapping of db config (Default:)

```python
{
    "dbname": "rds_db_name",
    "host": "rds_db_host",
    "port": "rds_db_port",
    "username": "rds_db_user",
    "password": "rds_db_password",
}
```
Make dump of remote db and download it.
Uses create_dump and get-dump
Requires python-decouple
Installed with [env_settings]
Settings:
- `db_config_mapping` - mapping of db config (Default:)

```python
{
    "dbname": "rds_db_name",
    "host": "rds_db_host",
    "port": "rds_db_port",
    "username": "rds_db_user",
    "password": "rds_db_password",
}
```
Make dump of remote db and download it and apply to local db.
Uses create-dump and get-dump and load-db-dump
Requires python-decouple
Installed with [env_settings]
Settings:
- `db_config_mapping` - mapping of db config (Default:)

```python
{
    "dbname": "rds_db_name",
    "host": "rds_db_host",
    "port": "rds_db_port",
    "username": "rds_db_user",
    "password": "rds_db_password",
}
```
Launch docker compose and wait for database connection.
Start celery worker.
Settings:
- `app` - path to app (Default: `config.celery.app`)
- `scheduler` - scheduler (Default: `django`)
- `loglevel` - log level for celery (Default: `info`)
- `extra_params` - extra params for worker (Default: `("--beat",)`)
- `local_cmd` - command for celery (Default: `celery --app {app} worker --scheduler={scheduler} --loglevel={loglevel} {extra_params}`)
- `service_name` - name of celery service (Default: `celery`)
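`local_cmd` is a plain format template. A quick sketch of how it expands, assuming each placeholder maps to the setting of the same name (defaults shown, values interchangeable):

```python
# Template mirroring the default local_cmd; placeholders are filled from settings.
local_cmd = (
    "celery --app {app} worker "
    "--scheduler={scheduler} --loglevel={loglevel} {extra_params}"
)

cmd = local_cmd.format(
    app="config.celery.app",
    scheduler="django",
    loglevel="info",
    extra_params="--beat",
)
print(cmd)
```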
Send task to celery worker.
Settings:
- `app` - path to app (Default: `config.celery.app`)
Check that the generated open_api spec is valid. This command uses drf-spectacular and its default validator. It creates a spec file in the `./tmp` folder and then validates it.
Load db dump to local db.
Settings:
- `load_dump_command` - template for load command (Default located in `_config.py` > `DBSettings`)
- `dump_filename` - filename for dump (Default: `local_db_dump`)
- `load_additional_params` - additional params for load command (Default: `--quite`)
Back up local db.
Settings:
- `dump_command` - template for dump command (Default located in `_config.py` > `DBSettings`)
- `dump_filename` - filename for dump (Default: `local_db_dump`)
- `dump_additional_params` - additional params for dump command (Default: ``)
- `dump_no_owner` - add `--no-owner` to dump command (Default: `True`)
- `dump_include_table` - add `--table={dump_include_table}` to dump command (Default: ``)
- `dump_exclude_table` - add `--exclude-table={dump_exclude_table}` to dump command (Default: ``)
- `dump_exclude_table_data` - add `--exclude-table-data={dump_exclude_table_data}` to dump command (Default: ``)
- `dump_exclude_extension` - add `--exclude-extension={dump_exclude_extension}` to dump command (Default: ``)
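Most of these settings simply toggle or parameterize pg_dump flags. A rough sketch of how the optional flags could be assembled (not the package's actual code):

```python
def build_dump_flags(
    dump_no_owner: bool = True,
    dump_include_table: str = "",
    dump_exclude_table: str = "",
    dump_exclude_table_data: str = "",
    dump_exclude_extension: str = "",
) -> str:
    """Assemble optional pg_dump flags from the settings described above."""
    flags = []
    if dump_no_owner:
        flags.append("--no-owner")
    if dump_include_table:
        flags.append(f"--table={dump_include_table}")
    if dump_exclude_table:
        flags.append(f"--exclude-table={dump_exclude_table}")
    if dump_exclude_table_data:
        flags.append(f"--exclude-table-data={dump_exclude_table_data}")
    if dump_exclude_extension:
        flags.append(f"--exclude-extension={dump_exclude_extension}")
    return " ".join(flags)

print(build_dump_flags(dump_exclude_table="django_session"))
```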
For K8S settings you just need to create instances of `K8SSettings` for each environment. They will all be collected automatically.
Login into k8s via teleport.
Settings:
- `proxy` - teleport proxy (REQUIRED)
- `cluster` - kube cluster (Default: uses value from `proxy`)
- `port` - teleport port (Default: `443`)
- `auth` - teleport auth method (Default: `github`)
Set k8s context to current project. By default uses dev environment.
Settings:
- `namespace` - namespace for k8s (Default: name of project from `project_name`)
- `context` - name of context (REQUIRED)
Get logs for k8s pod
Settings:
- `default_component` - default component (Default: `backend`)
Get pods from k8s.
Execute command inside k8s pod.
Say you have a Procfile with this entry:

```
celery_start_task: celery --app config.celery:app call ${task}
```

Then you can make this invocation:

```shell
inv k8s.execute --entry="celery_start_task" --env-params="task=apps.project.tasks.do_the_thing"
```

Or create your own invocation, which could lead to something like this:

```shell
inv project.k8s_start_celery_task --task=apps.project.tasks.do_the_thing
```

Settings:

- `default_component` - default component (Default: `backend`)
- `default_entry` - default entry cmd (Default: `/cnb/lifecycle/launcher`)
- `default_command` - default cmd for entry cmd (Default: `bash`); only used for `default_entry`
Enter python shell inside k8s pod.
Settings:
- `default_component` - default component (Default: `backend`)
- `python_shell` - shell cmd (Default: `shell_plus`)
Check health of component.
Settings:
- `default_component` - default component (Default: `backend`)
- `health_check` - health check cmd (Default: `health_check`)
Download file from pod.
Settings:

- `default_component` - default component (Default: `backend`)
While you probably won't use this module directly, commands from other modules use it (e.g. getting a remote db dump).
Make sure to set up these configs:
- `pod_namespace` - db namespace (REQUIRED)
- `pod_selector` - pod selector for db (REQUIRED)
Execute dump command in db pod.
Settings:
- `pod_namespace` - db namespace (REQUIRED)
- `pod_selector` - pod selector for db (REQUIRED)
- `get_pod_name_command` - template for fetching db pod (Default located in `_config.py` > `K8SDBSettings`)
- `dump_filename_template` - template for dump filename (Default: `{project_name}-{env}-{timestamp:%Y-%m-%d}-db-dump.{extension}`)
- `dump_command` - dump command template (Default located in `_config.py` > `K8SDBSettings`)
- `dump_dir` - folder where to put the dump file (Default: `tmp`)
- `dump_additional_params` - additional params for dump command (Default: ``)
- `dump_no_owner` - add `--no-owner` to dump command (Default: `True`)
- `dump_include_table` - add `--table={dump_include_table}` to dump command (Default: ``)
- `dump_exclude_table` - add `--exclude-table={dump_exclude_table}` to dump command (Default: ``)
- `dump_exclude_table_data` - add `--exclude-table-data={dump_exclude_table_data}` to dump command (Default: ``)
- `dump_exclude_extension` - add `--exclude-extension={dump_exclude_extension}` to dump command (Default: ``)
Download db data from the db pod if it is present.
Settings:
- `pod_namespace` - db namespace (REQUIRED)
- `pod_selector` - pod selector for db (REQUIRED)
- `get_pod_name_command` - template for fetching db pod (Default located in `_config.py` > `K8SDBSettings`)
- `dump_filename_template` - template for dump filename (Default: `{project_name}-{env}-{timestamp:%Y-%m-%d}-db-dump.{extension}`)
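`dump_filename_template` is a `str.format` template where `timestamp` is formatted with a datetime format spec. A quick sketch of how the default renders (the project name, env, and date below are made up):

```python
import datetime

# Default template from the settings above.
dump_filename_template = "{project_name}-{env}-{timestamp:%Y-%m-%d}-db-dump.{extension}"

filename = dump_filename_template.format(
    project_name="project",
    env="dev",
    timestamp=datetime.datetime(2024, 1, 2),
    extension="sql",
)
print(filename)  # project-dev-2024-01-02-db-dump.sql
```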
Cruft is a tool used to synchronize changes with cookiecutter based boilerplates.
Check that there are no cruft files (*.rej).
Not an invocation, but a shortcut for creating cruft projects for testing boilerplates.
Install dependencies via poetry.
Update dependencies with respect to version constraints using the poetry up plugin.
Falls back to poetry update in case of an error.
Update dependencies to the latest versions using the poetry up plugin.
By default falls back to the update task in case of an error.
Use `--no-fallback` to stop on error.
Install dependencies via uv.
Update dependencies via uv.
Install dependencies via pip.
Settings:
- `dependencies_folder` - path to folder with dependencies files (Default: `requirements`)
Compile dependencies via pip-compile.
Settings:
- `dependencies_folder` - path to folder with dependencies files (Default: `requirements`)
- `in_files` - sequence of `.in` files (Default: `"production.in", "development.in"`)
Run mypy in path with params.
Settings:
- `mypy_entry` - python entry command (Default: `-m mypy`)
Run pytest in path with params.
Settings:
- `pytest_entry` - python entry command (Default: `-m pytest`)
Fill specified credentials in your file from k8s.
This invocation downloads a `.env` file from a pod in k8s.
It will replace the specified credentials (`--credentials`) in the specified `.env` file (`--env_file_path`, `.env` by default).
Requires python-decouple
Settings for k8s:
- `secret_file_path_in_pod` - path to secret in pod (REQUIRED)
- `temp_secret_file_path` - path for temporary file (Default: `.env.to_delete`)