Compile multiple requirements files to lock dependency versions
Installation
pip install pip-compile-multi
Basic Usage
pip-compile-multi
Example scenario
I will start from the very basics of dependency management and go very slowly, so if you feel bored, just scroll to the next section.
Suppose you have a Python project with the following direct dependencies:
click
pip-tools
(Yes, I took pip-compile-multi itself as an example.) Let's save them as-is in requirements/base.in. These libraries are unpinned, which means that whenever a developer runs
pip install -r requirements/base.in
they will get some version of these libraries. Chances are that if several developers do the same over a period of time, some will end up with different dependency versions than others. Also, if the project is an online service, it may one day stop working after a redeployment because one of the dependencies had a backward-incompatible release. Such releases are more common than a newcomer might think; more or less every package with a version higher than 2.0.0 has had one.
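The drift described above can be sketched in a few lines of Python. This is a minimal model of version resolution, not pip's real resolver, and the version lists are hypothetical:

```python
# Minimal sketch: an unpinned requirement resolves to the newest available
# version at install time, so two installs at different times can diverge.
# A "==" pin always resolves to the same version.

def resolve(requirement, available_versions):
    """Pick the newest available version satisfying the requirement."""
    name, _, pin = requirement.partition("==")
    candidates = [v for v in available_versions if not pin or v == pin]
    return max(candidates, key=lambda v: tuple(map(int, v.split("."))))

# Developer A installs in January; developer B installs after a new release.
january = ["6.6", "6.7"]
june = ["6.6", "6.7", "7.0"]

print(resolve("click", january))    # 6.7
print(resolve("click", june))       # 7.0 -- the environments diverge
print(resolve("click==6.7", june))  # 6.7 -- the pin keeps them identical
```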
To avoid this problem, Python developers hard-pin (aka lock) their dependencies. So instead of a list of libraries, they have something like:
click==6.7
pip-tools==1.11.0
(To keep things neat, let's put this into requirements/base.txt.) That's good for a start. But there are two important drawbacks:
Developers have to perform non-trivial operations if they want to keep up with newer versions (which bring bug fixes and performance improvements).
Indirect dependencies (that is, dependencies of dependencies) may still have backward-incompatible releases that break everything.
Let's put point 1 aside and fight point 2. Let's run
pip freeze > requirements/base.txt
Now we have the full hierarchy of dependencies hard-pinned:
click==6.7
first==2.0.1
pip-tools==1.11.0
six==1.11.0
That's great, and it solves the main problem: the service will be deployed exactly [1] the same way every single time, and all developers will have identical environments.
This case is so common that there is already a number of tools to solve it. Two worth mentioning are:
Pip Tools - a mature package, which pip-compile-multi builds upon.
PipEnv - a fresh approach that may one day become the standard Python way of locking dependencies.
But what if the project uses some packages that are not required by the service itself? For example, pytest, which is needed to run unit tests but should never be deployed to a production site. Or flake8, a syntax-checking tool. If they are installed in the current virtual environment, they will get into the pip freeze output. That's no good, and removing them manually from requirements/base.txt is not an option. Still, these packages must be pinned, to ensure that tests run the same way for the whole team (and the build server).
So let’s get hands dirty and put all the testing stuff into requirements/test.in:
-r base.in
prospector
pylint
flake8
mock
six
Note how I put -r base.in at the beginning, so that test dependencies are installed along with the base ones.
Now the installation command is
pip install -r requirements/test.in
Just this once (to show how unmanageable this task is), let's compose requirements/test.txt manually. After installation, run pip freeze to get the whole list of locked packages:
$ pip freeze
astroid==1.6.0
click==6.7
dodgy==0.1.9
first==2.0.1
flake8==3.5.0
flake8-polyfill==1.0.2
isort==4.2.15
lazy-object-proxy==1.3.1
mccabe==0.6.1
mock==2.0.0
pbr==3.1.1
pep8-naming==0.5.0
pip-tools==1.11.0
prospector==0.12.7
pycodestyle==2.0.0
pydocstyle==2.1.1
pyflakes==1.6.0
pylint==1.8.1
pylint-celery==0.3
pylint-common==0.2.5
pylint-django==0.7.2
pylint-flask==0.5
pylint-plugin-utils==0.2.6
PyYAML==3.12
requirements-detector==0.5.2
setoptconf==0.2.0
six==1.11.0
snowballstemmer==1.2.1
wrapt==1.10.11
Wow! That’s quite a list! But we remember what goes into base.txt:
click
first
pip-tools
six
Good; everything else can go into requirements/test.txt. But wait: six is included in test.in and is missing from test.txt. That feels wrong… Ah, it's because we've moved six to base.txt. It's good that we didn't forget that it belongs in base. We might forget next time, though.
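The manual bookkeeping just described is a plain set subtraction: take the pip freeze output and remove everything that already lives in base.txt. A sketch, using a subset of the freeze output above:

```python
# Manual split: whatever `pip freeze` reports, minus the packages already
# locked in base.txt, is what belongs in test.txt.
frozen = {
    "click": "6.7",
    "first": "2.0.1",
    "mock": "2.0.0",
    "pip-tools": "1.11.0",
    "pylint": "1.8.1",
    "six": "1.11.0",
}
base = {"click", "first", "pip-tools", "six"}

test = {name: version for name, version in frozen.items() if name not in base}
print(sorted(test))  # ['mock', 'pylint'] -- six stays in base.txt only
```

Doing this by hand after every upgrade is exactly the tedium the tool removes.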
Why don’t we automate it? That’s what pip-compile-multi is for.
Managing dependency versions in multiple environments
Let's recap: the example service has two groups of dependencies (or, as I call them, environments):
$ cat requirements/base.in
click
pip-tools
$ cat requirements/test.in
-r base.in
prospector
pylint
flake8
mock
six
To make automation even more appealing, let's add one more environment. I'll call it local: things that are needed during development but are not required by the tests or by the service itself.
$ cat requirements/local.in
-r test.in
tox
Now we want to put all base dependencies, along with all their recursive dependencies, into base.txt; all recursive test dependencies except those already in base into test.txt; and all recursive local dependencies except those already in base and test into local.txt.
$ pip-compile-multi
INFO:pip-compile-multi:Locking requirements/base.in to requirements/base.txt
INFO:pip-compile-multi:Locking requirements/test.in to requirements/test.txt
INFO:pip-compile-multi:Locking requirements/local.in to requirements/local.txt
Yes, that's right. All the tedious dependency-version management done with a single command that doesn't even need options.
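The splitting rule can be modeled in a few lines. This is a toy model, not pip-compile-multi's actual code, and the package dependency graph below is hypothetical; but it shows why six lands only in base.txt even though test.in lists it:

```python
# Toy model of the splitting rule: each environment's .txt file gets the
# environment's full recursive closure, minus every package already locked
# by the environments it includes via -r.

# Hypothetical dependency graph (package -> its own dependencies).
package_deps = {
    "click": set(),
    "pip-tools": {"click", "first", "six"},
    "first": set(),
    "six": set(),
    "prospector": {"pylint"},
    "pylint": {"six"},
    "tox": {"six"},
}

# Environment -> (-r references, direct dependencies), as in the .in files.
envs = {
    "base": ([], {"click", "pip-tools"}),
    "test": (["base"], {"prospector", "pylint", "six"}),
    "local": (["test"], {"tox"}),
}

def expand(packages):
    """Transitive closure over the package dependency graph."""
    seen, stack = set(), list(packages)
    while stack:
        pkg = stack.pop()
        if pkg not in seen:
            seen.add(pkg)
            stack.extend(package_deps[pkg])
    return seen

def closure(env):
    """Every package an environment pulls in, including via -r."""
    refs, deps = envs[env]
    result = expand(deps)
    for ref in refs:
        result |= closure(ref)
    return result

def lock(env):
    """Contents of requirements/<env>.txt: closure minus referenced closures."""
    refs, _ = envs[env]
    result = closure(env)
    for ref in refs:
        result -= closure(ref)
    return sorted(result)

print(lock("base"))   # ['click', 'first', 'pip-tools', 'six']
print(lock("test"))   # ['prospector', 'pylint'] -- six already lives in base
print(lock("local"))  # ['tox']
```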
Now you can run git diff to review the changes and git commit to save them. To install the new set of versions run:
pip install -Ur requirements/local.txt
It's a perfect time to run all the tests and make sure the updates were backward-compatible enough for your needs. More often than I'd like in big projects, they are not. Say a new version of pylint dropped support for an old Python version that you still need to support. Then you open test.in and soft-pin it with a descriptive comment:
$ cat requirements/test.in
-r base.in
prospector
pylint<1.8 # Newer versions dropped support for Python 2.4
flake8
mock
six
I know this example is made up, but you get the idea. Then re-run pip-compile-multi to compile a new test.txt and check the new set.
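What the soft pin buys you is worth spelling out: pylint<1.8 still lets the compiler pick up new bug-fix releases in the 1.7.x line, while shutting out the breaking 1.8 series. A tiny illustration with a hypothetical version list (real pip uses richer version semantics, e.g. pre-releases):

```python
# A "<" soft pin filters out the breaking series but still allows newer
# bug-fix releases below the bound to be selected.
def satisfies(version, upper_bound):
    """True if version is strictly below the bound (numeric parts only)."""
    as_tuple = lambda v: tuple(map(int, v.split(".")))
    return as_tuple(version) < as_tuple(upper_bound)

available = ["1.7.4", "1.7.5", "1.8.1"]
allowed = [v for v in available if satisfies(v, "1.8")]
newest = max(allowed, key=lambda v: tuple(map(int, v.split("."))))
print(newest)  # 1.7.5 -- fresh fixes, no breaking 1.8.x release
```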
Benefits of using pip-compile-multi
I want to summarise why you should start using pip-compile-multi. Some of the benefits are achievable with other methods, but here is the general picture:
Production will not suddenly break after a redeployment because of a backward-incompatible dependency release.
The whole team uses the same package versions and sees the same outcomes. No more "works for me" and "I cannot reproduce this" [2].
The service still uses the most recent versions of packages. And fresh means best here.
Dependencies are upgraded when the time is suitable for the service, not whenever they are released.
Different environments are separated into different files.
*.in files are small and manageable because they store only direct dependencies.
*.txt files are exhaustive and precise (but you don’t need to edit them).
Have a question? Need a feature? Feel free to open an issue on GitHub.
[1] That's not entirely true: someone could re-upload a broken package under an existing version on PyPI.
[2] Yeah, yeah, there are still a lot of ways to have these problems.
History
1.1.0 (2017-01-12)
Added files discovery.
1.0.0 (2017-01-11)
First release on PyPI.