Python requirements compiler
Project description
New in version 1.0.0:
--hashes option for hashing the distributions used in a solution.
--urls option to dump the URL of the distribution downloaded for the solution into a comment.
--multiline and --no-multiline options to change how the solution is printed. When not provided, the mode is automatically selected.
--no-explanations option to omit the constraint explanations from the comment following the pin in a solution.
req-candidates now starts printing candidates at matching versions to give a better view of what is available on the index.
--only-binary <project> option to force selecting wheels for the provided projects. Passing :all: will force req-compile to only use wheels.
--extra-index-url option to allow using the system index as the primary with extra indexes as supplements.
Allow requirements files to use --index-url and --extra-index-url directives.
--index-url and --extra-index-url are now emitted to the output if they differ from the system default.
Download setup requirements to a provided wheel directory, even though they aren’t necessarily in the solution. This allows for fully offline source distribution installs via pip install <project> --no-index --find-links .wheeldir when all dependent projects correctly declare their setup requirements.
Improved errors when passing invalid requirements files.
New in version 0.10.21:
Allow for setuptools backend projects specified only by pyproject.toml and setup.cfg
Req-Compile Python Requirements Compiler
Req-Compile is a Python requirements compiler geared toward large Python projects. It allows you to:
Produce an output file consisting of fully constrained exact versions of your requirements
Identify sources of constraints on your requirements
Constrain your output requirements using requirements that will not be included in the output
Save distributions that are downloaded while compiling in a configurable location
Use a current solution as a source of requirements. In other words, you can easily compile a subset from an existing solution.
Why use it?
pip and pip-tools are missing features and lack usability for some important workflows:
Using a previous solution as an input file to avoid hitting the network
pip-compile can’t consider constraints that are not included in the final output. While pip accepts a constraints file, there is no way to stop at the “solving” phase, which would be used to push a fully solved solution to your repo
Track down where conflicting constraints originate
Treating source directories recursively as sources of requirements, like with --find-links
Configuring a storage location for downloaded distributions. Finding a fresh solution to a set of input requirements always requires downloading distributions
A common workflow that is difficult to achieve with other tools:
You have a project with requirements requirements.txt and test requirements test-requirements.txt. You want to produce a fully constrained output of requirements.txt to use to deploy your application. Easy, right? Just compile requirements.txt. However, if your test requirements will in any way constrain packages you need, even those needed transitively, it means you will have tested with different versions than you’ll ship.
For this reason, you can use Req-Compile to compile requirements.txt using test-requirements.txt as constraints.
The Basics
Install and run
Req-Compile can be simply installed by running:
```
pip install req-compile
```
Two entrypoint scripts are provided:
```
req-compile <input reqfile1> ... <input_reqfileN> [--constraints constraint_file] [repositories, such as --index-url https://...]
req-candidates [requirement] [repositories, such as --index-url https://...]
```
Producing output requirements
To produce a fully constrained set of requirements for a given number of input requirements files, pass requirements files to req-compile:
```
> cat requirements.txt
astroid >= 2.0.0
isort >= 4.2.5
mccabe

> req-compile requirements.txt
astroid==2.9.0            # requirements.txt (>=2.0.0)
isort==5.10.1             # requirements.txt (>=4.2.5)
lazy-object-proxy==1.7.1  # astroid (>=1.4.0)
mccabe==0.6.1             # requirements.txt
setuptools==60.0.1        # astroid (>=20.0)
typed-ast==1.5.1          # astroid (<2.0,>=1.4.0)
typing_extensions==4.0.1  # astroid (>=3.10)
wrapt==1.13.3             # astroid (<1.14,>=1.11)
```
Output is always emitted to stdout. Possible inputs include:
```
> req-compile .                               # Compiles the current directory (looks for a setup.py or pyproject.toml)
> req-compile subdir/project                  # Compiles the project in the subdir/project directory
> req-candidates --paths-only | req-compile   # Search for candidates and compile them, piped in via stdin
> echo flask | req-compile                    # Compile the requirement 'flask' using the default remote index (PyPI)
> req-compile . --extra test                  # Compiles the current directory with the extra "test"
```
Specifying source of distributions
Req-Compile supports obtaining Python distributions from multiple sources, each of which can be specified more than once. These are referred to as repositories. If a candidate can be found in a provided solution or source directory, the remaining repositories are not consulted. This is important for "lazy" requirement updates (updating only what is necessary to find a solution while otherwise keeping the existing solution) and ensures that source directories take precedence over remote repositories.
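The precedence rule above amounts to a first-match search across repositories in order. A minimal sketch of the idea, where the dict-based repositories and the `find_candidate` function are illustrative stand-ins, not Req-Compile's internals:

```python
# Illustrative sketch of first-match repository resolution.
# The dicts below stand in for repositories; this is NOT Req-Compile's API.
def find_candidate(project, repositories):
    """Return the first candidate found, honoring repository order."""
    for repo in repositories:
        candidate = repo.get(project)
        if candidate is not None:
            return candidate  # later repositories are never consulted
    return None

# A previous solution takes precedence over the remote index:
solution = {"flask": "flask==2.0.1 (from solution)"}
index = {"flask": "flask==3.0.0 (from index)", "click": "click==8.1.7 (from index)"}

print(find_candidate("flask", [solution, index]))  # the solution's pin wins
print(find_candidate("click", [solution, index]))  # falls through to the index
```

Because the solution repository is searched first, an existing pin for flask shadows the newer version on the index, which is exactly what makes lazy updates possible.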
The following repositories can be specified:
--solution
Load a previous solution and use it as a source of distributions. This allows a full recompilation of a working solution without requiring any other source. If the solution file can't be found, a warning will be emitted, but it will not cause a failure.
--source
Use a local filesystem tree of Python source packages to compile from. The entire tree under the source directory is searched, stopping whenever an __init__.py is reached. --remove-source can be supplied to remove results that were obtained from source directories; you may want to do this when compiling for a project where only the third-party compilation results need to be saved.
--find-links
Read distributions from a local directory. The directory can contain anything a remote index would: wheels, zips, and source tarballs. This option matches pip's command line.
--index-url
URL of a remote index to search for packages. When compiling, it's necessary to download a package to determine its requirements; --wheel-dir can be supplied to specify where to save these distributions, otherwise they will be deleted after compilation is complete. When specified, this replaces the default index located in pip.conf/pip.ini on your system.
--extra-index-url
Extra remote index to search. Same semantics as --index-url, but searched afterward. Additionally, it does not replace the default index URL, so it can be used as a supplemental source of requirements without knowing (or recording in the solution) the default index URL.
All options can be repeated multiple times; the resolution order among solutions and source directories matches the order passed on the command line.
By default, PyPI (https://pypi.org/), or the default pip index, is added as a repository. It can be removed by passing --no-index on the command line, or replaced by passing a different index via --index-url.
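As noted in the changelog, requirements files themselves may carry --index-url and --extra-index-url directives. A hypothetical fragment (the URLs here are placeholders, not real indexes):

```
--index-url https://pypi.example.com/simple
--extra-index-url https://internal.example.com/simple
flask>=2.0
```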
Identifying source of constraints
Why did I just get version 1.11.0 of six? Find out by examining the output:
```
six==1.11.0  # astroid, pathlib2, pymodbus (==1.11.0), pytest (>=1.10.0), more_itertools (<2.0.0,>=1.0.0)
```
In the above output, the (==1.11.0) indicates that pymodbus, the requirement name listed before the parenthesis, specifically requested version 1.11.0 of six.
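The comment format is regular enough to parse mechanically. A rough sketch in Python (`parse_solution_line` is a hypothetical helper written for illustration, not part of Req-Compile's API):

```python
import re

def parse_solution_line(line):
    """Split a solution line into (project, version, constrainers).

    Each constrainer is a (source, constraint_or_None) pair; a None
    constraint means the source imposed no version restriction.
    Hypothetical helper, not Req-Compile's own parser.
    """
    req, _, comment = line.partition("#")
    project, version = req.strip().split("==")
    constrainers = []
    # Split the comment on commas, but not on commas inside parentheses
    # (a single constrainer may carry a compound constraint).
    for entry in re.split(r",\s*(?![^()]*\))", comment.strip()):
        match = re.match(r"([^\s(]+)(?:\s*\(([^)]*)\))?", entry)
        if match:
            constrainers.append((match.group(1), match.group(2)))
    return project, version, constrainers

line = "six==1.11.0  # astroid, pathlib2, pymodbus (==1.11.0), pytest (>=1.10.0), more_itertools (<2.0.0,>=1.0.0)"
project, version, constrainers = parse_solution_line(line)
print(project, version)
print(constrainers)
```

Running this on the line above recovers pymodbus as the source of the exact ==1.11.0 pin, with the other projects contributing looser (or no) constraints.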
Constraining output
Constrain production outputs with test requirements using the --constraints flag. More than one file can be passed:
```
> cat requirements.txt
astroid

> cat test-requirements.txt
pylint<1.6

> req-compile requirements.txt --constraints test-requirements.txt
astroid==1.4.9            # pylint (<1.5.0,>=1.4.5), requirements.txt
lazy-object-proxy==1.7.1  # astroid
six==1.16.0               # astroid, pylint
wrapt==1.13.3             # astroid
```
Note that astroid is constrained by pylint, even though pylint is not included in the output.
If a passed constraints file is fully pinned, Req-Compile will not attempt to find a solution for the requirements passed in the constraints files. This behavior only occurs if ALL of the requirements listed in the constraints files are pinned. This is because pinning a single requirement may still bring in transitive requirements that would affect the final solution. The heuristic of checking that all requirements are pinned assumes that you are providing a full solution.
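The "fully pinned" check amounts to verifying that every requirement across the constraint files carries an exact pin. A simplified sketch of the heuristic (illustrative only, not Req-Compile's actual implementation, which must also handle markers and other requirement syntax):

```python
def all_pinned(requirement_lines):
    """Return True when every non-comment requirement is an exact == pin.

    Simplified illustration of the 'fully pinned constraints' heuristic;
    comments, blank lines, and option lines (--index-url etc.) are skipped.
    """
    reqs = [line.partition("#")[0].strip() for line in requirement_lines]
    reqs = [r for r in reqs if r and not r.startswith("--")]
    return all("==" in r for r in reqs)

full_solution = ["astroid==2.9.0  # requirements.txt", "wrapt==1.13.3  # astroid"]
loose = ["astroid>=2.0.0", "wrapt==1.13.3"]

print(all_pinned(full_solution))  # a previous solution: taken as-is
print(all_pinned(loose))          # one loose constraint: re-solve everything
```

A single non-pinned requirement is enough to flip the result, matching the all-or-nothing behavior described above.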
Advanced Features
Compiling a constrained subset
Input can be supplied via stdin as well as through files. For example, to use a full solution from a second compilation to obtain a subset of requirements, the following command line might be used:

```
> req-compile requirements.txt --constraints compiled-requirements.txt
```

or, for example, to consider two projects together:

```
> req-compile /some/other/project /myproject | req-compile /myproject --solution -
```

which is equivalent to:

```
> req-compile /myproject --constraints /some/other/project
```
Resolving constraint conflicts
Conflicts will automatically print the source of each conflicting requirement:
```
> cat projectreqs.txt
astroid<1.6
pylint>=1.5

> req-compile projectreqs.txt
No version of astroid could possibly satisfy the following requirements (astroid<1.6,<3,>=2.3.0):
  projectreqs.txt -> astroid<1.6
  projectreqs.txt -> pylint 2.4.1 -> astroid<3,>=2.3.0
```
Saving distributions
Files downloaded during the compile process can be saved for later installation. This can reduce build times when a separate compile step is required:
```
> req-compile projectreqs.txt --wheel-dir .wheeldir > compiledreqs.txt
> pip install -r compiledreqs.txt --find-links .wheeldir --no-index
```
Cookbook
Some useful patterns for projects are outlined below.
Compile, then install
After requirements are compiled, the usual next step is to install them into a virtualenv.
A script for testing might run:

```
> req-compile --extra test --solution compiled-requirements.txt --wheel-dir .wheeldir > compiled-requirements.txt
> pip-sync compiled-requirements.txt --find-links .wheeldir --no-index
```

or

```
> pip install -r compiled-requirements.txt --find-links .wheeldir --no-index
```
This would produce an environment containing all of the requirements and test requirements for the project in the current directory (as defined by a setup.py). This is a stable set, in that only changes to the requirements and constraints would produce a new output. To produce a totally fresh compilation, don’t pass in a previous solution.
The --find-links parameter to pip-sync or pip install reuses the wheels already downloaded by Req-Compile during the compilation phase, making the installation step entirely offline.
When taking this environment to deploy, trim down the set to the install requirements:
```
> req-compile --solution compiled-requirements.txt --no-index > install-requirements.txt
```
install-requirements.txt will contain the pinned requirements that should be installed in your target environment. The reason for this extra step is that you don't want to distribute your test requirements, but you do want your installed requirements to be the same versions that you've tested with. To get all of your explicitly declared requirements along with their transitive dependencies, you can use the prior solution to extract a subset. Passing --no-index makes it clear that this command will not hit the remote index at all (though this would naturally be the case, since solution files take precedence over remote indexes in repository search order).
Compile for a group of projects
Req-Compile can discover projects that are grouped together on the filesystem. The req-candidates command prints discovered projects; with the --paths-only option, it dumps their paths to stdout. This allows recursive discovery of projects that you may want to compile together.
For example, consider a filesystem with this layout:
```
solution
|_ utilities
|  \_ network_helper
|_ integrations
|  \_ github
\_ frameworks
   |_ neural_net
   \_ cluster
```
In each of the leaf nodes there is a setup.py and a full Python project. To compile these together and ensure that their requirements will all install into the same environment:
```
> cd solution
> req-candidates --paths-only
/home/user/projects/solution/utilities/network_helper
/home/user/projects/solution/integrations/github
/home/user/projects/solution/frameworks/neural_net
/home/user/projects/solution/frameworks/cluster

> req-candidates --paths-only | req-compile --extra test --solution compiled-requirements.txt --wheel-dir .wheeldir > compiled-requirements.txt
... all reqs and all test reqs compiled together ...
```
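Conceptually, this discovery resembles a pruned filesystem walk that stops descending once a project is found. A small illustrative sketch (`discover_projects` is hypothetical and deliberately simplified, not req-candidates' actual logic):

```python
import os
import tempfile

def discover_projects(root):
    """Walk a tree, returning directories that look like Python projects.

    A subtree is pruned once a project file is found, so nested files
    inside a project are not reported as separate projects. Hypothetical
    simplification of req-candidates' discovery.
    """
    found = []
    for dirpath, dirnames, filenames in os.walk(root):
        if "setup.py" in filenames or "pyproject.toml" in filenames:
            found.append(dirpath)
            dirnames[:] = []  # don't search inside a discovered project
    return found

# Build a small throwaway tree echoing the layout above:
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "utilities", "network_helper"))
os.makedirs(os.path.join(root, "frameworks", "cluster"))
open(os.path.join(root, "utilities", "network_helper", "setup.py"), "w").close()
open(os.path.join(root, "frameworks", "cluster", "pyproject.toml"), "w").close()

projects = sorted(os.path.relpath(p, root) for p in discover_projects(root))
print(projects)
```

Each discovered path can then be fed to req-compile, just as req-candidates --paths-only pipes paths over stdin in the example above.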