Accelerator for pip, the Python package manager
The pip-accel program is a wrapper for pip, the Python package manager. It accelerates the usage of pip to initialize Python virtual environments given one or more requirements files. It does so by combining the following two approaches:
In addition, since version 0.9 pip-accel contains a simple mechanism that detects missing system packages when a build fails and asks the user whether to install the missing dependencies and retry the build.
The pip-accel program is currently tested on CPython 2.6, 2.7, 3.4 and 3.5 and PyPy (2.7). The automated test suite regularly runs on Ubuntu Linux (Travis CI) as well as Microsoft Windows (AppVeyor). In addition to these platforms pip-accel should work fine on most UNIX systems (e.g. Mac OS X).
Paylogic uses pip-accel to quickly and reliably initialize virtual environments on its farm of continuous integration slaves which are constantly running unit tests (this was one of the original use cases for which pip-accel was developed). We also use it on our build servers.
When pip-accel was originally developed PyPI was sometimes very unreliable (PyPI wasn't behind a CDN back then). Thanks to the CDN, PyPI is much more reliable nowadays; however, pip-accel still has its place:
The pip-accel command supports all subcommands and options supported by pip; however, it is of course only useful for the pip install subcommand. For example:
$ pip-accel install -r requirements.txt
Alternatively you can also run pip-accel as follows, but note that this requires Python 2.7 or higher (it specifically doesn’t work on Python 2.6):
$ python -m pip_accel install -r requirements.txt
If you pass a -v or --verbose option then pip and pip-accel will both use verbose output. The -q or --quiet option is also supported.
Based on the user running pip-accel the following file locations are used by default:
| Root user            | All other users | Purpose                                  |
|----------------------|-----------------|------------------------------------------|
| /var/cache/pip-accel | ~/.pip-accel    | Used to store the source/binary indexes  |
This default can be overridden by defining the environment variable PIP_ACCEL_CACHE.
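For example, to relocate the cache to another volume (the path below is just an illustration):

```shell
# Override the default pip-accel cache location (example path):
export PIP_ACCEL_CACHE=/scratch/pip-accel
```

Any pip-accel invocation in that shell session will then store its source/binary indexes under the given directory.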
For most users the default configuration of pip-accel should be fine. If you do want to change pip-accel's defaults, you can do so by setting environment variables and/or adding options to a configuration file. This is because pip-accel shares its command line interface with pip: adding command line options specific to pip-accel is nontrivial and might cause more confusion than it's worth :-). For an overview of the available configuration options and corresponding environment variables please refer to the documentation of the pip_accel.config module.
To give you an idea of how effective pip-accel is, below are the results of a test to build a virtual environment for one of the internal code bases of Paylogic. This code base requires more than 40 dependencies including several packages that need compilation with SWIG and a C compiler:
| Program   | Configuration                    | Elapsed time | Relative to baseline |
|-----------|----------------------------------|--------------|----------------------|
| pip       | Default configuration            | 444 seconds  | 100% (baseline)      |
| pip       | With download cache (first run)  | 416 seconds  | 94%                  |
| pip       | With download cache (second run) | 318 seconds  | 72%                  |
| pip-accel | First run                        | 397 seconds  | 89%                  |
| pip-accel | Second run                       | 30 seconds   | 7%                   |
Bundled with pip-accel are a local cache backend (which stores distribution archives on the local file system) and an Amazon S3 backend (see below).
Both of these cache backends are registered with pip-accel using a generic pluggable cache backend registration mechanism. This mechanism makes it possible to register additional cache backends without modifying pip-accel. If you are interested in the details please refer to pip-accel’s setup.py script and the two simple Python modules that define the bundled backends.
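The general shape of such a pluggable registry can be illustrated as follows. This is not pip-accel's actual code (for that, refer to the setup.py script and backend modules mentioned above); the names `register_backend`, `LocalCacheBackend` and `S3CacheBackend` are made up for illustration:

```python
# Generic illustration of a pluggable backend registry. All names here
# are hypothetical, not pip-accel's real API.
BACKENDS = {}

def register_backend(scheme):
    """Class decorator that registers a cache backend under a scheme."""
    def decorator(cls):
        BACKENDS[scheme] = cls
        return cls
    return decorator

@register_backend("file")
class LocalCacheBackend:
    """Stores distribution archives on the local file system."""
    def get(self, filename):
        ...

@register_backend("s3")
class S3CacheBackend:
    """Stores distribution archives in an Amazon S3 bucket."""
    def get(self, filename):
        ...

# Third parties can register additional backends without modifying
# the core, simply by importing and applying the decorator:
@register_backend("http")
class HttpCacheBackend:
    def get(self, filename):
        ...
```

The point of the pattern is that the core only ever consults the registry, so new backends plug in from the outside.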
If you’ve written a cache backend that you think may be valuable to others, please feel free to open an issue or pull request on GitHub in order to get your backend bundled with pip-accel.
You can configure pip-accel to store its binary cache files in an Amazon S3 bucket. In this case Amazon S3 is treated as a second level cache, only used if the local file system cache can’t satisfy a dependency. If the dependency is not found in the Amazon S3 bucket, the package is built and cached locally (as usual) but then also saved to the Amazon S3 bucket. This functionality can be useful for continuous integration build worker boxes that are ephemeral and don’t have persistent local storage to store the pip-accel binary cache.
To get started you need to install pip-accel as follows:
$ pip install 'pip-accel[s3]'
The [s3] part enables the Amazon S3 cache backend by installing the Boto package. Once installed you can use the following environment variables to configure the Amazon S3 cache backend:
You can also set these options from a configuration file, please refer to the documentation of the pip_accel.config module. You will also need to set AWS credentials, either in a .boto file or in the $AWS_ACCESS_KEY_ID and $AWS_SECRET_ACCESS_KEY environment variables (refer to the Boto documentation for details).
If you want to point pip-accel at an S3 compatible storage service that is not Amazon S3 you can override the S3 API URL using a configuration option or environment variable. For example the pip-accel test suite first installs and starts FakeS3 and then sets PIP_ACCEL_S3_URL=http://localhost:12345 to point pip-accel at the FakeS3 server (in order to test the Amazon S3 cache backend without actually having to pay for an Amazon S3 bucket :-). For more details please refer to the documentation of the Amazon S3 cache backend.
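The environment for the FakeS3 scenario described above could look like this (the credentials are dummy values; FakeS3 doesn't validate them):

```shell
# Point pip-accel (and Boto) at a FakeS3 server listening on localhost:12345.
export AWS_ACCESS_KEY_ID=fake-key
export AWS_SECRET_ACCESS_KEY=fake-secret
export PIP_ACCEL_S3_URL=http://localhost:12345
```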
Since version 0.38 pip-accel instructs setuptools to cache setup requirements in a subdirectory of pip-accel’s data directory (see the eggs_cache option) to avoid recompilation of setup requirements. This works by injecting a symbolic link called .eggs into unpacked source distribution directories before pip or pip-accel runs the setup script.
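In effect the injection boils down to something like the following (the paths are illustrative; the real cache location is derived from the eggs_cache option):

```shell
# Simulate what pip-accel does before running a setup script: link the
# unpacked source distribution's .eggs directory to a shared cache so
# setuptools reuses previously built setup requirements.
mkdir -p /tmp/demo-eggs-cache /tmp/demo-sdist
ln -sfn /tmp/demo-eggs-cache /tmp/demo-sdist/.eggs
```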
The use of the .eggs directory was added in setuptools version 7.0 which is why pip-accel now requires setuptools 7.0 or higher to be installed. This dependency was added because the whole point of pip-accel is to work well out of the box, shielding the user from surprising behavior like setup requirements slowing things down and breaking offline installation.
Since version 0.9 pip-accel contains a simple mechanism that detects missing system packages when a build fails and asks the user whether to install the missing dependencies and retry the build. Currently only Debian Linux and derivative Linux distributions are supported, although support for other platforms should be easy to add. This functionality works based on configuration files that define which system packages the Python packages depend on. This means the results should be fairly reliable, but every single dependency needs to be defined manually…
Here’s what it looks like in practice:
2013-06-16 01:01:53 wheezy-vm INFO Building binary distribution of python-mcrypt (1.1) ..
2013-06-16 01:01:53 wheezy-vm ERROR Failed to build binary distribution of python-mcrypt! (version: 1.1)
2013-06-16 01:01:53 wheezy-vm INFO Build output (will probably provide a hint as to what went wrong):
gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -DVERSION="1.1" -I/usr/include/python2.7 -c mcrypt.c -o build/temp.linux-i686-2.7/mcrypt.o
mcrypt.c:23:20: fatal error: mcrypt.h: No such file or directory
error: command 'gcc' failed with exit status 1
2013-06-16 01:01:53 wheezy-vm INFO python-mcrypt: Checking for missing dependencies ..
2013-06-16 01:01:53 wheezy-vm INFO You seem to be missing 1 dependency: libmcrypt-dev
2013-06-16 01:01:53 wheezy-vm INFO I can install it for you with this command: sudo apt-get install --yes libmcrypt-dev
Do you want me to install this dependency? [y/N] y
2013-06-16 01:02:05 wheezy-vm INFO Got permission to install missing dependency.
The following extra packages will be installed: libmcrypt4
Suggested packages: mcrypt
The following NEW packages will be installed: libmcrypt-dev libmcrypt4
0 upgraded, 2 newly installed, 0 to remove and 68 not upgraded.
Unpacking libmcrypt4 (from .../libmcrypt4_2.5.8-3.1_i386.deb) ...
Unpacking libmcrypt-dev (from .../libmcrypt-dev_2.5.8-3.1_i386.deb) ...
Setting up libmcrypt4 (2.5.8-3.1) ...
Setting up libmcrypt-dev (2.5.8-3.1) ...
2013-06-16 01:02:13 wheezy-vm INFO Successfully installed 1 missing dependency.
2013-06-16 01:02:13 wheezy-vm INFO Building binary distribution of python-mcrypt (1.1) ..
2013-06-16 01:02:14 wheezy-vm INFO Copying binary distribution python-mcrypt-1.1.linux-i686.tar.gz to cache as python-mcrypt:1.1:py2.7.tar.gz.
You can tell Tox to use pip-accel using a small shell script that first uses pip to install pip-accel, then uses pip-accel to bootstrap the virtual environment. You can find details about this in issue #30 on GitHub.
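A minimal sketch of such a wrapper might look like this (the file name is arbitrary; refer to issue #30 for the actual details):

```shell
# Create a wrapper that bootstraps pip-accel with pip, then delegates
# the actual installation work to pip-accel.
cat > pip-accel-wrapper.sh << 'EOF'
#!/bin/bash -e
pip install --quiet pip-accel
pip-accel "$@"
EOF
chmod +x pip-accel-wrapper.sh
```

Tox can then be told to run this script instead of plain pip when it populates its virtual environments.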
The way pip-accel works is not very intuitive but it is very effective. Below is an overview of the control flow. Once you take a look at the code you’ll notice that the steps below are all embedded in a loop that retries several times. This is mostly because of step 2 (downloading the source distributions).
- If the command succeeds it means all dependencies are already available as downloaded source distributions. We’ll parse the verbose pip output of step 1 to find the direct and transitive dependencies (names and versions) defined in requirements.txt and use them as input for step 3. Go to step 3.
- If the command fails it probably means not all dependencies are available as local source distributions yet so we should download them. Go to step 2.
- If the command fails it means that pip encountered errors while scanning PyPI, scanning a distribution website, downloading a source distribution or unpacking a source distribution. Usually these kinds of errors are intermittent so retrying a few times is worth a shot. Go to step 2.
- If the command succeeds it means all dependencies are now available as local source distributions; we don’t need the network anymore! Go to step 1.
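The retry loop described above can be sketched as follows. This is a simplified illustration with hypothetical callback names, not pip-accel's actual implementation:

```python
class NetworkError(Exception):
    """Stand-in for the intermittent download errors described above."""

def initialize_environment(check_local, download, build, max_retries=10):
    """Simplified sketch of pip-accel's control flow (hypothetical API).

    check_local -- step 1: are all source distributions available locally?
    download    -- step 2: download missing source distributions.
    build       -- step 3: build and install from the local distributions.
    """
    for _ in range(max_retries):
        if check_local():   # step 1 succeeded: everything is local
            build()         # step 3: no network needed from here on
            return True
        try:
            download()      # step 2: fetch what's missing
        except NetworkError:
            continue        # intermittent PyPI/download error: retry step 2
    return False            # gave up after too many retries
```

The same structure appears in pip-accel's code as a loop around the pip invocations, which is why an intermittent download failure only costs one extra iteration rather than aborting the whole run.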
If you have questions, bug reports, suggestions, etc. please create an issue on the GitHub project page. The latest version of pip-accel will always be available on GitHub. The internal API documentation is hosted on Read The Docs.