
High performance Python-based load testing tool


aiolocust


This is a 2026 reimagining of the load testing tool Locust.

It has a ton of advantages over its predecessor, but it is still in alpha and missing many of Locust's more advanced features. Do let us know if you find any major issues or want to contribute, though!

Installation

We recommend using uv:

uv tool install aiolocust
aiolocust

There are also some alternative ways to install.

Create a locustfile.py

import asyncio
from aiolocust import HttpUser

async def run(user: HttpUser):
    # A plain request; only the HTTP response code is checked
    async with user.client.get("http://example.com/") as resp:
        pass
    async with user.client.get("http://example.com/") as resp:
        # Extra validation, not just the HTTP response code:
        assert "expected text" in await resp.text()
    await asyncio.sleep(0.1)  # wait time between iterations

See more examples.

Run a test

aiolocust --duration 30 --users 100
 Name                   ┃  Count ┃ Failures ┃    Avg ┃    Max ┃       Rate
━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━╇━━━━━━━━━━╇━━━━━━━━╇━━━━━━━━╇━━━━━━━━━━━━
 http://example.com/    │ 120779 │ 0 (0.0%) │  1.6ms │ 22.6ms │ 60372.44/s
────────────────────────┼────────┼──────────┼────────┼────────┼────────────
 Total                  │ 120779 │ 0 (0.0%) │  1.6ms │ 22.6ms │ 60372.44/s

 Name                   ┃  Count ┃ Failures ┃    Avg ┃    Max ┃       Rate
━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━╇━━━━━━━━━━╇━━━━━━━━╇━━━━━━━━╇━━━━━━━━━━━━
 http://example.com/    │ 243411 │ 0 (0.0%) │  1.6ms │ 22.6ms │ 60800.63/s
────────────────────────┼────────┼──────────┼────────┼────────┼────────────
 Total                  │ 243411 │ 0 (0.0%) │  1.6ms │ 22.6ms │ 60800.63/s
...
 Name                   ┃   Count ┃ Failures ┃    Avg ┃    Max ┃       Rate
━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━╇━━━━━━━━━━╇━━━━━━━━╇━━━━━━━━╇━━━━━━━━━━━━
 http://example.com/    │ 1836384 │ 0 (0.0%) │  1.6ms │ 22.6ms │ 61154.84/s
────────────────────────┼─────────┼──────────┼────────┼────────┼────────────
 Total                  │ 1836385 │ 0 (0.0%) │  1.6ms │ 22.6ms │ 61154.87/s

Why a rewrite instead of just expanding Locust?

Locust was created in 2011, and while it has gone through several major overhauls, it still contains a lot of legacy-style code and has accumulated a lot of non-core functionality, which makes it very hard to maintain and improve. It has over 10,000 lines of code mixing procedural, object-oriented, and functional styles, with several confusing abstractions.

aiolocust is built to be smaller in scope, while capturing the lessons learned from Locust. It is possible that it could be merged into Locust at some point, but for now it is a completely separate package.

Simple and consistent syntax

Tests are expressed in modern, explicitly asynchronous code, instead of relying on gevent monkey patching and implicit concurrency.

It has fewer gotchas and better type hinting, which should make it easier for humans as well as AIs to understand and write tests.

We also plan to further emphasize the "It's just Python" approach. For example, if you want to take precise control of the ramp-up and ramp-down of a test, you shouldn't need to read the documentation; you should only need to know how to write code. We'll still provide the option of using prebuilt features too, of course, but we'll make an effort not to box users in, which was sometimes the case with Locust.
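To illustrate the "just write code" idea, here is a minimal sketch of a custom ramp-up and ramp-down written in plain asyncio. It deliberately uses no aiolocust APIs (the `run_user` coroutine and all parameter names here are hypothetical stand-ins for whatever your user function does); the point is only that starting and stopping users is ordinary Python control flow.

```python
import asyncio

# Hypothetical stand-in for a user function: loop until told to stop,
# counting iterations. In a real test each iteration would issue requests.
async def run_user(user_id: int, stop: asyncio.Event) -> int:
    iterations = 0
    while not stop.is_set():
        await asyncio.sleep(0.01)  # stand-in for an HTTP request
        iterations += 1
    return iterations

async def ramped_test(total_users: int, ramp_seconds: float, hold_seconds: float):
    stop = asyncio.Event()
    tasks = []
    # Ramp up: start one user every ramp_seconds / total_users.
    for i in range(total_users):
        tasks.append(asyncio.create_task(run_user(i, stop)))
        await asyncio.sleep(ramp_seconds / total_users)
    await asyncio.sleep(hold_seconds)  # hold at full load
    stop.set()  # ramp down: each user finishes its current iteration
    return await asyncio.gather(*tasks)

results = asyncio.run(ramped_test(total_users=5, ramp_seconds=0.1, hold_seconds=0.2))
print(len(results))  # one result per user
```

Any other shape (staircase loads, spike tests, stopping users one by one) is just a different loop over the same primitives.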

OTEL Native

aiolocust uses OTel for metrics internally, and exporting them into your own monitoring solution is easy. By default, it creates an http.client.duration histogram.

If you also want to generate traces, logs and other standard metrics, you can either use the --instrument command line option, do it from code for increased flexibility, or use an agent for zero-code instrumentation.

aiolocust supports standard OTel env vars for exporter configuration, for example:

OTEL_TRACES_EXPORTER=console aiolocust --instrument

Here's a more complete example, for Splunk:

OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://ingest.us1.signalfx.com/v2/trace/otlp" \
OTEL_EXPORTER_OTLP_METRICS_ENDPOINT="https://ingest.us1.signalfx.com/v2/datapoint/otlp" \
OTEL_EXPORTER_OTLP_HEADERS="X-SF-TOKEN=..." \
OTEL_EXPORTER_OTLP_METRICS_PROTOCOL="http" \
OTEL_METRIC_EXPORT_INTERVAL=500 \
aiolocust --instrument

High performance

aiolocust is more performant than "regular" Locust partly because it has a smaller footprint and less complexity, but its two main gains come from:

1. asyncio + aiohttp

aiolocust's performance is much better than that of Locust's HttpUser (based on python-requests), and even slightly better than FastHttpUser (based on geventhttpclient). Because it uses asyncio instead of monkey patching, it also lets you use other asyncio libraries (like Playwright), which are becoming more and more common.

2. Freethreading/no-GIL Python

This means that you don't need to launch one Locust process per CPU core. And even if your scripts happen to do some heavy computation, they are less likely to impact each other, since one thread will not block Python from concurrently working on another.

Users/threads can also communicate easily with each other, since they run in the same process; in the old Locust implementation you were forced to use ZeroMQ messaging between master and worker processes, and worker-to-worker communication was nearly impossible.
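The same-process communication point can be sketched with nothing but the standard library: simulated users running as threads share a queue directly, with no inter-process messaging. (On a free-threaded, no-GIL build these threads would also execute truly in parallel; the sharing works the same either way. The `user` function here is a hypothetical stand-in, not aiolocust API.)

```python
import queue
import threading

# Threads in one process can share state directly through a queue.
results: "queue.Queue[int]" = queue.Queue()

def user(user_id: int, work_items: int) -> None:
    total = 0
    for i in range(work_items):
        total += i  # stand-in for per-request bookkeeping
    results.put(total)  # hand the result to other threads, no ZeroMQ needed

threads = [threading.Thread(target=user, args=(i, 1000)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

totals = [results.get() for _ in range(4)]
print(sum(totals))  # aggregated across all "users" in-process
```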

Things this doesn't have compared to Locust (at least not yet)

  • A WebUI
  • Support for distributed tests

Alternative ways to install

If your tests need additional packages, or you want to structure your code in a complete Python project:

uv init --python 3.14t
uv add aiolocust
uv run aiolocust

Install for developing the tool itself, or just to get the latest changes before they make it into a release:

git clone https://github.com/cyberw/aiolocust.git
cd aiolocust
uv run aiolocust

You can still use good ol' pip as well, just remember that you need a freethreading Python build:

pip install aiolocust
aiolocust
