
Traffic generator based on locust and behave

Project description

Grizzly - /ˈɡɹɪzli/

  • Framework
  • Command Line Interface
  • Editor Support / Language Server
  • Editor Support / Visual Studio Code Extension

Grizzly is a framework for easily defining load test scenarios, built primarily on top of two other frameworks.

Locust: Define user behaviour with Python code, and swarm your system with millions of simultaneous users.

Behave: Uses tests written in a natural language style, backed up by Python code.

behave is (ab)used to define locust load test scenarios in Gherkin. A feature can contain more than one scenario, and all scenarios run in parallel. This makes it possible to implement load test scenarios without knowing Python or how to use locust.
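A scenario in such a feature file might look something like the following. The step phrasing here is illustrative only — a sketch of the Gherkin style, not grizzly's actual step catalogue; consult the grizzly documentation for the real step implementations.

```gherkin
# Hypothetical step phrasing -- for illustration of the Gherkin style only
Feature: load test of an example API
  Scenario: poll the status endpoint
    Given a user of type "RestApi" load testing "https://api.example.com"
    And repeat for "100" iterations
    Then get request with name "get-status" from endpoint "/api/status"
```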

Locusts are a group of certain species of short-horned grasshoppers in the family Acrididae that have a swarming phase.

The name grizzly was chosen based on the grasshopper Melanoplus punctulatus, also known as the grizzly spur-throat grasshopper. This species prefers living in trees over grass, which is a nod to Biometria1, where grizzly was originally created.

1 Biometria is a member-owned, central actor within Swedish forestry that performs unbiased measurement of lumber flowing between forest and industry, so that all of Sweden's forest owners can feel confident selling their lumber.

Documentation

More detailed documentation can be found here, and the easiest way to get started is to check out the example.

Features

A number of features that we thought locust was missing out-of-the-box have been implemented in grizzly.

Test data

Support for synchronous handling of test data (variables). This is especially important when running locust distributed, where each worker and user needs unique test data that cannot be re-used.

The solution is heavily inspired by Karol Brejna's locust experiments - feeding the locust. A producer runs on the master (or local) node and keeps track of what has been sent to the consumer running on a worker (or local) node. The two communicate over a dedicated ZeroMQ connection.

When the consumer needs new test data, it sends a message to the producer announcing that it is available and which scenario it is going to run. The producer then responds with unique test data that can be used.
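The exchange can be sketched as follows. This is a minimal, in-process stand-in for illustration only: grizzly's real producer and consumer run on separate nodes and talk over ZeroMQ, and the class and message names below are hypothetical, not grizzly's API.

```python
from __future__ import annotations


class TestdataProducer:
    """Hands out unique test data per scenario, never re-using a row.

    Hypothetical sketch: the real producer runs on the master node and
    answers requests over a dedicated ZeroMQ connection.
    """

    def __init__(self, data: dict[str, list[dict]]) -> None:
        # one iterator per scenario, so each row is handed out exactly once
        self._iterators = {scenario: iter(rows) for scenario, rows in data.items()}

    def handle(self, message: dict) -> dict:
        # consumer announces availability and which scenario it is running
        scenario = message['scenario']
        try:
            row = next(self._iterators[scenario])
            return {'action': 'consume', 'data': row}
        except StopIteration:
            return {'action': 'stop'}  # no more unique test data left


class TestdataConsumer:
    """Runs on a worker node; asks the producer for the next unique row."""

    def __init__(self, producer: TestdataProducer, scenario: str) -> None:
        self.producer = producer
        self.scenario = scenario

    def request(self) -> dict | None:
        response = self.producer.handle({'available': True, 'scenario': self.scenario})
        return response['data'] if response['action'] == 'consume' else None


producer = TestdataProducer({'checkout': [{'user': 'alice'}, {'user': 'bob'}]})
consumer = TestdataConsumer(producer, 'checkout')
print(consumer.request())  # {'user': 'alice'}
print(consumer.request())  # {'user': 'bob'}
print(consumer.request())  # None -- the data is exhausted, never re-used
```

With several consumers sharing one producer, each row is still handed out exactly once, which is the property the real ZeroMQ-based implementation guarantees across workers.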

Statistics

Listeners for both InfluxDB and Azure Application Insights are included. The latter is more or less appinsights_listener.py from the good guys at Svenska Spel, but with typing.

They are useful when a history of test runs is needed, or when load tests need to be correlated with other events in the targeted environment.

Load test users

locust comes with a simple user for loading an HTTP(S) endpoint, but due to the nature of how the integration between behave and locust works in grizzly, it is not possible to use the users provided in locust.user.users directly, even for HTTP(S) targets. grizzly instead provides its own load test users:

  • RestApiUser: send requests to REST API endpoints, supports authentication with username and password or client secret
  • ServiceBusUser: send to and receive from Azure Service Bus queues and topics
  • MessageQueueUser: send and receive from IBM MQ queues
  • BlobStorageUser: send files to and receive files from Azure Blob Storage
  • IotHubUser: send/put files to Azure IoT Hub

Request log

All failed requests are logged to a file that includes the headers and body of both the request and the response.

Installation

pip3 install grizzly-loadtester
pip3 install grizzly-loadtester-cli

Do not forget to try the example, which also serves as a boilerplate scenario project, or create a new grizzly project with:

grizzly-cli init my-grizzly-project

Development

The easiest way to start contributing to this project is to have Visual Studio Code (with "Remote - Containers" extension) and docker installed. The project comes with a devcontainer, which encapsulates everything needed for a development environment.

It is also possible to use a Python virtual environment, but then you would have to manually download and install the IBM MQ client libraries, and install grizzly's dependencies:

sudo mkdir /opt/mqm && cd /opt/mqm && wget https://ibm.biz/IBM-MQC-Redist-LinuxX64targz -O - | tar xzf -
export LD_LIBRARY_PATH="/opt/mqm/lib64:${LD_LIBRARY_PATH}"
cd ~/
git clone https://github.com/Biometria-se/grizzly.git
cd grizzly/
python -m venv .venv
source .venv/bin/activate
python -m pip install -e .[dev,ci,mq,docs]

Download files


Source Distribution

grizzly-loadtester-3.0.0.tar.gz (673.8 kB)


Built Distribution

grizzly_loadtester-3.0.0-py3-none-any.whl (235.8 kB)


File details

Details for the file grizzly-loadtester-3.0.0.tar.gz.

File metadata

  • Download URL: grizzly-loadtester-3.0.0.tar.gz
  • Upload date:
  • Size: 673.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.5

File hashes

Hashes for grizzly-loadtester-3.0.0.tar.gz
Algorithm Hash digest
SHA256 48fd66b5a8e8d71e6613464ab21f8a69d83f877b79fd373a74e2660df58bac55
MD5 bd1651355c07360ee38ea77e94eb729b
BLAKE2b-256 3a06a791f8c8e65f1f326c5fa7ba5a357cf385b99f69c86123d28e7c5b8770de


File details

Details for the file grizzly_loadtester-3.0.0-py3-none-any.whl.

File hashes

Hashes for grizzly_loadtester-3.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 0e582a7a066b7b697db50c6e24924792335e85d0c69cb8044105172fffb928cf
MD5 f79aeff13b91006448ed7088cf2c1c34
BLAKE2b-256 2ed88c360c761320085bc6e0065d2c3e944936bee0d1e0cf313c627d59209f41

