
Traffic generator based on locust and behave

Project description

Grizzly - /ˈɡɹɪzli/

grizzly logo
Framework


Command Line Interface


Editor Support / Language Server


Editor Support / Visual Studio Code Extension


Grizzly is a framework for easily defining load test scenarios, and is primarily built on top of two other frameworks.

Locust: Define user behaviour with Python code, and swarm your system with millions of simultaneous users.

Behave: Uses tests written in a natural language style, backed up by Python code.

behave is (ab)used to define locust load test scenarios in Gherkin. A feature can contain more than one scenario, and all scenarios run in parallel. This makes it possible to implement load test scenarios without knowing Python or how to use locust.

Locusts are a group of certain species of short-horned grasshoppers in the family Acrididae that have a swarming phase.

The name grizzly was chosen based on the grasshopper Melanoplus punctulatus, also known as the grizzly spur-throat grasshopper. This species prefers living in trees over grass, a nod to Biometria1, where grizzly was originally created.

1 Biometria is a member-owned, central actor within Swedish forestry that performs unbiased measurement of the lumber flowing between forest and industry, so that all of Sweden's forest owners can feel confident selling their lumber.

Documentation

More detailed documentation can be found here, and the easiest way to get started is to check out the example.

Features

A number of features that we thought locust was missing out of the box have been implemented in grizzly.

Test data

Support for synchronous handling of test data (variables). This is especially important when running locust distributed, where each worker and user needs unique test data that cannot be re-used.

The solution is heavily inspired by Karol Brejna's locust experiments - feeding the locusts. A producer runs on the master (or local) node and keeps track of what has been sent to the consumers running on the worker (or local) nodes. The two communicate over a dedicated ZeroMQ connection.

When a consumer needs new test data, it sends a message to the producer saying that it is available and stating which scenario it is going to run. The producer then responds with unique test data that can be used.
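The exchange can be sketched roughly as follows. This is a simplified, in-process model of the protocol, not grizzly's actual implementation: the class and message names are made up for illustration, and the real producer and consumer talk over a ZeroMQ connection between processes rather than via a direct method call.

```python
from typing import Dict, Iterator, List, Optional


class TestdataProducer:
    """Hands out unique test data per scenario; runs on the master node."""

    def __init__(self, testdata: Dict[str, List[dict]]) -> None:
        # one iterator per scenario, so each row is handed out at most once
        self._iterators: Dict[str, Iterator[dict]] = {
            scenario: iter(rows) for scenario, rows in testdata.items()
        }

    def handle(self, message: dict) -> dict:
        """Handle an 'available' message from a consumer."""
        scenario = message['scenario']
        try:
            return {'action': 'consume', 'data': next(self._iterators[scenario])}
        except StopIteration:
            # no more unique test data for this scenario
            return {'action': 'stop'}


class TestdataConsumer:
    """Requests test data from the producer; runs on each worker."""

    def __init__(self, producer: TestdataProducer, scenario: str) -> None:
        self._producer = producer
        self._scenario = scenario

    def request(self) -> Optional[dict]:
        # in grizzly this is a ZeroMQ round-trip, not a direct call
        response = self._producer.handle({'scenario': self._scenario})
        return response['data'] if response['action'] == 'consume' else None


producer = TestdataProducer({'scenario-1': [{'username': 'alice'}, {'username': 'bob'}]})
consumer = TestdataConsumer(producer, 'scenario-1')
print(consumer.request())  # {'username': 'alice'} -- each row is handed out once
print(consumer.request())  # {'username': 'bob'}
print(consumer.request())  # None -- no more unique data
```

Because only the producer holds the iterators, two consumers on different workers can never receive the same row, which is the point of routing all test data through one node.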

Statistics

Listeners for both InfluxDB and Azure Application Insights are included. The latter is more or less appinsights_listener.py from the good guys at Svenska Spel, but with typing.

They are useful when a history of test runs is needed, or when load tests need to be correlated with other events in the targeted environment.
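Conceptually, such a listener hooks into locust's request event and forwards each measurement to the backend. A minimal sketch of the idea, not grizzly's actual listeners: the class name and point layout are invented here, and flush() just collects points in memory where a real listener would write a batch to InfluxDB or Application Insights.

```python
import time
from typing import Any, List, Optional


class MetricsListener:
    """Buffers per-request measurements and flushes them in batches,
    the way a time-series backend listener typically would."""

    def __init__(self, batch_size: int = 2) -> None:
        self.batch_size = batch_size
        self._buffer: List[dict] = []
        self.flushed: List[dict] = []  # stand-in for the external backend

    def on_request(self, request_type: str, name: str, response_time: float,
                   response_length: int, exception: Optional[Exception] = None,
                   **kwargs: Any) -> None:
        # one "point" per request, tagged so it can be filtered in the backend
        self._buffer.append({
            'measurement': 'request',
            'tags': {'method': request_type, 'name': name,
                     'result': 'failure' if exception else 'success'},
            'fields': {'response_time': response_time,
                       'response_length': response_length},
            'time': time.time_ns(),
        })
        if len(self._buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        # a real listener would write the batch to the backend here
        self.flushed.extend(self._buffer)
        self._buffer.clear()


listener = MetricsListener()
listener.on_request('GET', '/api/users', 123.4, 512)
listener.on_request('POST', '/api/orders', 56.7, 0, exception=RuntimeError('500'))
print(len(listener.flushed))  # 2 -- batch flushed to the (simulated) backend
```

Tagging each point with the request name and success/failure result is what makes it possible to correlate a spike in response times with a specific scenario afterwards.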

Load test users

locust comes with a simple user for load testing an HTTP(S) endpoint, but due to the nature of how the integration between behave and locust works in grizzly, it is not possible to directly use the users provided in locust.user.users, even for HTTP(S) targets.

  • RestApiUser: send requests to REST API endpoints; supports authentication with username + password or client secret
  • ServiceBusUser: send to and receive from Azure Service Bus queues and topics
  • MessageQueueUser: send to and receive from IBM MQ queues
  • BlobStorageUser: upload and download files to and from Azure Blob Storage
  • IotHubUser: send (put) files to Azure IoT Hub

Request log

All failed requests are logged to a file that includes the headers and body of both the request and the response.
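In spirit, each log entry captures both sides of the failed exchange. The sketch below uses a hypothetical file name and layout, not grizzly's actual log format:

```python
from datetime import datetime, timezone
from pathlib import Path


def log_failed_request(log_dir: str, name: str,
                       request_headers: dict, request_body: str,
                       response_headers: dict, response_body: str,
                       status: int) -> Path:
    """Write one failed request/response pair to its own log file."""
    timestamp = datetime.now(timezone.utc).strftime('%Y%m%dT%H%M%S%f')
    path = Path(log_dir) / f'{name}.{timestamp}.log'
    sections = [
        f'status: {status}',
        'request headers:',
        *(f'  {key}: {value}' for key, value in request_headers.items()),
        'request body:',
        request_body,
        'response headers:',
        *(f'  {key}: {value}' for key, value in response_headers.items()),
        'response body:',
        response_body,
    ]
    path.write_text('\n'.join(sections))
    return path
```

Keeping one file per failed request, named after the request and a timestamp, makes it straightforward to map an error in the statistics back to the exact payload that caused it.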

Installation

pip3 install grizzly-loadtester
pip3 install grizzly-loadtester-cli

Do not forget to try the example, which also serves as a boilerplate scenario project, or create a new grizzly project with:

grizzly-cli init my-grizzly-project

Development

The easiest way to start contributing to this project is to have Visual Studio Code (with the "Remote - Containers" extension) and docker installed. The project comes with a devcontainer, which encapsulates everything needed for a development environment.

It is also possible to use a Python virtual environment, but then you have to manually download and install the IBM MQ client libraries, and install the grizzly dependencies.

sudo mkdir /opt/mqm && cd /opt/mqm && wget https://ibm.biz/IBM-MQC-Redist-LinuxX64targz -O - | tar xzf -
export LD_LIBRARY_PATH="/opt/mqm/lib64:${LD_LIBRARY_PATH}"
cd ~/
git clone https://github.com/Biometria-se/grizzly.git
cd grizzly/
python -m venv .venv
source .venv/bin/activate
python -m pip install -e ".[dev,ci,mq,docs]"

