
Wagtail A/B Testing


Wagtail A/B Testing is an A/B testing package for Wagtail that allows users to create and manage A/B tests on pages through the Wagtail admin.

Key features:

  • Create an A/B test on any page from within Wagtail
  • Tests run against page revisions (no need to create separate pages for the variants)
  • Prevents users from editing the page while a test is in progress
  • Calculates confidence using Pearson's chi-squared test (see the sketch below)
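
As an illustration of that last point, confidence can be derived from a 2×2 contingency table of conversions per version. This is a minimal sketch using scipy with made-up numbers, not the package's internal code:

# Hypothetical example: compare conversions between control and variant
# using Pearson's chi-squared test.
from scipy.stats import chi2_contingency

# [conversions, non-conversions] for each version (made-up counts)
control = [120, 880]
variant = [150, 850]

chi2, p_value, dof, expected = chi2_contingency([control, variant])
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
# A small p-value indicates the observed difference is unlikely to be chance.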


Usage

Wagtail A/B Testing works with Django 4.2+ and Wagtail 5.2+ on Python 3.9+.

Creating an A/B test

Any user with the "Create A/B test" permission can create an A/B test by clicking "Save and create A/B test" from the page's action menu.

The first page shows the user the difference between the content of the latest draft and the live version of the page, so they can check which changes are going to be tested.

Once they've confirmed that, the user is taken to a form to enter a test name/hypothesis, select a goal, and set a sample size.

Screenshot of Wagtail A/B Testing create page

Monitoring test progress

While the test is running, the page's edit view gets replaced with a dashboard showing the current test progress. Users cannot edit the page until the test is completed or cancelled.

Any user with permission to publish the page can start, pause, resume or end A/B tests on that page.

Screenshot of Wagtail A/B Testing

Finishing the test

The test stops automatically when the number of participants reaches the sample size. Based on the results shown, a user must decide whether to publish the new changes or revert to the old version of the page.

Once they've chosen, the page edit view returns to normal. The results from this A/B test remain accessible under the A/B testing tab or from the A/B testing report.

Screenshot of Wagtail A/B Testing

Installation

Firstly, install the wagtail-ab-testing package from PyPI:

pip install wagtail-ab-testing

Then add it to INSTALLED_APPS:

INSTALLED_APPS = [
    # ...
    'wagtail_ab_testing',
    # ...
]

Then add the following to your URLconf:

from django.urls import include, path

from wagtail_ab_testing import urls as ab_testing_urls

urlpatterns = [
    ...

    path('abtesting/', include(ab_testing_urls)),
]

Finally, add the tracking script to your base HTML template:

{# Insert this at the top of the template #}
{% load wagtail_ab_testing_tags %}

...

{# Insert this where you would normally insert a <script> tag #}
{% wagtail_ab_testing_script %}
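
Putting it together, a minimal base template (the surrounding markup here is hypothetical) might look like this:

{% load wagtail_ab_testing_tags %}
<!DOCTYPE html>
<html>
    <head>
        <title>{% block title %}{% endblock %}</title>
    </head>
    <body>
        {% block content %}{% endblock %}
        {% wagtail_ab_testing_script %}
    </body>
</html>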

Implementing custom goal event types

Out of the box, Wagtail A/B testing provides a "Visit page" goal event type which you can use to track when users visit a goal page. It also supports custom goal types, which can be used for tracking other events such as making a purchase, submitting a form, or clicking a link.

To implement a custom goal event type, first register your type using the register_ab_testing_event_types hook. This adds your goal type to the list of options shown to users when they create A/B tests:

# myapp/wagtail_hooks.py

from wagtail import hooks
from wagtail_ab_testing.events import BaseEvent


class CustomEvent(BaseEvent):
    name = "Name of the event type"
    requires_page = True  # Set to False to create a "Global" event type that can be triggered on any page

    def get_page_types(self):
        return [
            # Return a list of page models that can be used as destination pages for this event type
            # For example, if this 'event type' is for a 'call to action' button that only appears on
            # the homepage, put your `HomePage` model here.
        ]


@hooks.register('register_ab_testing_event_types')
def register_custom_event_type():
    return {
        'slug-of-the-event-type': CustomEvent,
    }

Next, you need to tell Wagtail A/B testing whenever a user triggers the goal. This can be done by calling wagtailAbTesting.triggerEvent() in the browser:

if (window.wagtailAbTesting) {
    wagtailAbTesting.triggerEvent('slug-of-the-event-type');
}

The JavaScript library tracks A/B tests using localStorage, so this will only call the server if the user is participating in an A/B test with the provided goal type and the current page is the goal page.

Example: Adding a "Submit form" event type

In this example, we will add a "Submit form" event type for a ContactUsFormPage page type.

Firstly, we need to register the event type. To do this, implement a handler for the register_ab_testing_event_types hook in your app:

# myapp/wagtail_hooks.py

from wagtail import hooks
from wagtail_ab_testing.events import BaseEvent

from .models import ContactUsFormPage


class SubmitFormPageEvent(BaseEvent):
    name = "Submit form page"

    def get_page_types(self):
        # Only allow this event type to be used if the user has
        # selected an instance of `ContactUsFormPage` as the goal
        return [
            ContactUsFormPage,
        ]


@hooks.register('register_ab_testing_event_types')
def register_submit_form_event_type():
    return {
        'submit-contact-us-form': SubmitFormPageEvent,
    }

Next, we need to add some code to the frontend to trigger this event whenever a user submits the form:

{# templates/forms/contact_us_form_page.html #}

<form id="form">
    ...
</form>

<script>
    if (window.wagtailAbTesting) {
        document.getElementById('form').addEventListener('submit', function() {
            wagtailAbTesting.triggerEvent('submit-contact-us-form');
        });
    }
</script>

Running A/B tests on a site that uses Cloudflare caching

To run Wagtail A/B testing on a site that uses Cloudflare, firstly generate a secure random string to use as a token, and configure that token in your Django settings file:

WAGTAIL_AB_TESTING_WORKER_TOKEN = '<token here>'
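
One way to generate a suitable token (any long, cryptographically secure random string works) is Python's secrets module:

python -c "import secrets; print(secrets.token_urlsafe(40))"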

Then set up a Cloudflare Worker based on the following JavaScript:

// Set to false if Cloudflare shouldn't automatically redirect requests to use HTTPS
const ENFORCE_HTTPS = true;

export default {
    async fetch(request, env, ctx) {
        const url = new URL(request.url);

        // Set this to the domain name of your backend server
        const WAGTAIL_DOMAIN = env.WAGTAIL_DOMAIN;

        // This should match the token in your Django settings
        const WAGTAIL_AB_TESTING_WORKER_TOKEN =
            env.WAGTAIL_AB_TESTING_WORKER_TOKEN;

        if (url.protocol == 'http:' && ENFORCE_HTTPS) {
            url.protocol = 'https:';
            return Response.redirect(url, 301);
        }

        if (request.method === 'GET') {
            // Copy the incoming headers (spreading a Headers object would
            // silently drop them) and add the authentication headers
            const newHeaders = new Headers(request.headers);
            newHeaders.set(
                'Authorization',
                'Token ' + WAGTAIL_AB_TESTING_WORKER_TOKEN
            );
            newHeaders.set('X-Requested-With', 'WagtailAbTestingWorker');
            const newRequest = new Request(request, { headers: newHeaders });

            url.hostname = WAGTAIL_DOMAIN;
            const response = await fetch(url.toString(), newRequest);

            // If there is a test running at the URL, the backend returns a
            // JSON response containing both versions of the page, along with
            // the test ID in the X-WagtailAbTesting-Test header.
            const testId = response.headers.get('X-WagtailAbTesting-Test');
            if (testId) {
                // Participants in a test have a cookie that tells us which
                // version of the page they should see. If they don't have
                // this cookie, serve a random version
                const versionCookieName = `abtesting-${testId}-version`;
                const cookie = request.headers.get('cookie');
                let version;
                if (cookie && cookie.includes(`${versionCookieName}=control`)) {
                    version = 'control';
                } else if (
                    cookie &&
                    cookie.includes(`${versionCookieName}=variant`)
                ) {
                    version = 'variant';
                } else if (Math.random() < 0.5) {
                    version = 'control';
                } else {
                    version = 'variant';
                }

                return response.json().then((json) => {
                    // Copy the response headers, replacing the content type
                    // (the backend returned JSON, but we serve HTML)
                    const headers = new Headers(response.headers);
                    headers.set('Content-Type', 'text/html');
                    return new Response(json[version], { headers });
                });
            }

            return response;
        } else {
            return await fetch(url.toString(), request);
        }
    },
};

You can use Cloudflare's Wrangler CLI to set up your worker. In an empty directory, install wrangler:

npm install wrangler --save-dev

and then initialise a new Wrangler project:

npx wrangler init

Follow the CLI prompts until it generates a project for you, then add the JavaScript above to src/index.js.

Add a WAGTAIL_AB_TESTING_WORKER_TOKEN variable to the worker, giving it the same token value that you generated earlier. Make sure to also set up a WAGTAIL_DOMAIN variable with the value of the domain where your website is hosted (e.g. "www.mysite.com"), as shown below.
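
For example, assuming a standard Wrangler project, the token can be stored as a secret and the domain as a plain variable:

npx wrangler secret put WAGTAIL_AB_TESTING_WORKER_TOKEN

# wrangler.toml
[vars]
WAGTAIL_DOMAIN = "www.mysite.com"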

Finally, add a route in Cloudflare so that all traffic for your domain is routed through this worker; see the example below.
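
Routes can be configured from the Cloudflare dashboard, or (assuming your zone is managed by Cloudflare) in wrangler.toml:

# wrangler.toml
routes = [
  { pattern = "www.mysite.com/*", zone_name = "mysite.com" }
]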

Contribution

Install

To make changes to this project, first fork this repository and clone it to your local system:

git clone link-to-your-forked-repo
cd wagtail-ab-testing

With your preferred virtualenv activated, install testing dependencies:

python -m pip install -e .[testing]

How to run tests

python testmanage.py test

Formatting and linting

We use pre-commit to ensure that all code is formatted and linted before committing. To install the pre-commit hooks, run:

pre-commit install

The pre-commit hooks will run automatically before each commit, or you can run them manually with:

pre-commit run --all-files

Credits

wagtail-ab-testing was originally created by Karl Hobley.
