
Fluent testing for Python


“When a failing test makes us read 20+ lines of test code, we die inside.” - C.J. Gaconnet


Why?

This is an attempt to make Python testing more readable while maintaining a Pythonic look and feel. As powerful and useful as the unittest module is, I’ve always disliked its Java-esque naming conventions, among other things.

Attempts to bring BDD to Python, while truly awesome, never feel Pythonic. Most of the frameworks that I have seen rely on information duplicated between the specification and the test cases. My belief is that we need something closer to what RSpec offers, but in a form that feels like Python.

Where?

How?

fluenttest.test_case.TestCase implements the Arrange, Act, Assert method of testing. The configuration of the test case and the execution of the single action under test are run precisely once per test case. The test case then contains multiple assertions, each in its own method. The implementation leverages existing test runners such as nose and py.test. In order to run the arrange and act steps once per test case, fluenttest calls arrange and act from within the setUpClass class method. Each assertion is then written in its own test method. The following snippet rewrites the simple example from the Python standard library's unittest documentation:

import random
import unittest

class TestSequenceFunctions(unittest.TestCase):
    def setUp(self):
        self.seq = list(range(10))

    def test_shuffle(self):
        # make sure the shuffled sequence does not lose any elements
        random.shuffle(self.seq)
        self.seq.sort()
        self.assertEqual(self.seq, list(range(10)))

        # should raise an exception for an immutable sequence
        self.assertRaises(TypeError, random.shuffle, (1, 2, 3))

This very simple test looks like the following when written using fluenttest. Notice that the comments in the original test were really pointing out that multiple assertions were buried in a single test method. This is much more explicit with fluenttest:

import random
import unittest

from fluenttest import test_case

class WhenShufflingSequence(test_case.TestCase, unittest.TestCase):
    @classmethod
    def arrange(cls):
        super(WhenShufflingSequence, cls).arrange()
        cls.input_sequence = list(range(10))
        cls.result_sequence = cls.input_sequence[:]

    @classmethod
    def act(cls):
        random.shuffle(cls.result_sequence)

    def test_should_not_lose_elements(self):
        self.assertEqual(sorted(self.result_sequence),
                         sorted(self.input_sequence))

class WhenShufflingImmutableSequence(test_case.TestCase, unittest.TestCase):
    allowed_exceptions = TypeError

    @classmethod
    def act(cls):
        random.shuffle((1, 2, 3))

    def test_should_raise_type_error(self):
        self.assertIsInstance(self.exception, TypeError)

The fluenttest version is almost twice the length of the original, so brevity is not a quality to expect from this style of testing. The first thing you gain is that the comments explaining what each test was doing are replaced with very explicit code. In this simplistic example the gain isn't very notable. Look at the tests directory for a realistic example of tests written in this style.
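
The allowed_exceptions and self.exception attributes used above are provided by the base class. As a rough sketch only, and not fluenttest's actual implementation, an Arrange, Act, Assert base class along these lines could be written as:

import unittest

class ArrangeActTestCase(unittest.TestCase):
    # Illustrative sketch only -- not fluenttest's real implementation.
    # arrange() and act() run once per class from setUpClass(); any
    # exception listed in allowed_exceptions is captured on cls.exception
    # so that the assertion methods can inspect it.
    allowed_exceptions = ()  # a single exception class or a tuple of them

    @classmethod
    def setUpClass(cls):
        super(ArrangeActTestCase, cls).setUpClass()
        cls.exception = None
        cls.arrange()
        try:
            cls.act()
        except cls.allowed_exceptions as error:
            cls.exception = error

    @classmethod
    def arrange(cls):
        """Override to build the test fixture; runs once per test case."""

    @classmethod
    def act(cls):
        """Override to perform the single action under test."""

With a base class like this, each test method only inspects state that arrange and act have already established.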

Contributing

Contributions are welcome as long as they follow a few basic rules:

  1. They start out life by forking the central repo and creating a new branch off of master.

  2. All tests pass and coverage is at 100% - make test

  3. All quality checks pass - make lint

  4. Issue a pull request through GitHub (see the example workflow after this list).
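
For example, assuming your fork has already been cloned, a typical contribution cycle might look like the following (the branch name is just a placeholder):

$ git checkout -b my-fix master
$ make test
$ make lint
$ git push origin my-fix    # then open a pull request on GitHub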

Development Environment

As with many other projects, the development environment lives in a virtual environment and is controlled by a Makefile. The inclusion of make is less than perfect, but it is the easiest way to bootstrap a project on just about any platform. Start out by cloning the repository with git and building a virtual environment to work with:

$ git clone https://github.com/my-org/fluent-test.git
$ cd fluent-test
$ make environment

This will create a Python 3 environment in the env directory using mkvenv and install the various prerequisites such as pip and nose. You can activate the environment with source env/bin/activate, launch a Python interpreter with env/bin/python, and run the test suite with env/bin/nosetests.
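
For example, a typical session after building the environment might look like this, assuming the default env directory created above:

$ source env/bin/activate
(env) $ nosetests
(env) $ deactivate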

The Makefile exports a few other useful targets:

  • make test: run the tests

  • make lint: run various static analysis tools

  • make clean: remove cache files

  • make mostly-clean: remove built and cached eggs

  • make dist-clean: remove generated distributions

  • make maintainer-clean: remove virtual environment

  • make sdist: create a distribution tarball

  • make docs: build the HTML documentation

