
System test framework over POSIX shells

Project description

Prego is a system/integration test framework that runs as Python unittest test cases.

Prego is a library consisting of a set of classes and Hamcrest matchers useful for specifying shell command interactions through files, environment variables, and network ports. It provides support to run shell commands in the background, send signals to processes, set assertions on command stdout or stderr, etc.


The central concept is the Task(): a set of assertions.

Three assertion checkers are available:

  • task.assert_that, for single-shot checking.
  • task.wait_that, for polling (recurrent) checking.
  • task.command, to run an arbitrary shell command.
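
The difference between assert_that and wait_that is one-shot versus polling evaluation. A minimal sketch of the polling semantics in plain Python (illustrative only, not prego's actual implementation):

```python
import time

def wait_that(condition, timeout=5.0, interval=0.1):
    """Poll `condition` until it returns True or `timeout` expires.

    Mimics the recurrent checking of task.wait_that; prego's real
    implementation additionally records the check as a test assertion.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    return False

# A condition that becomes true after a short delay eventually passes:
start = time.monotonic()
assert wait_that(lambda: time.monotonic() - start > 0.3)

# A condition that never holds fails once the timeout expires:
assert not wait_that(lambda: False, timeout=0.5)
```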

Subjects (and their associated assertions):

  • Task(desc='', detach=False)
    • command(cmd_line, stdout, stderr, expected, timeout, signal, cwd, env)
    • running()
    • terminated()
  • File(path)
    • exists()
  • File().content
    • any Hamcrest string matcher (e.g. contains_string)
  • Variable
    • exists()
    • any Hamcrest string matcher (e.g. contains_string)
  • Command
    • running()
    • exits_with(value)
    • killed_by(signal)
  • Host(hostname)
    • listen_port(number, proto='tcp')
    • reachable()
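
As an illustration of what the listen_port() assertion checks, here is a plain-socket sketch. The helper name is hypothetical and only the 'tcp' case is covered; prego's own check may differ in detail:

```python
import socket

def port_is_open(host, port, proto='tcp', timeout=1.0):
    """Return True if something accepts connections on host:port.

    Hypothetical helper mirroring Host(host).listen_port(number);
    only the TCP connect test is sketched here.
    """
    if proto != 'tcp':
        raise NotImplementedError(proto)
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Open a listening socket so the check has something to find:
server = socket.socket()
server.bind(('127.0.0.1', 0))        # port 0 -> any free port
server.listen(1)
port = server.getsockname()[1]
assert port_is_open('127.0.0.1', port)
server.close()
```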

Execution model



The context is an object whose attributes may be automatically interpolated in command and filename paths.

Some of them are set as default values for command() parameters too. If context.cwd is set, all commands in the same test method will use that value as their CWD (current working directory) unless you give a different value as a command() keyword argument.

The context attributes that provide defaults for command() parameters are cwd, timeout, signal and expected.
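
The defaulting rule can be sketched in plain Python: an explicit keyword argument wins, otherwise the value comes from the context, otherwise a built-in default applies. All names and default values here are illustrative, not prego's internals:

```python
class Context:
    """Stand-in for prego's context object."""

ctx = Context()
ctx.cwd = '/tmp'      # every command in the test will run here...
ctx.timeout = 10

def command(cmd_line, **kwargs):
    """Resolve command() parameters: keyword arg > context > built-in."""
    params = {'cmd_line': cmd_line}
    for name, builtin in [('cwd', None), ('timeout', 5),
                          ('signal', None), ('expected', 0)]:
        params[name] = kwargs.get(name, getattr(ctx, name, builtin))
    return params

assert command('ls')['cwd'] == '/tmp'               # taken from the context
assert command('ls', cwd='/etc')['cwd'] == '/etc'   # ...unless overridden
assert command('ls')['expected'] == 0               # built-in default
```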


Available interpolation variables are:

  • $basedir: the directory where prego is executed (relative).
  • $fullbasedir: absolute path of $basedir.
  • $testdir: the directory where the running test file is.
  • $fulltestdir: absolute path of $testdir.
  • $testfilename: the file name of the running test.
  • $tmpbase: a safe directory (per user) to put temporary files.
  • $tmp: a safe directory (per user and prego instance) to put temporary files.
  • $pid: the prego instance PID.
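
The $name syntax matches Python's string.Template, which can be used to sketch how prego expands these variables inside command lines and file paths. The concrete values below are made up for illustration; prego computes the real ones at run time:

```python
import os
from string import Template

# Illustrative values only; prego fills these in itself.
variables = {
    'basedir': '.',
    'fullbasedir': os.path.abspath('.'),
    'tmp': '/tmp/prego-user/1234',
    'pid': '1234',
}

def interpolate(text):
    """Expand $name variables in a command line or path, as prego
    does (sketched with string.Template, which shares the same
    $identifier syntax)."""
    return Template(text).safe_substitute(variables)

assert interpolate('ncat -l -p $pid') == 'ncat -l -p 1234'
assert interpolate('$tmp/A.out') == '/tmp/prego-user/1234/A.out'
```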


Testing ncat

import hamcrest
from prego import Task, TestCase, context as ctx, running
from prego.net import localhost, listen_port
from prego.debian import Package, installed

class Net(TestCase):
    def test_netcat(self):
        ctx.port = 2000
        server = Task(desc='ncat server', detach=True)
        server.assert_that(Package('nmap'), installed())
        cmd = server.command('ncat -l -p $port')

        client = Task(desc='ncat client')
        client.wait_that(server, running())
        client.wait_that(localhost, listen_port(ctx.port))
        client.command('ncat -c "echo bye" localhost $port')

This test may be executed using nosetests:

$ nosetests examples/
Ran 1 test in 1.414s


But prego provides a wrapper (the prego command) that has some interesting options:

$ prego -h
usage: prego [-h] [-c FILE] [-k] [-d] [-o] [-e] [-v] [-p] ...

positional arguments:

optional arguments:
  -h, --help            show this help message and exit
  -c FILE, --config FILE
                        explicit config file
  -k, --keep-going      continue even with failed assertion or tests
  -d, --dirty           do not remove generated files
  -o, --stdout          print tests stdout
  -e, --stderr          print tests stderr
  -v, --verbose         increase log verbosity

Same ncat test invoking prego:

[II] ------  Net.test_netcat BEGIN
[II] [ ok ]   B.0 wait that A is running
[II] [ ok ]   A.0 assert that nmap package is installed
[II] [ ok ]   A.1 assert that localhost not port 2000/tcp to be open
[II] [fail]   B.1 wait that localhost port 2000/tcp to be open
[II] [ ok ]   B.1 wait that localhost port 2000/tcp to be open
[II]          A.2.out| bye
[II] [ ok ]   B.2 Command 'ncat -c "echo bye" localhost 2000' code (0:0) time 5:1.28
[II] [ ok ]   B.3 assert that command B.2 returncode to be 0
[II] [ ok ]   B.4 assert that command B.2 execution time to be a value less than <5>s
[II] [ OK ]   B   Task end - elapsed: 1.17s
[II] [ ok ]   A.2 Command 'ncat -l -p 2000' code (0:0) time 5:1.33
[II] [ ok ]   A.3 assert that command A.2 returncode to be 0
[II] [ ok ]   A.4 assert that command A.2 execution time to be a value less than <5>s
[II] [ ok ]   A.5 assert that File '/tmp/prego-david/26245/A.2.out' content a string containing 'bye'
[II] [ OK ]   A   Task end - elapsed: 1.32s
[II] [ OK ]  Net.test_netcat END
Ran 1 test in 1.396s


Testing reachability

import hamcrest
from prego import TestCase, Task
from prego.net import Host, reachable

class GoogleTest(TestCase):
    def test_is_reachable(self):
        link = Task(desc="Is interface link up?")
        link.command('ip link | grep wlan0 | grep "state UP"')

        router = Task(desc="Is the local router reachable?")
        router.command("ping -c2 $(ip route | grep ^default | cut -d' ' -f 3)")

        for line in open('/etc/resolv.conf'):
            if line.startswith('nameserver'):
                server = line.split()[1]
                test = Task(desc="Is DNS server {0} reachable?".format(server))
                test.command('ping -c 2 {0}'.format(server))

        resolve = Task(desc="may google name be resolved?")

        ping = Task(desc="Is google reachable?")
        ping.command('ping -c 1')
        ping.assert_that(Host(''), reachable())
        ping.assert_that(Host(''), hamcrest.is_not(reachable()))

        web = Task(desc="get index.html")
        cmd = web.command('wget -O-')
        web.assert_that(cmd.stdout.content,
                        hamcrest.contains_string('value="I\'m Feeling Lucky"'))

Download files

Files for prego3, version 0.20181031:

  • prego3-0.20181031.tar.gz (24.6 kB, source distribution)
