
Run tests written with unittest against a specified module.

Project description

Module Interface

Run tests written with unittest against a specified module. Additional options:

  • location of tests (if not in the current directory),
  • fallback module to run the tests against,
  • where to save a CSV of the test results, and
  • config file to specify how results are processed.
usage: grader [-h] [--fallback FALLBACK] [--submission SUBMISSION] [--tests TESTS]
              [--test-pattern TEST_PATTERN] [--output OUTPUT] [--log LOG] [--config CONFIG]
              path

positional arguments:
  path                  Module to grade

optional arguments:
  -h, --help            show this help message and exit
  --fallback FALLBACK   Fallback module to grade
  --submission SUBMISSION
                        Submission name to grade.
  --tests TESTS         Path of tests to run. Defaults to ./
  --test-pattern TEST_PATTERN
                        Test name pattern to match. Defaults to "test*.py"
  --output OUTPUT       Output file for report. Defaults to stdout.
  --log LOG             Log file to use. Defaults to stdout.
  --config CONFIG       Config file to use.

Test Discovery

Tests to run can be located anywhere using a combination of the --tests and --test-pattern args. By default, they are searched for under the current directory and match the test*.py pattern. Tests are discovered using the unittest module.
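Since discovery is delegated to unittest, the defaults above behave like `unittest.defaultTestLoader.discover` with the `test*.py` pattern. A self-contained sketch (the throwaway directory and `test_simple.py` file are illustrative only, not part of the grader):

```python
import pathlib
import tempfile
import textwrap
import unittest

# Throwaway tests directory so the example is self-contained.
tests_dir = pathlib.Path(tempfile.mkdtemp())
(tests_dir / "test_simple.py").write_text(textwrap.dedent("""\
    import unittest

    class TestSimpleTestCase(unittest.TestCase):
        def testSimple1(self):
            self.assertEqual(1 + 1, 2)
    """))

# Equivalent of grader's defaults, pointed at that directory:
# find files matching "test*.py" and build a suite from the matches.
suite = unittest.defaultTestLoader.discover(
    start_dir=str(tests_dir), pattern="test*.py"
)
print(suite.countTestCases())  # → 1
```

Passing a different `--test-pattern` simply changes the `pattern` argument to discovery, so any glob unittest accepts (e.g. `check_*.py`) works here too.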

Running Tests

While tests run, the script changes into the directory containing path, so file locations used inside tests should be relative to that directory. --fallback gives the grader a known-good module for determining the full set of tests that should run; this is helpful when the code at path is untrusted and may break the test discovery step entirely.
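A minimal sketch of what the chdir-plus-fallback behavior described above could look like. This is an assumption about the mechanism, not the grader's actual internals; the names `load_module` and `"submission"` are hypothetical:

```python
import importlib.util
import os

def load_module(path, fallback_path=None):
    """Load the module at `path`; if it fails to import, try the fallback.

    Mirrors the documented behavior: chdir into the directory of `path`
    so relative file paths inside tests resolve, then import the
    submission, falling back to a known-good module on any failure.
    """
    os.chdir(os.path.dirname(os.path.abspath(path)) or ".")
    for candidate in (path, fallback_path):
        if candidate is None:
            continue
        try:
            spec = importlib.util.spec_from_file_location("submission", candidate)
            module = importlib.util.module_from_spec(spec)
            spec.loader.exec_module(module)
            return module
        except Exception:
            # Untrusted code may raise anything at import time; move on.
            continue
    raise ImportError("neither path nor fallback could be loaded")
```

The broad `except Exception` is deliberate here: an untrusted submission can fail at import time with arbitrary errors, and the whole point of a fallback is to keep grading alive in that case.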

Config

The grader config can be specified with the --config arg. It is a JSON file that controls what happens when a particular test runs or when a particular submission is graded. Individual test configs go under the "tests" map, keyed by the form: testFunction (test_filename.TestCaseName). Individual submission configs go under the "submissions" map, keyed by the submission name. An example format is below:

{
    "tests": {
        "testSimple1 (test_simple.TestSimpleTestCase)": {
            "name": "Simple Test 1",
            "weight": 2
        }
    },
    "submissions": {
        "submission_name": {
            "name": "custom label here"
        }
    }
}


