
Automated marking of student electronic submissions

Project description

Automated marking and feedback

Applicability

This toolset is applicable to all assessments (coursework, practical work, exams and class tests) where students are required to submit their work electronically and where the submission can be analysed electronically, e.g.

  • Software which can be tested using unit testing (in any language).
  • Embedded systems which can be tested using mock libraries and headers in place of the hardware.
  • Digital designs in a hardware description language (e.g. VHDL) which can be tested in simulation or synthesis.
  • Data files which can be read and analysed; these might be created from simulations, experimental measurements, or data recorded from test instruments.
  • Spreadsheet files which might be used to capture student readings and answers to questions.
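As an illustration, a submission of the first kind might be exercised with a pytest-style unit test along these lines. This is a minimal sketch: `student_calc.py` and its `add` function are hypothetical stand-ins for a file the student was asked to submit, not part of the toolset.

```python
# Sketch of a pytest-style unit test run against a student's submitted file.
# "student_calc.py" and its "add" function are hypothetical examples.
import importlib.util
from pathlib import Path


def load_student_module(path: Path):
    """Import a student's .py file as a module without touching sys.path."""
    spec = importlib.util.spec_from_file_location(path.stem, path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)
    return module


def test_add(tmp_path):
    # In a real cohort run, the path would point into the student's folder;
    # here we fabricate a submission so the test is self-contained.
    source = tmp_path / "student_calc.py"
    source.write_text("def add(a, b):\n    return a + b\n")
    student = load_student_module(source)
    assert student.add(2, 3) == 5
```

Because each student's file is loaded by path rather than by import name, the same test can be pointed at every folder in a cohort in turn.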

The toolset is written in Python and can run on almost any platform: Windows, Linux, or Mac. Users will need to install the other software needed for the actual tests, such as C compilers for host and embedded C applications, VHDL simulators and synthesis tools, Python libraries, etc.
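The toolset itself can be installed from PyPI, with supporting tools installed separately; the commands below reflect the package name on this page and the pytest entry in the Requirements table:

```shell
# Install the toolset from PyPI
pip install pyAutoMark

# Install the test runner used for Python-based tests (see Requirements)
pip install pytest
```

Compilers, simulators, and synthesis tools must be installed through their own installers for the platform in use.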

Process and Coverage

The toolset automates the following steps:

  1. Collation of student work into folders for a cohort/assessment. This may be done from a Blackboard dump of files and archives from Grade Centre, or (preferred) by cloning and pulling student work from GitHub Classroom repositories.

  2. The collation and use of student information from a provided CSV file (e.g. downloaded from Blackboard).

  3. The checking of each student submission against a provided manifest of expected files.

  4. The collation of a set of tests to be run against the students' work; these can be almost anything (see Applicability above).

  5. The automated running of the tests across a cohort or set of students and the collection of the results into a feedback report.

  6. The generation of a marking template based on the set of tests; this may then be added to as necessary to integrate different marking schemes and non-automated marking results.

  7. The generation of completed marking sheets for the students (using the report data and a template marking sheet).

  8. The sending out by email of the marking sheets and reports to students (work in progress).

  9. The automated filling in of a CSV file from the marking sheets for sending marks to the office.
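Step 3 above can be sketched in a few lines of Python. The plain-list manifest format used here is an assumption for illustration and not necessarily the toolset's actual manifest format:

```python
# Sketch of a submission check (step 3): report which expected files are
# missing from a student's folder. The plain-list manifest is hypothetical.
from pathlib import Path


def check_submission(folder: Path, manifest: list[str]) -> list[str]:
    """Return the manifest entries not present in the submission folder."""
    return [name for name in manifest if not (folder / name).is_file()]
```

For example, `check_submission(Path("cohort/abc123"), ["main.c", "report.pdf"])` would list whichever of those files the student failed to submit, ready to be included in the feedback report.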

The tools

usage: pyAutoMark [-h] {run,retrieve,extract,mark,generate-template,check-submission,find-duplicates,config} ...

Automatically retrieve, mark and provide feedback for digital student submissions

optional arguments:
  -h, --help            show this help message and exit

subcommands:
  {run,retrieve,extract,mark,generate-template,check-submission,find-duplicates,config}
    run                 Run automated tests and generate reports
    retrieve            Retrieve student files from github
    extract             Extract student files from downloads
    mark                Generate mark spreadsheets from reports and template spreadsheet
    generate-template   Generate a template spreadsheet
    check-submission    Check students have submitted files listed in manifest
    find-duplicates     Find duplicate students files
    config              Set or read configuration
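A typical marking run might chain the subcommands above roughly as follows; the ordering mirrors the process steps described earlier, and any paths or options beyond the bare subcommands are omitted rather than guessed:

```shell
# Illustrative workflow using the subcommands listed above
pyAutoMark retrieve            # pull student work from GitHub
pyAutoMark check-submission    # compare submissions against the manifest
pyAutoMark run                 # run the automated tests, collect reports
pyAutoMark generate-template   # produce a template mark spreadsheet
pyAutoMark mark                # fill marking sheets from the reports
```

Run `pyAutoMark <subcommand> -h` for the options each step accepts.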

Requirements

Software

Software                Min Version        Link
Visual Studio Code      1.74.2             https://code.visualstudio.com/
Python                  3.10.9             https://www.python.org/downloads/windows/
pytest                  7.2.1              pip install pytest
git                     2.39.0.windows.2   https://gitforwindows.org/
clang-tidy (Optional)   10.0               https://learn.microsoft.com/en-us/cpp/code-quality/clang-tidy?view=msvc-170

Common Components and Libraries

Python Libraries            Version
openpyxl                    3.1.2
pylint (Optional)           2.4.4
pytest-timeout (Optional)

VSCode Extensions           Author
C/C++                       Microsoft
C/C++ Extension Pack        Microsoft
Python Test Explorer        Little Fox Team
GitHub Classroom            GitHub

Documentation

See https://willijar.github.io/pyAutoMark/

Download files

Download the file for your platform.

Source Distribution

pyAutoMark-0.8.2.tar.gz (60.7 kB)


Built Distribution


pyAutoMark-0.8.2-py3-none-any.whl (72.0 kB)


File details

Details for the file pyAutoMark-0.8.2.tar.gz.

File metadata

  • Download URL: pyAutoMark-0.8.2.tar.gz
  • Upload date:
  • Size: 60.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.12

File hashes

Hashes for pyAutoMark-0.8.2.tar.gz
Algorithm     Hash digest
SHA256        de03570ebe4f017bc93f433faf5ae0e5a201e56ef0cc8007de540ab4ccece6e1
MD5           a1dc52682fbd7a4ad78915d129d2dcf7
BLAKE2b-256   2aab6062e355c9a0770389ef608de42953a1e117465314bec50b2881b287d34a


File details

Details for the file pyAutoMark-0.8.2-py3-none-any.whl.

File metadata

  • Download URL: pyAutoMark-0.8.2-py3-none-any.whl
  • Upload date:
  • Size: 72.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.10.12

File hashes

Hashes for pyAutoMark-0.8.2-py3-none-any.whl
Algorithm     Hash digest
SHA256        eb01f5676a9e3c7c5f1eb7f5433e967936ed71e45149238e1b4bf451a9c98061
MD5           eaa1f4372c70c3c4bd63e4b76db0bd7f
BLAKE2b-256   61ec2c10b749aa46ef7692ff92d211abb2350fa5925d9c5b0d51edab1cc60659

