Aspects plugin for Tutor

Aspects Learner Analytics combines several free, open source tools to add analytics and reporting capabilities to the Open edX platform. This plugin offers easy installation, configuration, and deployment of these tools using Tutor. The tools Aspects uses are:

  • ClickHouse, a fast, scalable analytics database that can be run anywhere

  • Apache Superset, a data visualization platform and data API

  • OpenFUN Ralph, a Learning Record Store (and more) that can validate and store xAPI statements in ClickHouse

  • Vector, a log forwarding tool that can be used to forward tracking log and xAPI data to ClickHouse

  • event-routing-backends, an Open edX plugin that transforms tracking logs into xAPI and optionally forwards them to one or more Learning Record Stores in near real time

  • event-sink-clickhouse, an Open edX plugin that exports course structure and high-level data to ClickHouse at publish time

  • dbt, a tool to build data pipelines from SQL queries. The dbt project used by this plugin is aspects-dbt.

See https://github.com/openedx/openedx-aspects for more details about the Aspects architecture and high level documentation.

Aspects is a community-developed effort combining the Cairn project by Overhang.io and the OARS project by EduNEXT, OpenCraft, and Axim Collaborative.

Note: Aspects is beta and not yet production ready! Please feel free to experiment with the system and offer feedback about what you’d like to see by adding Issues in this repository. Current details on the beta progress can be found here: https://openedx.atlassian.net/wiki/spaces/COMM/pages/3861512203/Aspects+Beta

Compatibility

This plugin is compatible with Tutor 15.0.0 and later and is expected to be compatible with Open edX releases from Nutmeg forward.

Installation

Aspects is implemented as a Tutor plugin. Documentation covering how to install Aspects in non-Tutor environments is coming soon, but by far the easiest way to install and try it is via Tutor. These instructions assume you are running a tutor local installation, which is the fastest and easiest way to get started.

  1. Install Tutor: https://docs.tutor.overhang.io/install.html#install

  2. Create an admin user on the LMS: https://docs.tutor.overhang.io/whatnext.html#logging-in-as-administrator

  3. Install the Aspects plugin (in your Tutor Python environment):

    pip install tutor-contrib-aspects
  4. Enable the plugin:

    tutor plugins enable aspects
  5. Save the changes to the environment:

    tutor config save
  6. Because we’re installing new applications in the LMS (event-routing-backends, event-sink-clickhouse), you will need to rebuild your openedx Docker image:

    tutor images build openedx --no-cache
  7. Build the Aspects-flavored Superset image to bake your settings (such as database passwords) into the Superset assets:

    tutor images build aspects-superset
  8. Run the initialization scripts:

    tutor local do init
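
If the initialization scripts complete without errors, you can run a quick smoke test (a minimal check, assuming a default tutor local deployment; the service names mentioned below come from this plugin's defaults):

    # Confirm the Aspects containers (e.g. clickhouse, superset, ralph, vector) are running:
    tutor local dc ps

    # Tail the logs of an individual service if something looks wrong:
    tutor local logs --tail 100 clickhouse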

At this point you should have a working Tutor / Aspects environment, but with no way to create data! There are a few options for how to proceed.

  1. If you would just like to see some data populated in the charts without loading a real course in the LMS, you can create test data in the database (use --help for usage):

    tutor local do load-xapi-test-data
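    # Assumption: batch-sizing options along these lines exist at the time of
    # writing; confirm the exact names with --help before relying on them:
    tutor local do load-xapi-test-data --num_batches 10 --batch_size 100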
  2. OR Load the demo course and generate real data from the LMS:

    1. https://docs.tutor.overhang.io/whatnext.html#importing-a-demo-course

    2. Log into the LMS with your admin user and enroll / proceed through the demo course

  3. OR If you are adding Aspects to an existing LMS that already has data:

    1. Sink course data from the LMS to ClickHouse (see https://github.com/openedx/openedx-event-sink-clickhouse for more information):

      tutor local do dump-data-to-clickhouse --options "--object course_overviews"
    2. Sink historical event data to ClickHouse:

      tutor [dev|local] do transform_tracking_logs \
        --source_provider LOCAL \
        --source_config '{"key": "/openedx/data", "container": "logs", "prefix": "tracking.log"}' \
        --transformer_type xapi

      # Note that this works only for a default Tutor installation. If you store your
      # tracking logs any other way, you need to change the source_config option
      # accordingly. See
      # https://event-routing-backends.readthedocs.io/en/latest/howto/how_to_bulk_transform.html#sources-and-destinations
      # for details on how to change the source_config option.
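      # Illustrative only: an S3 variant. The credentials, bucket, and prefix are
      # placeholders, and the source_config keys shown are an assumption based on the
      # event-routing-backends documentation linked above; check it for the
      # authoritative list of providers and options.
      tutor [dev|local] do transform_tracking_logs \
        --source_provider S3 \
        --source_config '{"key": "AWS_ACCESS_KEY_ID", "secret": "AWS_SECRET_ACCESS_KEY", "container": "my-bucket", "prefix": "tracking"}' \
        --transformer_type xapi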
  4. If your assets have changed since the last time you ran init, you will need to rebuild the aspects-superset image and re-import the assets:

    tutor images build aspects-superset --no-cache
    tutor local do import-assets
  5. Make sure to build and push your Superset image in the following cases:

    1. If you have made changes to the Superset assets.

    2. If you have made changes to the ClickHouse/dbt schema.

    3. If you are using custom translations.

You should now have data to look at in Superset! Log in to https://superset.local.overhang.io/ with your admin account and you should see charts with your data.

Superset Assets

Aspects maintains the Superset assets in this repository, specifically the dashboards, charts, datasets, and databases. That means that any updates made here will be reflected on your Superset instance when you update your deployment.

But it also means that any local changes you make to these assets will be overwritten when you update your deployment. To prevent your local changes from being overwritten, please create new assets and make your changes there instead. You can copy an existing asset by editing the asset in Superset and selecting “Save As” to save it to a new name.

Note: If you are using custom assets, you will need to rebuild your aspects-superset image on your local machine with tutor images build aspects-superset --no-cache.

Sharing Charts and Dashboards

To share your charts with others in the community, use Superset’s “Export” button to save a zip file of your charts and related datasets.

To import charts or dashboards shared by someone in the community:

  1. Expand the zip file and look for any files added under databases. Update the sqlalchemy_uri to match your database’s connection details (see the illustrative example after this list).

  2. Compress the files back into a .zip file.

  3. On the Charts or Dashboards page, use the “Import” button to upload your .zip file.
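
For reference, each file under databases in the export is a small YAML document. A trimmed, illustrative example (the connection values are placeholders, and the exact fields vary by Superset version):

    # databases/MyClickHouse.yaml (illustrative)
    database_name: MyClickHouse
    sqlalchemy_uri: clickhousedb+connect://user:password@clickhouse:8123/reporting
    expose_in_sqllab: true
    uuid: 11111111-1111-1111-1111-111111111111
    version: 1.0.0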

Contributing Charts and Dashboards to Aspects

The Superset assets provided by Aspects can be found in the templated tutoraspects/templates/aspects/build/aspects-superset/openedx-assets/assets/ directory. For the most part, these files are what Superset exports, but with some crucial differences which make these assets usable across all Tutor deployments.

To contribute assets to Aspects:

  1. Fork this repository and have a locally running Tutor set up with this plugin installed.

  2. Export the assets you want to contribute as described in Sharing Charts and Dashboards.

  3. Run the command: tutor aspects import_superset_zip ~/Downloads/your_file.zip

  4. This command will copy the files from your zip to the assets directory and attempt to warn you if there are hard-coded connection settings where it expects template variables. These are usually in database and dataset assets, and those are often assets that already exist. The warnings look like:

    WARN: fact_enrollments.yaml has schema set to reporting instead of a setting.

  5. Check the diff of files and update any database connection strings or table names to use Tutor configuration template variables instead of hard-coded strings, e.g. replace clickhouse with {{CLICKHOUSE_HOST}}. Passwords can be left as {{CLICKHOUSE_PASSWORD}}, though be aware that if you are adding new databases, you’ll need to update SUPERSET_DB_PASSWORDS in the init scripts. Here is the default connection string for reference:

    clickhousedb+connect://{{CLICKHOUSE_REPORT_URL}}
  6. You will likely also run into issues where our SQL templates have been expanded into their actual SQL. If you haven’t changed the SQL of these queries (stored in tutoraspects/templates/openedx-assets/queries), you can just revert that change back to their include values, such as: sql: "{% include 'openedx-assets/queries/fact_enrollments_by_day.sql' %}" (see the illustrative fragment after this list).

  7. The script will also warn about missing _roles in dashboards. Superset does not export these, so you will need to manually add this key with the roles that are necessary to view the dashboard. See the existing dashboards for how this is done.

  8. Rebuild your aspects-superset image with tutor images build aspects-superset --no-cache.

  9. Run the command tutor aspects check_superset_assets to confirm there are no duplicate assets; duplicates can happen when you rename an asset and will cause the import to fail. The command will automatically delete the older file if it finds one.

  10. Check that everything imports correctly by running tutor local do import-assets and confirming there are no errors.

  11. Double check that your database password did not get exported before committing!

  12. Commit and submit a PR with screenshots of your new chart or dashboards, along with an explanation of what data question they answer.
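
To make steps 5 and 6 concrete, here is an illustrative fragment of a templated dataset yaml. The file and table names are hypothetical, and the configuration variable shown is an assumption; use whichever Tutor setting applies in your deployment:

    # datasets/fact_enrollments.yaml (illustrative fragment)
    table_name: fact_enrollments
    # A hard-coded value like "schema: reporting" triggers the WARN shown earlier;
    # reference a configuration variable instead:
    schema: "{{ DBT_PROFILE_TARGET_DATABASE }}"
    sql: "{% include 'openedx-assets/queries/fact_enrollments_by_day.sql' %}"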

Virtual datasets in Superset

Superset supports creating virtual datasets, which are datasets defined using a SQL query instead of mapping directly to an underlying database object. Aspects leverages virtual datasets, along with SQL templating, to make better use of table indexes.

To make it easier for developers to manage virtual datasets, there is an extra step that can be done on the output of tutor aspects serialize. The sql section of the dataset yaml can be moved to its own file in the queries directory and included in the yaml like so:

sql: "{% include 'openedx-assets/queries/query.sql' %}"

However, please keep in mind that the assets declaration is itself a jinja template. That means that any jinja used in the dataset definition should be escaped. There are examples of how to handle this in the existing queries, such as dim_courses.sql.
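
For example, a query that uses Superset’s runtime Jinja features (such as filter_values, from Superset’s templating documentation) must wrap those expressions so that Tutor’s template pass leaves them alone. A minimal sketch, with a hypothetical query file and column names:

    -- openedx-assets/queries/my_query.sql (hypothetical)
    -- {% raw %} ... {% endraw %} keeps Tutor's Jinja pass from evaluating the
    -- Superset-time expression inside:
    select org, course_key, course_name
    from dim_courses
    where org in {% raw %}{{ filter_values('org') | where_in }}{% endraw %}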

Releasing tutor-contrib-aspects

Changelog, package version, PyPI release, and image building are all handled via manually triggered GitHub Actions.

To trigger a build you must have access to manually trigger the “Bump version and changelog” action. This will update the version and changelog in a new PR. If the PR looks good, you can approve and merge it. Merging this PR will:

  • Trigger the “release” workflow, which will tag a GitHub release with the new version number and then push the release to PyPI

  • Trigger the “build-image” workflow, which builds and pushes our aspects, aspects-superset, and openedx images to the EduNEXT DockerHub repositories

When the workflows are finished you should confirm that you see the new version on PyPI and images in DockerHub.
