
Command line tool for convenient access to Coursera Research Data Exports.

Project Description

This project is a library consisting of a command line interface and a client for interacting with Coursera's research exports. Up-to-date documentation of the data provided by Coursera for research purposes is available on GitBook in the Coursera Data Exports Guide.


Installation

To install this package, execute:

sudo pip install courseraresearchexports

pip is a Python package manager. If you do not have pip installed on your machine, please follow the installation instructions for your platform.

Note: the containers subcommand requires Docker to already be installed on your machine. Please see the Docker installation instructions for platform-specific information.


Setup

Authorize your application using courseraoauth2client:

courseraoauth2client config authorize --app manage_research_exports

To use the containers functionality, a Docker instance must be running. Please see the Docker getting started guide for installation instructions for your platform.

Command Line Interface

The project includes a command line tool. Run:

courseraresearchexports -h

for a complete list of features, flags, and documentation. Similarly, documentation for the subcommands listed below is also available (e.g. for jobs) by running:

courseraresearchexports jobs -h


jobs

Submit a research export request or retrieve the status of pending and completed export jobs.


request

Creates a data export job request and returns the export request id. To create a data export request for all available tables in a course:

courseraresearchexports jobs request tables --course_slug $COURSE_SLUG \
    --purpose "testing data export"

Replace $COURSE_SLUG with your course slug (the slug is the part of the course URL after /learn; for https://www.coursera.org/learn/machine-learning, the slug is machine-learning).
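If you are scripting export requests, the slug can be derived from a course URL. A minimal sketch (the course_slug helper below is hypothetical, not part of this package):

```python
from urllib.parse import urlparse

def course_slug(url):
    """Return the path segment that follows /learn in a Coursera course URL."""
    parts = urlparse(url).path.strip("/").split("/")
    # e.g. ["learn", "machine-learning"] -> "machine-learning"
    return parts[parts.index("learn") + 1]

print(course_slug("https://www.coursera.org/learn/machine-learning"))
```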

If a more limited set of data is required, you can specify which schemas to include in the export (e.g., for the demographics tables):

courseraresearchexports jobs request tables --course_slug $COURSE_SLUG \
    --schemas demographics --purpose "testing data export"

For more information on the available tables/schemas, please refer to the Coursera Data Exports Guide.

If you are a data coordinator, you can request that user ids are linked between domains of the data export:

courseraresearchexports jobs request tables --course_slug $COURSE_SLUG \
    --purpose "testing data export" --user_id_hashing linked

Data coordinators can also request clickstream exports:

courseraresearchexports jobs request clickstream --course_slug $COURSE_SLUG \
    --interval 2016-09-01 2016-09-02 --purpose "testing data export"
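Clickstream exports are requested over an interval; when one export per day is wanted, the interval arguments can be generated in a short script. A sketch using only the flags documented above (daily_intervals is a hypothetical helper):

```python
from datetime import date, timedelta

def daily_intervals(start, end):
    """Yield consecutive one-day (start, end) ISO-date pairs covering [start, end)."""
    day = start
    while day < end:
        yield day.isoformat(), (day + timedelta(days=1)).isoformat()
        day += timedelta(days=1)

# Print one clickstream request command per day in the range.
for lo, hi in daily_intervals(date(2016, 9, 1), date(2016, 9, 3)):
    print("courseraresearchexports jobs request clickstream "
          "--course_slug $COURSE_SLUG "
          '--interval {} {} --purpose "testing data export"'.format(lo, hi))
```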


get_all

Lists the details and status of all data export requests that you have made:

courseraresearchexports jobs get_all


get

Retrieve the details and status of an export request:

courseraresearchexports jobs get $EXPORT_REQUEST_ID


download

Download a completed tables export to a local destination:

courseraresearchexports jobs download $EXPORT_REQUEST_ID



containers

create

Creates a Docker container using the postgres image and loads the export data into a postgres database running in the container. To create a Docker container from an export, first request an export using the jobs command. Then, using the $EXPORT_REQUEST_ID, create a container with:

courseraresearchexports containers create --export_request_id $EXPORT_REQUEST_ID

This will download the data export and load all the data into the database running in the container; this may take some time depending on the size of your export. To create a Docker container from an already-downloaded export (please decompress the archive first):

courseraresearchexports containers create --export_data_folder /path/to/data_export/
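If the archive was downloaded manually, it must be decompressed before its folder is passed to --export_data_folder. A minimal sketch using the standard library (unpack_export is a hypothetical helper, and the archive is assumed to be a format shutil recognizes, e.g. zip or tar.gz):

```python
import shutil

def unpack_export(archive_path, dest_dir):
    """Decompress a downloaded export archive into dest_dir and return that folder."""
    shutil.unpack_archive(archive_path, dest_dir)
    return dest_dir
```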

After creation, use the list command to check the status of the container and view the container name, database name, address, and port needed to connect to the database. Using Docker, you can connect to the database by running:

docker run -it --rm --link $CONTAINER_NAME postgres:9.5 psql -h $CONTAINER_NAME -d $DATABASE_NAME -U postgres

If you have psql installed, you can connect with:

psql -p $HOST_PORT -h $HOST_IP -d $DATABASE_NAME -U postgres

substituting the connection parameters reported by list.
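The psql invocation can also be assembled in a script from the connection details that list prints. A small sketch (the helper and the host/port/database values below are placeholders, not output from a real container):

```python
def psql_command(host_ip, host_port, database_name, user="postgres"):
    """Build the psql command line from the connection details shown by `containers list`."""
    return "psql -p {} -h {} -d {} -U {}".format(host_port, host_ip, database_name, user)

print(psql_command("192.168.99.100", 5433, "course_db"))
```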


list

Lists the details of all the containers created by courseraresearchexports:

courseraresearchexports containers list


start

Start a container:

courseraresearchexports containers start $CONTAINER_NAME_OR_ID


stop

Stop a container:

courseraresearchexports containers stop $CONTAINER_NAME_OR_ID


remove

Remove a container:

courseraresearchexports containers remove $CONTAINER_NAME_OR_ID

Bugs / Issues / Feature Requests

Please use the GitHub issue tracker to report any bugs or other issues you encounter while using this tool.

Developing / Contributing

We recommend developing courseraresearchexports within a python virtualenv. To get your environment set up properly, do the following:

virtualenv venv
source venv/bin/activate
python setup.py develop
pip install -r test_requirements.txt


To run the tests, simply run nosetests (or tox).

Code Style

Code should conform to PEP 8 style requirements. To check, simply run:

pep8 courseraresearchexports tests
