A simple preview of DSP digital advertising information

Project description

Tool for merging DSP data from multiple providers into a single view. CLI tools launch the workers responsible for parsing .csv files and storing their contents in a MySQL database, and a web app presents a complete report of the operation.

Usage

You must set the following environment variables before using the tool:

  • DB_HOST the URL or IP address of the MySQL server

  • DB_PORT the port of the MySQL server

  • DB_NAME the name of the database

  • DB_USER a user with write permission

  • DB_PASS the user’s password

If you intend to develop or change something, the following are also needed:

  • DB_TEST_NAME the name of the database (for testing purposes)

  • DB_TEST_USER a user with write permission (for testing purposes)

  • DB_TEST_PASS the user’s password (for testing purposes)

Since this project currently supports only GCP, the following environment variables are also mandatory:

  • GOOGLE_APPLICATION_CREDENTIALS the JSON key file of a service account with admin permissions for the Storage service

  • GCP_BUCKET the bucket where the .csv files will be placed

  • GCP_BUCKET_ARCHIVE the bucket where the .csv files will be archived after being processed
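Put together, a minimal shell setup of these variables might look like the following (all values are placeholders, not real credentials):

```shell
# Database connection (placeholder values)
export DB_HOST="db.example.com"
export DB_PORT="3306"
export DB_NAME="dspreview"
export DB_USER="dsp_writer"
export DB_PASS="secret"

# GCP settings (placeholder values)
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/service_account.json"
export GCP_BUCKET="my-dsp-bucket"
export GCP_BUCKET_ARCHIVE="my-dsp-archive"
```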

A much better option is to set all these variables in a file named .dspreview.csg in the user’s home folder:

{
    "GOOGLE_APPLICATION_CREDENTIALS": "/home/user/service_account.json",
    "GCP_BUCKET": "...",
    "DB_HOST": "...",
    "DB_PORT": "3306",
    "DB_NAME": "...",
    "DB_USER": "...",
    "DB_PASS": "..."
}
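dspreview reads this file by itself; purely as an illustration, a loader along the following lines (a hypothetical helper, not part of the package) could merge the file's contents with variables already set in the environment:

```python
import json
import os


def load_config(path="~/.dspreview.csg"):
    """Load settings from the JSON config file, letting any
    environment variables that are already set take precedence.
    Illustrative only -- not the package's own loader."""
    expanded = os.path.expanduser(path)
    config = {}
    if os.path.exists(expanded):
        with open(expanded) as fp:
            config.update(json.load(fp))
    # Environment variables override values from the file
    for key in list(config):
        if key in os.environ:
            config[key] = os.environ[key]
    return config
```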

Once the above environment variables are set, you can initialize the system, which creates the database, tables, and so on. This is done through:

$ dspreview init

There are currently two workers: dcm and dsp. The dcm worker expects to find a file named dcm.csv inside the GCP_BUCKET, with the following structure:

[date, campaign_id, campaign, placement_id, placement, impressions, clicks, reach]

where:

  • date should be in format YYYY-MM-DD

  • campaign_id is an integer

  • campaign is a string in the format brand_subbrand

  • placement_id is an integer

  • placement is a string in the format dsp_adtype

  • impressions is an integer

  • clicks is an integer

  • reach is a float; take care not to repeat it, since it is a calculated metric
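As an illustration of the expected layout, the snippet below builds a one-row dcm.csv in memory (the values are made up for the example):

```python
import csv
import io

# Column order expected by the dcm worker
DCM_COLUMNS = ["date", "campaign_id", "campaign", "placement_id",
               "placement", "impressions", "clicks", "reach"]

# One illustrative row: campaign carries brand_subbrand,
# placement carries dsp_adtype
sample_row = ["2018-09-01", 1001, "acme_sparkling", 2002,
              "dbm_banner", 150000, 320, 98540.5]

buffer = io.StringIO()
writer = csv.writer(buffer)
writer.writerow(DCM_COLUMNS)
writer.writerow(sample_row)
print(buffer.getvalue())
```

In practice the file would be uploaded to the GCP_BUCKET bucket rather than printed.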

The dsp worker expects to find a file named after the DSP (such as dbm.csv or mediamath.csv), with the following structure:

[date, campaign_id, campaign, impressions, clicks, cost]

where:

  • date should be in format YYYY-MM-DD

  • campaign_id is an integer

  • campaign is a string in the format brand_subbrand_adtype

  • impressions is an integer

  • clicks is an integer

  • cost is a float
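Since these workers rely on underscore-separated fields, a small helper illustrates how a dsp campaign string could be broken apart (a hypothetical function, not the package's own parser):

```python
def split_campaign(campaign):
    """Split a dsp campaign string 'brand_subbrand_adtype' into its
    three parts. Hypothetical helper for illustration only."""
    brand, subbrand, adtype = campaign.split("_", 2)
    return {"brand": brand, "subbrand": subbrand, "adtype": adtype}
```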

To launch a worker, use the command:

$ dspreview --worker dcm

or:

$ dspreview --worker dsp

If the DSP is known in advance, you might run:

$ dspreview --worker dsp --dsp dbm

or

$ dspreview --worker dsp --dsp mediamath

When all files are stored in the MySQL database, the following command generates the full report:

$ dspreview --generate-report
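The report logic lives inside dspreview; conceptually, it joins the dcm metrics (which carry reach) with the dsp metrics (which carry cost). A rough sketch of that idea, assuming the join key is date plus brand, sub-brand, and ad type (the key and values here are made up):

```python
# Hypothetical, pre-parsed rows keyed by (date, brand, subbrand, adtype)
dcm_rows = {
    ("2018-09-01", "acme", "sparkling", "banner"):
        {"impressions": 150000, "clicks": 320, "reach": 98540.5},
}
dsp_rows = {
    ("2018-09-01", "acme", "sparkling", "banner"):
        {"impressions": 149200, "clicks": 315, "cost": 1250.75},
}

# Union of keys, so campaigns seen by only one side still show up
report = {}
for key in set(dcm_rows) | set(dsp_rows):
    merged = dict(dcm_rows.get(key, {}))
    # dsp cost complements the dcm metrics for the same key
    merged["cost"] = dsp_rows.get(key, {}).get("cost", 0.0)
    report[key] = merged
```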

The web app can be run through:

$ dspreview serve --port 80

The default port is 80.

Preparing for Development

  1. Ensure pip and pipenv are installed.

  2. Make sure you also have default-libmysqlclient-dev or libmysqlclient-dev installed.

  3. Clone repository: https://github.com/thiagolcmelo/dspreview

  4. Fetch development dependencies: make install

Running Tests

Run the tests locally using make if the virtualenv is active:

$ make

If the virtualenv isn’t active, use:

$ pipenv run make

Download files

Download the file for your platform.

Source Distribution

dspreview-0.1.3.tar.gz (1.1 MB)


Built Distribution


dspreview-0.1.3-py2.py3-none-any.whl (1.1 MB)


File details

Details for the file dspreview-0.1.3.tar.gz.

File metadata

  • Download URL: dspreview-0.1.3.tar.gz
  • Upload date:
  • Size: 1.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.2.0 requests-toolbelt/0.8.0 tqdm/4.25.0 CPython/3.6.3

File hashes

Hashes for dspreview-0.1.3.tar.gz
Algorithm Hash digest
SHA256 6c3835804cfc9101fb37117c9ed14a6262a2533bff98f33ef1c8bb4cb1d21041
MD5 829c11320dd608e8a63edbd5d326e9b2
BLAKE2b-256 4a6939acbf9001c1766626093df269d7f1127d00a5edaf1709925a2728507e8b


File details

Details for the file dspreview-0.1.3-py2.py3-none-any.whl.

File metadata

  • Download URL: dspreview-0.1.3-py2.py3-none-any.whl
  • Upload date:
  • Size: 1.1 MB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.11.0 pkginfo/1.4.2 requests/2.19.1 setuptools/40.2.0 requests-toolbelt/0.8.0 tqdm/4.25.0 CPython/3.6.3

File hashes

Hashes for dspreview-0.1.3-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 ecadf917a143a8b30a020ffd06e7571c7746b836c1969b6c42c5bd6f2acb3f35
MD5 67844c1c6d731cf689133721eea32805
BLAKE2b-256 f202d83c3929317670a5b4fc84a9fda5c191cd83a9119dd8957accab1f0f60d6

