Project description

ga-attribution-scrape

Scrapes attribution data from GA through JS Network in Python for CSV exports.

Notes
  • The program assumes separate conversions and does not currently try to sum conversions together from separate conversion IDs.
  • When dealing with GA goals, it will pull all goals as separate requests.
  • Uses a Service Account for authentication.
How to run

First import the Scrape function:

from ga_attribution_scrape import Scrape

Then initialise ga_attribution_scrape by calling the Scrape() function with a config dictionary. An empty config template can be found at https://github.com/lewisaustinbryan/ga-attribution-scrape/blob/main/empty_config.yaml
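
A minimal sketch of that initialisation, assuming the template has been filled in and saved locally as config.yaml (the filename and the use of PyYAML here are illustrative, not part of the package):

import yaml  # PyYAML, used here only to turn the filled-in YAML template into a dict
from ga_attribution_scrape import Scrape

# Load the completed copy of empty_config.yaml into a plain dictionary
with open("config.yaml") as f:
    config = yaml.safe_load(f)

# Initialise the scraper with the config dictionary
scraper = Scrape(config)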

config

Service Account

You need to create a service account in Google Cloud Platform with BigQuery access, and with GA access if you want to use a goal as a KPI for attribution reports. Help on creating one can be found at https://cloud.google.com/iam/docs/creating-managing-service-accounts. Separate service accounts can be created for GA and BigQuery.
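
As a rough sketch (the key file path and the scope below are illustrative, not dictated by the package), a service account credentials object of the kind used later by get_ga_goals can be built with google-auth:

from google.oauth2 import service_account

# Key file downloaded for the service account in Google Cloud Platform (path is illustrative)
credentials = service_account.Credentials.from_service_account_file(
    "service_account.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)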

There are four main parts to the configuration:

GA

Here you need to include the account ID, property ID and view ID.

Bigquery

Include the dataset ID and table ID to tell BigQuery where to put the attribution reports.

Backdate

If backdate is True then the program will just pull yesterday's data; otherwise it will loop through each day between the specified start_date and end_date.

Unless you explicitly set GOOGLE_APPLICATION_CREDENTIALS in the environment (e.g. using the os module), be aware that the program expects you to backdate first with the service account; then, when backdate is False, it refreshes the service used. There is no other option, but it makes it very easy to put into a Cloud/Gamma Function.
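
For illustration, setting the environment variable with the os module might look like the following; the config values shown for the GA, Bigquery and Backdate parts are placeholders, since the real key names are laid out in empty_config.yaml:

import os

# Point the Google client libraries at the service account key file (path is illustrative)
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service_account.json"

# Placeholder values only; the actual key names come from empty_config.yaml
config = {
    "ga": {"account_id": "12345678", "property_id": "UA-12345678-1", "view_id": "87654321"},
    "bigquery": {"dataset_id": "attribution", "table_id": "model_comparison"},
    "backdate": True,            # True: pull yesterday's data only
    "start_date": "2021-01-01",  # used when backdate is False
    "end_date": "2021-01-31",
}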

Request

This is where you copy the request from the JS network tab in Google Analytics, in the "Conversions -> Multi Channel Funnels -> Model Comparison Tool" report, for the request URL https://analytics.google.com/analytics/web/exportReport/, which will have various query parameters associated with it.

Copy everything from Request Headers and Form Data, which are included in the empty_config.
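
A hedged sketch of what the Request part of the config might look like once the copied values are pasted in; the field names below are placeholders, and the real ones are defined in empty_config.yaml:

# Placeholder structure only; the real field names come from empty_config.yaml
request = {
    "url": "https://analytics.google.com/analytics/web/exportReport/",
    "headers": {
        # everything copied from Request Headers in the browser's network tab,
        # e.g. the cookie and user-agent values
    },
    "form_data": {
        # everything copied from Form Data in the browser's network tab
    },
}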

Get Attribution Report

Import get_ga_goals:

from ga_attribution_scrape import get_ga_goals

Get the goals from the GA management API and create a pandas dataframe:

ga_goal_management_data = get_ga_goals(accountId, propertyId, viewId, credentials) # where credentials is the service account

Now scrape the GA attribution report for each goal according to the configuration:

Scrape(config).goals(ga_goal_management_data)



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

ga-attribution-scrape-0.0.9.tar.gz (4.3 kB)

Uploaded Source

Built Distribution

ga_attribution_scrape-0.0.9-py3-none-any.whl (4.8 kB)

Uploaded Python 3

File details

Details for the file ga-attribution-scrape-0.0.9.tar.gz.

File metadata

  • Download URL: ga-attribution-scrape-0.0.9.tar.gz
  • Upload date:
  • Size: 4.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.7.4

File hashes

Hashes for ga-attribution-scrape-0.0.9.tar.gz
  • SHA256: 6b8902026a5430b37fa3d2fabc33c53644e69a02112353b60cefd7954ca87700
  • MD5: 231fa8fb2a87afba6b8f69145e3898dc
  • BLAKE2b-256: 89648275b749a420a6682cc722c0e2db4768fc13652cadcde8505447f5c8e3c8

See more details on using hashes here.

File details

Details for the file ga_attribution_scrape-0.0.9-py3-none-any.whl.

File metadata

  • Download URL: ga_attribution_scrape-0.0.9-py3-none-any.whl
  • Upload date:
  • Size: 4.8 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.7.4

File hashes

Hashes for ga_attribution_scrape-0.0.9-py3-none-any.whl
  • SHA256: 4b971688beb1795c230e20189cf4a5f8e9ced7a713f907835ad431b5a789ef2d
  • MD5: 0ffde0ee78ee534a15c1487c5924b640
  • BLAKE2b-256: 28caceb1b2fb6bd396db20f4783bef73a530ffe3d9e7178755f996012987e9dd

See more details on using hashes here.
