Project description
ga-attribution-scrape
Scrapes attribution data from GA through JS Network in Python for CSV exports.
Notes
- The program treats conversions separately and does not currently try to sum conversions together from separate conversion IDs.
- When dealing with GA goals, the program pulls each goal as a separate request.
- Uses a Service Account for authentication.
How to run
First import the Scrape function:
from ga_attribution_scrape import Scrape
Then initialise ga_attribution_scrape with Scrape(), passing a config dictionary to Goals(). An empty config template can be found at https://github.com/lewisaustinbryan/ga-attribution-scrape/blob/main/empty_config.yaml
Scrape().Goals(config)
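The config dictionary can be built from a filled-in copy of that YAML template, for example (a minimal sketch assuming PyYAML is installed and the template has been saved locally as config.yaml; both are assumptions, not requirements of the package):

import yaml

# Load a filled-in copy of empty_config.yaml into a dictionary.
# The file name config.yaml is a placeholder.
with open("config.yaml") as f:
    config = yaml.safe_load(f)

ga_attribution = Scrape().Goals(config)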
config
Service Account
You have to create a service account in Google Cloud Platform that has BigQuery access, and GA access if you want to use a goal as a KPI for attribution reports. Help on creating one can be found at https://cloud.google.com/iam/docs/creating-managing-service-accounts. Separate service accounts can be created for GA and BigQuery.
There are four main parts to the configuration:
GA
Here you need to include the Account ID, Property ID and View ID.
Bigquery
Include the Dataset ID and Table ID to tell BigQuery where to put the attribution reports.
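As a rough sketch, the GA and Bigquery parts of the config dictionary might look like the following Python dict; the exact key names must be taken from empty_config.yaml, and the values below are placeholders:

# Illustrative placeholders only -- copy the real key names from empty_config.yaml.
config_sketch = {
    "GA": {
        "account_id": "12345678",        # GA Account ID
        "property_id": "UA-12345678-1",  # GA Property ID
        "view_id": "98765432",           # GA View ID
    },
    "Bigquery": {
        "dataset_id": "attribution",     # BigQuery dataset for the reports
        "table_id": "model_comparison",  # BigQuery table for the reports
    },
}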
Backdate
If backdate is True, the program will just pull yesterday's data; otherwise it will loop through each day between the specified start_date and end_date.
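As a sketch, the backdate settings could look like this; only backdate, start_date and end_date are named in this README, and the date format shown is an assumption:

# backdate=True  -> pull only yesterday's data
# backdate=False -> loop over each day from start_date to end_date
backdate_sketch = {
    "backdate": False,
    "start_date": "2021-01-01",
    "end_date": "2021-01-31",
}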
Unless you explicitly set GOOGLE_APPLICATION_CREDENTIALS in the environment (e.g. using the os module; see the sketch below), be aware that the program expects you to run a backdate first with the service account; then, when backdate is False, it refreshes the service account already in use. There is no other option, but it makes the program very easy to put into a Cloud/Gamma Function.
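Setting the credentials explicitly with the os module looks like this (the key-file path is a placeholder):

import os

# Point the Google client libraries at the service-account key file
# before initialising the scraper.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account-key.json"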
Request
This is where we copy the request from the JS network tab in Google Analytics, in the "Conversions -> Multi Channel Funnels -> Model Comparison Tool" report, for the request URL https://analytics.google.com/analytics/web/exportReport/, which will have various query parameters associated with it.
Copy everything from Request Headers and Form Data into the fields included in the empty_config.
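As an illustration only, the copied values end up in the config roughly like this; the header and form-data names shown are placeholders, and the real fields to fill are those provided in empty_config.yaml:

# Illustrative only -- paste the actual values captured from the
# exportReport request in the browser's Network tab.
request_sketch = {
    "Request": {
        "headers": {
            "cookie": "<copied cookie string>",
            "user-agent": "<copied user agent>",
        },
        "form_data": "<copied form data string>",
    },
}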
Get Attribution Report and send to BigQuery
from ga_attribution_scrape import Scrape
# Pull the attribution report(s) for the goals defined in the config
ga_attribution = Scrape().Goals(config)
# Write the report(s) to the BigQuery table specified in the config
ga_attribution.to_bq()
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file ga-attribution-scrape-0.1.0.tar.gz.
File metadata
- Download URL: ga-attribution-scrape-0.1.0.tar.gz
- Upload date:
- Size: 5.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.7.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | e27b68e12c0cda7fc92e9b14d122e181b236530611befb0075cb49ee9298c8b8
MD5 | 4b76c728c83b8d673f4fa45ace6589b3
BLAKE2b-256 | 0a10acbfe4f5cf0d842b343c6a33c201ec6650940ac15f516b5e0924506e0ee1
File details
Details for the file ga_attribution_scrape-0.1.0-py3-none-any.whl.
File metadata
- Download URL: ga_attribution_scrape-0.1.0-py3-none-any.whl
- Upload date:
- Size: 6.1 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.51.0 CPython/3.7.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | 045fcfcabedacc2ef35691187baadc498f8ee58964463f74e98f5c09ca122451
MD5 | 9c8477f6c3bffa316ec4fc28ad98ab52
BLAKE2b-256 | 07fef8c2e0ee8ceff907f8e8d8ac4c58fea69bd05ff0cbaf65564b75a1838639