Scalable, asynchronous SalesForce (SFDC) report downloader for SSO/SAML clients
SFrout - Sales Force Report Downloader
What is it?
SFrout is a scalable, asynchronous SalesForce report downloader for SAML/SSO clients. The app lets you download reports by their ID using your personal SFDC account. It supports asynchronous requests and threaded processing of the files, logs to a rotating file and to stdout, and produces a summary report for the session.
Installation
SFrout requires Python 3.11 to work properly.
- navigate to a convenient folder (optional)

```shell
cd c:/path/to/venv
```

- create a Python virtual environment (optional)

```shell
python -m venv c:/path/to/venv
```

- activate the virtual environment

  for Windows:

  ```bat
  c:\path\to\venv\Scripts\activate.bat
  ```

  for Unix:

  ```shell
  source /path/to/venv/bin/activate
  ```

- install sfrout from PyPI

```shell
pip install sfrout
```
Usage
Python interface
- create a Python file and paste in the script below. Amend the domain and the path to your reports file:

```python
import sfrout

sfrout.run(domain="https://corp.my.salesforce.com/",
           reports_path="C:\\path\\to\\reports.csv")
```

- fill in the reports.csv file according to the given template
- execute the script

```shell
python "c:/path/to/your/file.py"
```
- shortly after, you may be prompted to log in to SalesForce in MS Edge; the website will open automatically
- next, a progress bar will show up
- once finished, SFrout will print a summary table
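The reports.csv template itself is not reproduced on this page. Judging from the field order used by the CLI's `-r/--report` option (`name,id,path,optional_report_params`), a file could plausibly look like the sketch below; the report names, IDs, paths, and the optional parameter are made-up placeholders, not values from the project:

```csv
name,id,path,optional_report_params
sales_pipeline,00O1234567890AB,C:\reports\sales,
closed_deals,00O0987654321CD,C:\reports\deals,?pv0=2023
```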
CLI
- open a terminal (e.g. cmd.exe)
- type in the command:

```shell
sfrout "https://corp.my.salesforce.com/" "C:\\path\\to\\reports.csv"
```
- fill in reports.csv file according to given template
The CLI lets you configure parameters according to the list below:
```
Options:
  -s, --summary_filepath PATH  Path to the summary report ->
                               c:/summary_report.csv
  -r, --report TEXT            Run single report ->
                               "name,id,path,optional_report_params"
  -p, --path PATH              Override save location of the reports
  -t, --threads INTEGER        Number of threads to spawn  [default: 0]
  -ls, --stdout_loglevel TEXT  STDOUT logging level -> [DEBUG | INFO | WARN |
                               WARNING | ERROR | CRITICAL]  [default: WARNING]
  -lf, --file_loglevel TEXT    File logging level -> [DEBUG | INFO | WARN |
                               WARNING | ERROR | CRITICAL]  [default: INFO]
  -v, --verbose                Turn off progress bar
  -h, --help                   Show this message and exit.
```
Windows Task Scheduler
Create sfrout.bat and save it with the script below:

```bat
"c:\path\to\new\virtual\environment\Scripts\python.exe" sfrout "https://corp.my.salesforce.com/" "C:\\path\\to\\reports.csv"
pause
```

Test it; if it works, create a task and set a schedule.
How the program works
Once you run the program:
- configuration and data parsing take place
- connector, report, and shared queue objects are created
- workers (listeners) are initialized within the file handler
- the connector checks the connection and executes all required steps to establish it
- the connector issues asynchronous requests to the given domain
- once a single request is fulfilled, the retrieved content is put into the queue
- workers keep listening for items in the queue
- once the queue is not empty, a worker takes an item and starts processing it
- once all requests are fulfilled, the queue closes and signals the workers to shut down after they finish their last job
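The flow above is a classic producer/consumer pattern. A minimal, self-contained sketch of the idea (not SFrout's actual code; all names here are illustrative) could look like this:

```python
import queue
import threading

SENTINEL = None  # item that tells a worker to shut down

def worker(q: queue.Queue, results: list) -> None:
    # Keep listening for items; exit when the sentinel arrives.
    while True:
        item = q.get()
        if item is SENTINEL:
            q.task_done()
            break
        results.append(f"saved:{item}")  # stand-in for writing a report to disk
        q.task_done()

q: queue.Queue = queue.Queue()
results: list = []
workers = [threading.Thread(target=worker, args=(q, results)) for _ in range(2)]
for t in workers:
    t.start()

# Producer side: each fulfilled request puts its content into the queue...
for report in ["report_a", "report_b", "report_c"]:
    q.put(report)

# ...and once all requests are done, one sentinel per worker shuts them down.
for _ in workers:
    q.put(SENTINEL)
for t in workers:
    t.join()

print(sorted(results))  # all three reports were processed
```

The sentinel-per-worker trick mirrors the "queue sends a shutdown signal" step: workers that are mid-job finish their current item first, then exit.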
SFDC Connector
Authentication:
Authorization and authentication in SFDC are based on a `sid` cookie entry for SSO, or on security tokens in other cases. SFrout will try to connect to the CookieJar of your MS Edge browser and find a `sid` entry for the given domain. If the `sid` is not found, the app will open MS Edge and request the given SFDC domain; you will be asked to log in as usual. After 20 seconds the program will retry to find the `sid` in your CookieJar. Browsers usually store cookies in a SQLite db, and this information is not transferred to the db immediately; the transfer can be triggered by closing the browser, but that isn't the most elegant solution. Instead, SFrout will ask for the `sid` at 2-second intervals until the `sid` becomes available. The entire process will repeat as many times as it takes.
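The retry behaviour described above can be sketched as a simple polling loop. Everything here is illustrative: `find_sid_in_cookiejar` is a hypothetical stand-in for reading Edge's SQLite cookie store, and the 20 s / 2 s timings come straight from the description above:

```python
import time
import webbrowser
from typing import Callable, Optional

def obtain_sid(domain: str,
               find_sid_in_cookiejar: Callable[[str], Optional[str]],
               initial_wait: float = 20.0,
               poll_interval: float = 2.0) -> str:
    """Poll the browser's cookie store until a `sid` for `domain` appears."""
    sid = find_sid_in_cookiejar(domain)
    if sid is not None:
        return sid
    # No sid yet: open the SFDC domain so the user can log in via SSO.
    webbrowser.open(domain)
    time.sleep(initial_wait)  # give the user time to complete the login
    while (sid := find_sid_in_cookiejar(domain)) is None:
        time.sleep(poll_interval)  # cookies reach the SQLite db with a delay
    return sid
```

Injecting the cookie lookup as a callable keeps the loop testable without a real browser.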
Sending requests:
SFDC supports export GET requests: you add `?export=csv&enc=UTF-8&isdtp=p1` to the address of your report, supplemented with headers and the `sid` entry above. In response you will receive a CSV-like data stream. The time window for the entire operation is fixed at 15 minutes; if you cannot receive the response within this narrow window, the connection is forcibly shut down and the request cancelled regardless of its stage.
Requests are sent out asynchronously to speed things up and keep memory consumption to a bare minimum. If a request fails, regardless of the cause, SFrout will retry; the limit of attempts is set to 20. Once a request succeeds, the response is saved in a Report object and put on the queue for further processing.
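As a concrete illustration of the export trick (the helper function below is mine, not part of SFrout's API):

```python
def export_url(report_url: str) -> str:
    """Append SFDC's CSV-export query string to a report URL."""
    return report_url.rstrip("/") + "?export=csv&enc=UTF-8&isdtp=p1"

url = export_url("https://corp.my.salesforce.com/00O1234567890AB")
# The request itself would carry the `sid` cookie, e.g. with the requests
# library: requests.get(url, cookies={"sid": sid}, timeout=15 * 60)
```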
File handler
Thread based solution for saving request responses to a file.
File handler spawns workers in separate threads. The number of workers equals half of the available threads on your machine (e.g. if your CPU has 6 cores and 12 threads, SFrout will spawn 6 workers). If information about available resources is not reachable, it defaults to 2. This approach will not dramatically slow down other applications on your computer while still securing the required resources for SFrout. Each worker observes the `Queue`; when something is put into the `Queue`, one of the workers starts processing the report. Bear in mind that each saving operation erases the response and content of the report to limit memory consumption. The `Queue` size is unlimited, so sooner or later the workers will handle the entire workload. Workers die once the `Queue` signals that they shouldn't expect any new items; workers that are still processing items finish their jobs and exit quietly.
All files are processed by `Pandas`, which offers a wide palette of available formats. Unfortunately, such flexibility comes at a price: in its current shape `Pandas` isn't the best at saving `csv` files due to the `numpy` engine. On top of that, `Pandas` is a relatively heavy library. I plan to switch to some other processing engine in the future.
Limitations
- Caution! SFrout deletes the last 5 lines from each response, because SFDC adds a footer to each data stream. This might be organization-specific and requires your attention if you plan to use the app in your organization.
- by default, the number of workers equals half of the available threads of the machine
- by default, the rotating file log is limited to 3 parts, up to 1_000_000 bytes each
- the progress bar is based on the quantity of items and may show an incorrect ETA if report sizes vary significantly
- currently only the save-to-`csv` method is available
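The footer removal mentioned in the first limitation amounts to dropping the trailing lines of the CSV stream. A minimal sketch, assuming the fixed 5-line footer described above (the exact cut-off may differ in your organization):

```python
def strip_sfdc_footer(csv_text: str, footer_lines: int = 5) -> str:
    """Drop the trailing footer lines SFDC appends to an exported report."""
    lines = csv_text.splitlines()
    return "\n".join(lines[:-footer_lines] if footer_lines else lines)

body = "a,b\n1,2\n3,4\nf1\nf2\nf3\nf4\nf5"
print(strip_sfdc_footer(body))  # keeps the header row and the two data rows
```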
Benchmarks
My testing set consists of 33 reports from various universes of SFDC, ranging in size from 200 kB to 200 MB, 1.4 GB of data in total.
Tests were not bounded by network bandwidth. Tests were run on an i7-8850H, 32 GB DDR4, Windows 10 x64.
Processing of the testing set varies between 3 and 8 minutes; results correlate strongly with SFDC performance at the given time. Processing time is correlated with the size of the reports: bigger = longer.
Final remarks
This app has been created based on the environment of my organization. There is an alternative way of authenticating to SFDC based on a security token; unfortunately, this option was blocked in my organization and only SSO is available.
Documentation is available on Read the Docs.
Release Notes
License