Logos Software Development Kit
The Logos Software Development Kit (Logos SDK) is a private library encapsulating functionality shared by Logos Scripts: configurable scripts that run as Cloud Functions, are triggered by Cloud Scheduler, and are controlled and monitored via the Logos UI as part of the Logos Ecosystem.
Functionality modules
Logging
Cloud Function Scripts within the Logos ecosystem have special requirements for logging. Apart from simple JSON messages, the logs also need to carry information on:
- trace (used for tracking runs of Cloud Functions)
- labels (fields used for internal Logos filtering)
- logger name
The standard Python `logging` library does not offer the possibility of capturing a combination of these values in a format parsable by Google Cloud Logging; this is only possible through the `google-cloud-logging` library. On the other hand, `google-cloud-logging` transmits all logs to Google Cloud Logging even when a script is run on the local machine (with the exception of running it within the `functions-framework` environment), which is something to avoid when developing locally or testing on Bitbucket, as we do not want to waste our resources.
This module serves as a wrapper over the standard Python `logging` and `google-cloud-logging` libraries, with the default being `google-cloud-logging`. The switch is based on the presence of the environment variables `DEVELOPMENT` and `TESTING`: if either of these is set within the venv environment, the `logging` library is used instead.
When deploying to a development instance in the Cloud, we want the logs to be logged into the Cloud, but not to clash with the production instance; setting the environment variable `CLOUD_DEVELOPMENT` therefore ensures that the logs are logged under the name `logos-logging-development` instead of the standard `logos-logging`.
```bash
export DEVELOPMENT=True
# or
export TESTING=True
# or
export CLOUD_DEVELOPMENT=True
```
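A minimal sketch of how such a switch can be implemented with `os.environ` (the function names here are illustrative, not the SDK's actual internals):

```python
import os


def use_cloud_logging() -> bool:
    # DEVELOPMENT or TESTING set => fall back to the standard logging library,
    # so local runs and Bitbucket tests do not ship logs to the Cloud.
    return not (os.getenv("DEVELOPMENT") or os.getenv("TESTING"))


def log_name() -> str:
    # CLOUD_DEVELOPMENT keeps cloud-development logs under a separate name,
    # so they never clash with the production instance.
    if os.getenv("CLOUD_DEVELOPMENT"):
        return "logos-logging-development"
    return "logos-logging"
```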
Usage
To set up the logger, one can use the parsing function `setup_from_request`, which expects a Cloud Function Script trigger request in the usual format:
```python
import functions_framework
import logos_sdk.logging as logos_logging


@functions_framework.http
def run(request):
    logger, labels, settings, secrets = logos_logging.setup_from_request(request, "logos-logging")
```
```python
EXPECTED_REQUEST_JSON_BODY = {
    "settings": {},
    "id": "",
    "author": "",
    "script": "",
    "client": "",
    "accesses": [
        {
            "account": {"id": "", "account_platform_id": ""},
            "secret": {"id": "", "name": ""},
            "platform": {"id": "", "short_name": ""},
        },
        ...
    ],
}
```
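For illustration, the fields `setup_from_request` works with can be pulled apart along these lines (a hypothetical helper, not the SDK's actual code):

```python
def parse_trigger_body(body):
    # Illustrative sketch: split the trigger body into the script settings
    # and the label fields that later appear on every log entry.
    labels = {
        "id": body["id"],
        "author": body["author"],
        "client": body["client"],
        "script": body["script"],
    }
    return body["settings"], labels
```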
The trigger request can also be processed separately, in which case the logger is instantiated from the class itself:

```python
LogosLogger(name="logos-logging", labels={}, trace="")
```
Output
When the standard Python `logging` library is used, the output logs have the following structure:

Lowest possible severity: INFO

String output: `SEVERITY:logger-name:message`

LogEntry output:

```python
LogEntry.name = "logos-logging"
LogEntry.msg = {"message": ""}
LogEntry.level = 40 | ...
LogEntry.levelname = INFO | ERROR | ...
LogEntry.json_fields = {
    "logging.googleapis.com/trace": trace,
    "logging.googleapis.com/labels": {
        "log_type": RESULT | NOTIFICATION | DEBUG,
        **labels,
    },
}
```
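The `json_fields` attribute above is carried by the standard library's `extra` mechanism, which attaches arbitrary attributes to a `LogRecord`. A self-contained sketch (the handler and field values are illustrative):

```python
import logging


class ListHandler(logging.Handler):
    """Collects emitted records so their attributes can be inspected."""

    def __init__(self):
        super().__init__()
        self.records = []

    def emit(self, record):
        self.records.append(record)


logger = logging.getLogger("logos-logging")
logger.setLevel(logging.INFO)
handler = ListHandler()
logger.addHandler(handler)

# Anything passed via `extra` becomes an attribute of the LogRecord;
# this is how trace and labels end up on LogEntry.json_fields.
logger.info("job finished", extra={
    "json_fields": {
        "logging.googleapis.com/trace": "projects/logos-382010/traces/some-trace",
        "logging.googleapis.com/labels": {"log_type": "RESULT"},
    },
})

record = handler.records[0]
```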
With `google-cloud-logging`, the output logs should have the following structure:

Lowest possible severity: INFO

```
{
    "logName": "projects/logos-382010/logs/logos-logging",
    "trace": "projects/logos-382010/traces/some-trace",
    "severity": INFO | ERROR | ...,
    "jsonPayload": { "message": "", ... },
    "labels": {
        "log_type": RESULT | NOTIFICATION | DEBUG,
        "id": "Logos id of the job",
        "author": "Logos id of the job author",
        "client": "Logos id of the client",
        "script": "Logos id of the script",
        **platform_accounts,
    }
}
```
Depending on the `platform_accounts` the Cloud Function Script accesses during its run, `labels` might also contain the Logos platform `short_name` field as a key and the Logos `account_platform_id` as a value, for example:

```
{
    "merchant-center": "xxxxxxx",
    "google-ads": "xxx-xxx-xxxx",
}
```
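Given the `accesses` list from the trigger request shown earlier, these labels can be derived with a simple mapping (the helper name is hypothetical):

```python
def platform_account_labels(accesses):
    # Map each platform's short_name to the corresponding account's
    # platform-specific id.
    return {
        access["platform"]["short_name"]: access["account"]["account_platform_id"]
        for access in accesses
    }


labels = platform_account_labels([
    {
        "account": {"id": "1", "account_platform_id": "xxx-xxx-xxxx"},
        "secret": {"id": "2", "name": "google-ads-secret"},
        "platform": {"id": "3", "short_name": "google-ads"},
    },
])
```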
To sum it up, an example of the complete `labels` contents would be:

```
{
    "labels": {
        "log_type": "result",
        "id": "0",
        "author": "0",
        "client": "0",
        "script": "0",
        "merchant-center": "0000000",
        "google-ads": "000-000-0000"
    }
}
```
Services
This module serves as a wrapper over communication with Logos Services. When the `DEVELOPMENT`, `TESTING`, or `CLOUD_DEVELOPMENT` environment variable is set,

```bash
export DEVELOPMENT=True
# or
export TESTING=True
# or
export CLOUD_DEVELOPMENT=True
```

the URLs of the development services (bound to the development branches in BitBucket) are used. If none of these is set, the production URLs are called (as in the Cloud production environment).
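The URL switch can be sketched as follows (the hosts here are placeholders, not the real service endpoints):

```python
import os

DEVELOPMENT_FLAGS = ("DEVELOPMENT", "TESTING", "CLOUD_DEVELOPMENT")


def is_development() -> bool:
    # Any one of the three flags routes calls to the development services.
    return any(os.getenv(flag) for flag in DEVELOPMENT_FLAGS)


def service_url(service: str) -> str:
    # Placeholder hosts; the real URLs are bound to the BitBucket branches.
    host = "https://dev.services.example" if is_development() else "https://services.example"
    return f"{host}/{service}"
```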
Installation as a dependency to Logos Scripts
Local environment
This library is hosted on PyPI.

1. Add the library to your `requirements.txt`, pinned to the desired version (or the latest):

```
logos-sdk==[desired version]
```

2. Then, in your terminal with the venv active, run the following:

```bash
pip3 install -r requirements.txt
export DEVELOPMENT=True
```

To verify that the library was successfully installed, you can view the installed package information via:

```bash
pip3 show -f logos-sdk
```
BitBucket pipelines environment
Firstly, read the local environment setting. If "been there, done that", the setting for BitBucket pipeline testing is very similar. First, go to the pipeline variables in the BitBucket UI and create a variable `GOOGLE_CREDENTIALS`, pasting the contents of the `logos-accessor` service account credentials file into it. Then create `PYPI_CREDENTIALS` and paste the following into the secret:

```ini
[pypi]
username = __token__
password = pypi-[api token]
```

You also need to set the `SERVICES_PATH` for each service.

Then your `bitbucket-pipelines.yml` testing step should look like this:
```yaml
pipelines:
  default:
    - step:
        name: Test
        caches:
          - pip
        script:
          - echo $GOOGLE_CREDENTIALS > logos-382010-ee0fd6995649.json
          - echo $PYPI_CREDENTIALS > .pypirc
          - export GOOGLE_APPLICATION_CREDENTIALS=logos-382010-ee0fd6995649.json
          - pip3 install -U setuptools wheel twine
          - python setup.py sdist bdist_wheel
          - twine upload --config-file .pypirc --verbose dist/*
```
Google Cloud environment
Firstly, read the local environment setting. If "been there, done that", your Dockerfile should contain the following steps:
```dockerfile
RUN apt-get update && apt-get install -y git
RUN pip3 install google-auth keyring keyrings.google-artifactregistry-auth
RUN pip3 install --no-cache-dir -r requirements.txt
```
In your `cloudbuild.yaml` file, the build step should contain the `--network=cloudbuild` parameter, as this ensures that the `keyring` auth libraries have access to the necessary credentials directly from the Cloud Build environment (we no longer need to set the `GOOGLE_APPLICATION_CREDENTIALS` environment variable):
```yaml
- name: 'gcr.io/cloud-builders/docker'
  args: [ 'build', '--network=cloudbuild', '-t', 'gcr.io/logos-382010/merchant-control', '.' ]
```
If you are deploying a development cloud instance, `CLOUD_DEVELOPMENT` needs to be set in the Cloud Run settings:

```bash
export CLOUD_DEVELOPMENT=True
```
Development & versioning
Development of new features, refactors, and debugging follows the branch naming convention:

- `feature/[short name]`
- `refactor/[short name]`
- `debug/[short name]`
Pull requests are directed to the current development branch, which always bears the number of the newest version, for example `development-0.0.2`. After adding a major feature, or a number of less significant refactors and hot-fixes, the current development branch is merged into master and deployed to the Google Artifact Registry. The old development branch is then deleted, a new branch with the following version (e.g. `development-0.0.3`) is created, and its first commit must include changing the version in the `setup.py` file.