
Project description

Overview

The Snowplow Signals SDK is the interface between the Feature Store (Personalisation API), the frontend JS plugin and the customer's own agentic implementation. It will extract features from the online store, and eventually handle the creation/update of features.

JS Plugin Integration

The first step is to integrate the SDK with the already prototyped JS plugin. The JS plugin sets an `sp_signals` cookie containing key-value pairs as below:

{
    "session_count": 3,
    "last_page_visited": "acme.com/about",
    "last_product_viewed": "Nike Red Shoes",
    "first_product_viewed": "Adidas Black Shoes",
    "last_product_added_to_cart": "Green Socks"
}

The Python SDK should have a mechanism to accept a cookie in this format and expose its values for interpolation into a "feature" string, e.g.:

features_from_cookie = SignalsAI.get_features_from_cookie(req.cookies)
last_visited_page = features_from_cookie.get_feature("last_page_visited")

last_visited_page_prompt = f"The last page the visitor visited was {last_visited_page}"
# The last page the visitor visited was acme.com/about
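A minimal sketch of how such a cookie helper might work, assuming the `sp_signals` cookie value is a URL-encoded JSON object (the helper name and the encoding are illustrative assumptions, not the SDK's actual API):

```python
import json
from urllib.parse import unquote


def get_features_from_cookie(cookies: dict) -> dict:
    """Parse the sp_signals cookie into a plain feature dict.

    Assumes the cookie value is a URL-encoded JSON object; the real
    plugin may serialise the cookie differently.
    """
    raw = cookies.get("sp_signals", "")
    if not raw:
        return {}
    return json.loads(unquote(raw))


# Example: a request whose sp_signals cookie decodes to
# {"last_page_visited": "acme.com/about"}
features = get_features_from_cookie(
    {"sp_signals": "%7B%22last_page_visited%22%3A%20%22acme.com/about%22%7D"}
)
print(features["last_page_visited"])  # acme.com/about
```

Returning a plain dict keeps the helper framework-agnostic: any web framework that exposes request cookies as a mapping can feed it directly.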

Feature Retrieval

The SDK should be able to retrieve a single feature from the online feature store.

An example using the Feast Python SDK is below. Pass a list of dictionaries to define which entities to retrieve, and a list of `feature_view:feature_name` strings. The result can be converted to a dictionary for access in your app.

from feast import FeatureStore

# Initialize the feature store
store = FeatureStore(repo_path="path_to_your_repo")

# Specify the entity and features you want to retrieve
entity_rows = [{"entity_id": 1001}]
features = ["feature_view:feature_name"]

# Retrieve the features
feature_data = store.get_online_features(features=features, entity_rows=entity_rows).to_dict()
print(feature_data)
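For reference, the dictionary returned by `to_dict()` maps each requested feature name (plus the entity key) to a list of values aligned with `entity_rows`. A small illustration with a made-up feature value:

```python
# Illustrative shape of the to_dict() result for one entity row;
# the feature value here is made up for the example.
feature_data = {
    "entity_id": [1001],
    "feature_name": [0.42],
}


def feature_for_entity(data: dict, entity_key: str, entity_id, feature: str):
    """Look up a feature value for a given entity id in the result dict."""
    idx = data[entity_key].index(entity_id)
    return data[feature][idx]


print(feature_for_entity(feature_data, "entity_id", 1001, "feature_name"))  # 0.42
```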

Feature Store Integration

As per the feature store Spike

Release Process

To make a new release, follow these steps:

  1. Prepare the changelog: Create a commit (e.g., "Prepare for release") that updates the CHANGELOG.md with all notable changes for the new version.
  2. Create a release PR: Open a pull request to the main branch with your changelog and any other release-related changes.
  3. Merge the PR: Merge the release PR using a merge commit. Do not use squash or rebase.
  4. Run the Release workflow: Trigger the "Release" workflow in GitHub Actions to publish the new version to PyPI.

This process ensures a clear release history and proper automation of package publishing.

Download files

Download the file for your platform.

Source Distribution

snowplow_signals-0.0.3.tar.gz (16.4 kB)

Uploaded Source

Built Distribution


snowplow_signals-0.0.3-py3-none-any.whl (21.0 kB)

Uploaded Python 3

File details

Details for the file snowplow_signals-0.0.3.tar.gz.

File metadata

  • Download URL: snowplow_signals-0.0.3.tar.gz
  • Upload date:
  • Size: 16.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.2 CPython/3.11.11 Linux/6.8.0-1021-azure

File hashes

Hashes for snowplow_signals-0.0.3.tar.gz
  • SHA256: adc5e4e81cf6ba2a127e965dbe9a11b2fc8e56e9cb45ba725ccb4d6e82235466
  • MD5: 880cb0f3204de28a0a49d37f4268f481
  • BLAKE2b-256: 93b758d9b985f58bd3e8a7564e0e05530acb4af2dc77322618a1fc15ad8206e3


File details

Details for the file snowplow_signals-0.0.3-py3-none-any.whl.

File metadata

  • Download URL: snowplow_signals-0.0.3-py3-none-any.whl
  • Upload date:
  • Size: 21.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.2 CPython/3.11.11 Linux/6.8.0-1021-azure

File hashes

Hashes for snowplow_signals-0.0.3-py3-none-any.whl
  • SHA256: 73e29c1c5a1ada3326d3200f06058104c9bdb9d78a98346a754d40bb95306eb5
  • MD5: a28d059ae689f2672567dcb500093571
  • BLAKE2b-256: 715b3a4c722b27843c00b39df6834008c9448f408e3670fd352ef7fe202ec559

