
A utility to generate ML features from YAML

Project description

feature store utils

A lightweight package that lets you express ML features in simple YAML, build a training dataset, and then write the features to a feature store.
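
As a rough illustration, the sketch below shows what a feature definition in YAML might look like and how it could be loaded in Python. This is a hypothetical example only: the key names (table, entity_key, timestamp, aggregation, window) are assumptions for illustration, not the package's actual schema; see the demo notebook for the real format.

import yaml  # PyYAML

# Hypothetical feature definition; keys are illustrative assumptions, not the real schema.
features_yaml = """
features:
  num_support_tickets_6m:
    table: support_tickets      # source fact table
    entity_key: customer_id     # join key for the training dataset
    timestamp: created_at       # event-time column
    aggregation: count          # how to aggregate
    window: 6 months            # look-back window from the observation date
"""

config = yaml.safe_load(features_yaml)
print(config["features"]["num_support_tickets_6m"]["aggregation"])  # -> count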

Some general thoughts on building a training dataset

https://docs.google.com/presentation/d/1tVkrwCLVwFp8cZC7CmAHSNFhsJrcTdC20MlZfptkSBE/edit?usp=sharing

Options for use

  1. Clone this repo, create features.yaml, and follow the demo notebook. Do not check your changes back in.
  2. Create your own repo and install this as a package (currently on TestPyPI). See https://github.com/BenMacKenzie/churn_model_demo as an example. Note that you must create a .env file in the folder that contains the features.yaml file.

Notes

  1. The current version is experimental. It is not clear that Jinja is the right way to write parameterized SQL; it might be better to do this in Python (see the sketch after this list).
  2. The current version is not optimized. Each feature is calculated individually, whereas if the table, filters, and time window are identical, multiple aggregation features could be calculated simultaneously.
  3. I believe there are around a dozen standard feature types; the most common have been implemented. Note that views can fill in many of these gaps when you encounter them. Still missing:
  • Type 1 lookup.
  • 1st-order aggregations over a time series (e.g., just treat it like a fact table).
  • 2nd-order aggregations over a time series, e.g., max monthly job DBU over a 6-month window.
  • Time in state, e.g., how long a ticket was open, based on a type 2 table.
  • Time to event in a fact table, e.g., time since the last call to customer support.
  • Scalar functions of two or more features, e.g., time in days between two dates.
  • Number of state changes over an interval (rare).
  • Functions of features (e.g., ratio of growth in job DBU to interactive DBU). Arguably this is not needed for boosted trees. It might be useful for neural nets, but why use a neural net on heterogeneous data? (This kind of feature can, however, be good for model explainability.)
  4. Need to illustrate adding features from a related dimension table (using a foreign key; the machinery is in place to do so).
  5. The current version illustrates creating a pipeline that uses the API. It would be nicer to simply generate the code and write it to a notebook so that the package is invisible in production (like bamboolib).
  6. The demo repo (https://github.com/BenMacKenzie/churn_model_demo) illustrates 'hyper-features', which are features with variable parameters.
  7. Connecting 'hyper-features' to the feature store needs to be worked out. Currently the options are to add all of them or to specify individual versions by their (generated) names.
  8. Fix feature store feature-generation observation dates: align them with the grain of the feature, e.g., if the grain is monthly, make sure the feature store contains an observation on the first of each month.
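
To make notes 1 and 3 concrete, here is a minimal sketch of a Jinja-templated parameterized SQL query that computes the second-order aggregation example above (max monthly job DBU over a 6-month window). The table and column names are assumptions for illustration, and this is not the package's actual template.

from jinja2 import Template

# Sketch only: table/column names are assumptions, not the package's actual templates.
template = Template("""
SELECT account_id,
       MAX(monthly_dbu) AS max_monthly_job_dbu_{{ months }}m
FROM (
    SELECT account_id,
           date_trunc('month', usage_date) AS usage_month,
           SUM(dbu) AS monthly_dbu
    FROM {{ table }}
    WHERE usage_date >= add_months(DATE '{{ observation_date }}', -{{ months }})
      AND usage_date <  DATE '{{ observation_date }}'
    GROUP BY account_id, date_trunc('month', usage_date)
) AS monthly
GROUP BY account_id
""")

sql = template.render(table="job_usage", observation_date="2024-01-01", months=6)
print(sql)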

Building

python3 -m build
python3 -m twine upload --repository testpypi dist/*

To upload to PyPI instead of TestPyPI:

python3 -m twine upload dist/*

Running unit tests on Databricks

  1. Install the Databricks extension for VS Code.
  2. Use this repo as a template, noting the following files:
  • remote_test_harness/pytest_databricks.py
  • .vscode/launch.json
  3. Write tests as usual (see tests/time_series/time_series_test.py as an example; a minimal sketch follows).
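
As a rough sketch of step 3, a test might look like the following. The Spark fixture and assertions are illustrative assumptions, not the repo's actual tests; see tests/time_series/time_series_test.py for the real structure.

import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

@pytest.fixture(scope="session")
def spark():
    # On Databricks (via the VS Code extension) an active session already exists;
    # getOrCreate() simply returns it.
    return SparkSession.builder.getOrCreate()

def test_sum_aggregation(spark):
    # Tiny in-memory frame standing in for a source fact table.
    df = spark.createDataFrame(
        [("a", 1), ("a", 2), ("b", 5)],
        ["entity_id", "value"],
    )
    result = df.groupBy("entity_id").agg(F.sum("value").alias("total"))
    totals = {row["entity_id"]: row["total"] for row in result.collect()}
    assert totals == {"a": 3, "b": 5}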

