Feast Hive Support

Hive support for the Feast offline store.
Hive is not included in the current Feast roadmap; this project adds Hive support for the Offline Store.
For more details, see this Feast issue.
Important

The first stable version (v0.1.0) has been published. Please create an issue if you run into any problems.
Todo & Progress
- [DONE] Release the first workable version.
- It currently only supports `insert into` for uploading the entity_df, which is a little inefficient. The next version will add extra parameters for users who can provide a WebHDFS address.
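Uploading the entity_df through `insert into` means every row is serialized into SQL literals and shipped through the query interface, which is why it gets slow for large frames. A rough, illustrative sketch of that approach (not feast-hive's actual code; the table and column names here are hypothetical):

```python
# Illustrative sketch of uploading an entity_df via INSERT INTO.
# NOT feast-hive's actual implementation; names are hypothetical.

entity_rows = [
    (1001, "2021-04-12 10:59:42"),
    (1002, "2021-04-12 08:12:10"),
]

def to_insert_sql(table, columns, rows):
    """Render every row as a literal tuple inside one INSERT statement.

    The whole frame travels as SQL text through the query interface,
    which is fine for small entity_dfs but slow compared to a bulk
    file upload (e.g. via WebHDFS).
    """
    values = ", ".join(
        "(" + ", ".join(repr(v) for v in row) + ")" for row in rows
    )
    return f"INSERT INTO {table} ({', '.join(columns)}) VALUES {values}"

sql = to_insert_sql("entity_df_tmp", ["driver_id", "event_timestamp"], entity_rows)
print(sql)
```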
Quickstart
Install feast
pip install feast
Install feast-hive
- Install the stable version:
pip install feast-hive
- Install the development version (not stable):
pip install git+https://github.com/baineng/feast-hive.git
Create a feature repository
feast init feature_repo
cd feature_repo
Edit feature_store.yaml

Set the `offline_store` type to `feast_hive.HiveOfflineStore`:
project: ...
registry: ...
provider: local
offline_store:
    type: feast_hive.HiveOfflineStore
    host: localhost
    port: 10000        # default is `10000`
    database: default  # default is `default`
    ...                # other parameters
online_store:
    ...
Create Hive Table
- Upload data/driver_stats.parquet to HDFS:
hdfs dfs -copyFromLocal ./data/driver_stats.parquet /tmp/
- Create the Hive table:
CREATE TABLE driver_stats (
event_timestamp bigint,
driver_id bigint,
conv_rate float,
acc_rate float,
avg_daily_trips int,
created bigint
)
STORED AS PARQUET;
- Load the data into the table:
LOAD DATA INPATH '/tmp/driver_stats.parquet' INTO TABLE driver_stats;
Edit example.py

# This is an example feature definition file
from google.protobuf.duration_pb2 import Duration

from feast import Entity, Feature, FeatureView, ValueType
from feast_hive import HiveSource

# Read data from the Hive table.
# Here we use a query to reuse the original parquet data,
# but you can replace it with your own table or query.
driver_hourly_stats = HiveSource(
    # table='driver_stats',
    query="""
    SELECT from_unixtime(cast(event_timestamp / 1000000 as bigint)) AS event_timestamp,
           driver_id, conv_rate, acc_rate, avg_daily_trips,
           from_unixtime(cast(created / 1000000 as bigint)) AS created
    FROM driver_stats
    """,
    event_timestamp_column="event_timestamp",
    created_timestamp_column="created",
)

# Define an entity for the driver.
driver = Entity(name="driver_id", value_type=ValueType.INT64, description="driver id")

# Define a FeatureView.
driver_hourly_stats_view = FeatureView(
    name="driver_hourly_stats",
    entities=["driver_id"],
    ttl=Duration(seconds=86400 * 1),
    features=[
        Feature(name="conv_rate", dtype=ValueType.FLOAT),
        Feature(name="acc_rate", dtype=ValueType.FLOAT),
        Feature(name="avg_daily_trips", dtype=ValueType.INT64),
    ],
    online=True,
    input=driver_hourly_stats,
    tags={},
)
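The query in the source above turns the parquet bigint timestamps (microseconds since the Unix epoch) into proper timestamps with `from_unixtime(cast(... / 1000000 as bigint))`. For reference, the same conversion in plain Python (the sample value is just an assumed microsecond-epoch timestamp, not taken from the dataset):

```python
from datetime import datetime, timezone

# driver_stats.parquet stores timestamps as int64 microseconds since
# the Unix epoch; Hive's from_unixtime() expects whole seconds, hence
# the `/ 1000000` in the query.
event_timestamp_us = 1618225182000000  # assumed sample value

seconds = event_timestamp_us // 1_000_000  # cast(... / 1000000 as bigint)
dt = datetime.fromtimestamp(seconds, tz=timezone.utc)
print(dt.isoformat())  # 2021-04-12T10:59:42+00:00
```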
Apply the feature definitions
feast apply
Generating training data and beyond

The remaining steps are the same as in the Feast Quickstart.
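In the Quickstart, generating training data means building an entity DataFrame with entity keys and event timestamps, then requesting point-in-time-correct features from the store. A minimal sketch, with the feast call commented out since it needs a reachable Hive server (only the pandas part runs as-is; note the feature-list argument was named `feature_refs` in Feast versions of this era):

```python
from datetime import datetime
import pandas as pd

# Entity rows: which driver, and "as of" which point in time.
entity_df = pd.DataFrame(
    {
        "driver_id": [1001, 1002, 1003],
        "event_timestamp": [
            datetime(2021, 4, 12, 10, 59, 42),
            datetime(2021, 4, 12, 8, 12, 10),
            datetime(2021, 4, 12, 16, 40, 26),
        ],
    }
)

# With a reachable Hive server, the Quickstart call looks like:
# from feast import FeatureStore
# store = FeatureStore(repo_path=".")
# training_df = store.get_historical_features(
#     entity_df=entity_df,
#     feature_refs=[
#         "driver_hourly_stats:conv_rate",
#         "driver_hourly_stats:acc_rate",
#         "driver_hourly_stats:avg_daily_trips",
#     ],
# ).to_df()

print(entity_df.shape)  # (3, 2)
```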
Developing and Testing
Developing
git clone https://github.com/baineng/feast-hive.git
cd feast-hive
# creating virtual env ...
pip install -e .[dev]
# before committing
make format
make lint
Testing
pip install -e .[test]
pytest -n 6 --host=localhost --port=10000 --database=default
Download files
Source distribution: feast-hive-0.1.1.tar.gz (22.0 kB)
Built distribution: feast_hive-0.1.1-py3-none-any.whl (20.2 kB)
Hashes for feast_hive-0.1.1-py3-none-any.whl

Algorithm   | Hash digest
------------|------------
SHA256      | 9d4be25ebac030ab1a6f5792370138561becfb18f81ee6cfb393056637902fe9
MD5         | aea7e5a7c4c738ba0304534b4743831e
BLAKE2b-256 | 76865866f65857f541aa9d48a7ea8c353db3d49ce22300e0073321a59e9d0989