
FastTSFeatures

Compute static or temporal time-series features at scale.

Install

pip install fasttsfeatures

How to use

1. Request free trial

Request a free trial by sending an email to fede.garza.ramirez@gmail.com.

2. Required information

To use fasttsfeatures you need:

  • An AWS URL provided by Nixtla. You'll upload your dataset there.
  • A user name and a password to log in to that URL.
  • An API key to interact with the scalable API.
  • An API ID to interact with the scalable API.
  • A pair of AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY credentials.
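The snippets below read these credentials from environment variables. One minimal way to make them available from Python (the values here are placeholders, not real credentials):

```python
import os

# Placeholder values -- replace with the credentials provided by Nixtla.
os.environ["API_ID"] = "your-api-id"
os.environ["API_KEY"] = "your-api-key"
os.environ["AWS_ACCESS_KEY_ID"] = "your-aws-access-key-id"
os.environ["AWS_SECRET_ACCESS_KEY"] = "your-aws-secret-access-key"
```

You can equally export the same variables in your shell before starting Python.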

3.1 Case 1: Upload to S3 from Python

  • Import the library.
from fasttsfeatures.core import TSFeatures
  • Instantiate TSFeatures with your api_id, api_key, AWS_ACCESS_KEY_ID, and AWS_SECRET_ACCESS_KEY.
import os

tsfeatures = TSFeatures(api_id=os.environ['API_ID'],
                        api_key=os.environ['API_KEY'],
                        aws_access_key_id=os.environ['AWS_ACCESS_KEY_ID'],
                        aws_secret_access_key=os.environ['AWS_SECRET_ACCESS_KEY'])
  • Upload your local file, passing its path and the bucket's name (provided by Nixtla).
s3_uri = tsfeatures.upload_to_s3('../train.csv', 'nixtla-user-test')
  • Run the process, passing the S3 URI.
response = tsfeatures.calculate_features_from_s3_uri(s3_uri=s3_uri,
                                                     freq=7)
display_df(response)
   status  body                                           id                                    message
0  200     "s3://nixtla-user-test/features/features.csv"  f7bdb6dc-dcdb-4d87-87e8-b5428e4c98db  Check job status at GET /tsfeatures/jobs/{job_id}
  • Monitor the process with the following code. Once it's done, access your bucket to download the generated features.
job_id = response['id'].item()
display(tsfeatures.get_status(job_id))
   status      processing_time_seconds
0  InProgress  3
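Instead of re-running get_status by hand, you can poll until the job leaves the InProgress state. A minimal sketch; the helper below is hypothetical (not part of fasttsfeatures) and takes any zero-argument callable that returns the current status string, e.g. lambda: tsfeatures.get_status(job_id)['status'].item():

```python
import time

def wait_for_job(fetch_status, poll_seconds=10, timeout_seconds=600):
    """Call fetch_status() until it returns something other than 'InProgress'.

    fetch_status: zero-argument callable returning the current status string,
    e.g. (hypothetically) lambda: tsfeatures.get_status(job_id)['status'].item()
    The terminal status names depend on the API; check its actual responses.
    """
    waited = 0
    while True:
        status = fetch_status()
        if status != "InProgress":
            # Job finished (successfully or not); return the final status.
            return status
        if waited >= timeout_seconds:
            raise TimeoutError(f"job still in progress after {timeout_seconds}s")
        time.sleep(poll_seconds)
        waited += poll_seconds
```

This keeps the polling logic independent of the client object, so it works the same for both cases below.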

3.2 Case 2: Upload to S3 Manually

A. Upload your dataset

  • Access the URL provided by Nixtla. You'll see a login page like the following. Enter your user name and password.
  • Next you'll see the bucket where you can upload your dataset:
  • Upload your dataset and copy its S3 URI.

B. Run the process

  • Import the library.
from fasttsfeatures.core import TSFeatures
  • Instantiate TSFeatures with your api_id and api_key.
import os

tsfeatures = TSFeatures(api_id=os.environ['API_ID'],
                        api_key=os.environ['API_KEY'])
  • Run the process, passing the S3 URI you copied earlier.
response = tsfeatures.calculate_features_from_s3_uri(s3_uri='s3://tsfeatures-api-public/train.csv',
                                                     freq=7)
display_df(response)
   status  body                                                id                                    message
0  200     "s3://tsfeatures-api-public/features/features.csv"  740a410a-d138-41b4-8373-581710f020f8  Check job status at GET /tsfeatures/jobs/{job_id}
  • Monitor the process with the following code. Once it's done, access your bucket to download the generated features.
job_id = response['id'].item()
display(tsfeatures.get_status(job_id))
   status      processing_time_seconds
0  InProgress  20
