timeseries_fastai
A time series library on top of fastai2
This repository implements time series classification/regression algorithms, making extensive use of fastai training methods.
Installation
In short, if you have anaconda, execute:
$ pip install timeseries_fastai
and you are good to go.
Time Series Classification from Scratch with Deep Neural Networks: A Strong Baseline
The original paper repository is here; it is implemented in Keras/TF.
- Notebook 01: This is a basic notebook that implements the Deep Learning models proposed in Time Series Classification from Scratch with Deep Neural Networks: A Strong Baseline.
InceptionTime: Finding AlexNet for Time Series Classification
The original paper repo is here
- Notebook 02: Added the InceptionTime architecture from InceptionTime: Finding AlexNet for Time Series Classification.
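The core idea of an InceptionTime block (parallel convolutions with several kernel sizes, concatenated along the channel axis) can be illustrated with a minimal NumPy sketch. The kernel sizes and filter counts below are illustrative only, not the paper's exact configuration:

```python
import numpy as np

def conv1d_same(x, kernel):
    """Cross-correlate a 1D signal with 'same' padding."""
    return np.convolve(x, kernel[::-1], mode="same")

def inception_block(x, kernel_sizes=(9, 19, 39), n_filters=2, seed=0):
    """Apply parallel convolutions with several kernel sizes and
    concatenate the resulting feature channels (random toy weights)."""
    rng = np.random.default_rng(seed)
    outputs = []
    for ks in kernel_sizes:
        for _ in range(n_filters):
            k = rng.standard_normal(ks)
            outputs.append(conv1d_same(x, k))
    return np.stack(outputs)  # shape: (len(kernel_sizes) * n_filters, len(x))

x = np.sin(np.linspace(0, 6.28, 128))
feats = inception_block(x)
print(feats.shape)  # (6, 128)
```

Because every branch keeps the input length, the branches can be stacked into one multi-channel feature map, which is what lets the real architecture chain several such blocks.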
Results
You can run the benchmark using:
$ python ucr.py --arch='inception' --tasks='all' --filename='inception.csv' --mixup=0.2
Default values:
- lr = 1e-3
- opt = 'ranger'
- epochs = 40
- fp16 = True
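A hypothetical reconstruction of how `ucr.py` could expose these options with `argparse` (the flag names follow the invocation above and the defaults follow the list; the actual script may differ):

```python
import argparse

# Hypothetical sketch of the ucr.py command-line interface.
parser = argparse.ArgumentParser(description="UCR benchmark runner (sketch)")
parser.add_argument("--arch", default="inception")
parser.add_argument("--tasks", default="all")
parser.add_argument("--filename", default="results.csv")
parser.add_argument("--mixup", type=float, default=0.0)
parser.add_argument("--lr", type=float, default=1e-3)
parser.add_argument("--opt", default="ranger")
parser.add_argument("--epochs", type=int, default=40)
# Note: type=bool would misparse "--fp16 False" (bool("False") is True),
# so a string-to-bool converter is used instead.
parser.add_argument("--fp16", default=True,
                    type=lambda s: s.lower() in ("1", "true"))

args = parser.parse_args(["--arch", "inception", "--mixup", "0.2"])
print(args.arch, args.epochs)  # inception 40
```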
import pandas as pd
from pathlib import Path
results_inception = pd.read_csv(Path.cwd().parent/'inception.csv', index_col=0)
results_inception.head(10)
task | acc | acc_max | train_loss | val_loss
---|---|---|---|---
Adiac | 0.83 | 0.83 | 1.54 | 1.31
ArrowHead | 0.84 | 0.89 | 0.47 | 0.60
Beef | 0.57 | 0.60 | 1.22 | 1.27
BeetleFly | 0.85 | 1.00 | 0.29 | 0.38
BirdChicken | 0.80 | 0.95 | 0.25 | 0.55
Car | 0.85 | 0.85 | 0.58 | 0.74
CBF | 0.99 | 1.00 | 0.44 | 0.37
ChlorineConcentration | 0.77 | 0.77 | 0.61 | 0.70
CinCECGTorso | 0.65 | 0.68 | 0.64 | 1.06
Coffee | 1.00 | 1.00 | 0.33 | 0.21
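With the per-task CSV loaded into pandas, summary statistics over the benchmark are one-liners. A small sketch with made-up numbers (the real file has one row per UCR task, indexed by `task`):

```python
import pandas as pd

# Toy stand-in for the benchmark CSV: one row per task, same columns.
results = pd.DataFrame(
    {"acc": [0.83, 0.84, 0.57], "acc_max": [0.83, 0.89, 0.60]},
    index=["Adiac", "ArrowHead", "Beef"],
)
results.index.name = "task"

mean_acc = results["acc"].mean()      # average accuracy across tasks
best_task = results["acc"].idxmax()   # task with the highest accuracy
print(f"mean acc {mean_acc:.3f}, best task {best_task}")
```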
Getting Started
from timeseries_fastai.imports import *
from timeseries_fastai.core import *
from timeseries_fastai.data import *
from timeseries_fastai.models import *
ucr_path = untar_data(URLs.UCR)
df_train, df_test = load_df_ucr(ucr_path, 'StarLightCurves')
Loading files from: /home/tc256760/.fastai/data/Univariate2018_arff/StarLightCurves
df = stack_train_valid(df_train, df_test)
x_cols = df.columns[0:-2].to_list()
df.head()
 | att1 | att2 | att3 | att4 | att5 | att6 | att7 | att8 | att9 | att10 | ... | att1017 | att1018 | att1019 | att1020 | att1021 | att1022 | att1023 | att1024 | target | valid_col
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
0 | 0.537303 | 0.531103 | 0.528503 | 0.529403 | 0.533603 | 0.540903 | 0.551103 | 0.564003 | 0.579603 | 0.597603 | ... | 0.545903 | 0.543903 | 0.541003 | 0.537203 | 0.532303 | 0.526403 | 0.519503 | 0.511403 | b'3' | False |
1 | 0.588398 | 0.593898 | 0.599098 | 0.604098 | 0.608798 | 0.613397 | 0.617797 | 0.622097 | 0.626097 | 0.630097 | ... | 0.246499 | 0.256199 | 0.266499 | 0.277399 | 0.288799 | 0.300899 | 0.313599 | 0.326899 | b'3' | False |
2 | -0.049900 | -0.041500 | -0.033400 | -0.025600 | -0.018100 | -0.010800 | -0.003800 | 0.003000 | 0.009600 | 0.015900 | ... | -0.161601 | -0.149201 | -0.136401 | -0.123201 | -0.109701 | -0.095901 | -0.081701 | -0.067100 | b'1' | False |
3 | 1.337005 | 1.319805 | 1.302905 | 1.286305 | 1.270005 | 1.254005 | 1.238304 | 1.223005 | 1.208104 | 1.193504 | ... | 1.298505 | 1.307705 | 1.316505 | 1.324905 | 1.332805 | 1.340205 | 1.347005 | 1.353205 | b'3' | False |
4 | 0.769801 | 0.775301 | 0.780401 | 0.785101 | 0.789401 | 0.793301 | 0.796801 | 0.799901 | 0.802601 | 0.805101 | ... | 0.744501 | 0.747301 | 0.750701 | 0.754801 | 0.759501 | 0.765001 | 0.771301 | 0.778401 | b'3' | False |
5 rows × 1026 columns
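The `x_cols = df.columns[0:-2]` slice above works because the stacked frame keeps `target` and `valid_col` as its last two columns. A toy frame with hypothetical column names shows the effect:

```python
import pandas as pd

# Toy frame mimicking the stacked UCR layout: feature columns first,
# then the label and the train/valid flag as the last two columns.
df = pd.DataFrame(
    [[0.1, 0.2, 0.3, b"1", False],
     [0.4, 0.5, 0.6, b"3", True]],
    columns=["att1", "att2", "att3", "target", "valid_col"],
)

x_cols = df.columns[0:-2].to_list()  # everything except the last two
print(x_cols)  # ['att1', 'att2', 'att3']
```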
dls = TSDataLoaders.from_df(df, x_cols=x_cols, label_col='target', valid_col='valid_col', bs=16)
dls.vocab
(#3) [b'1',b'2',b'3']
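Note that the labels come out of the UCR arff files as bytes (`b'1'`, `b'2'`, `b'3'`). fastai builds this vocabulary automatically, but the label-to-index mapping is easy to reproduce by hand:

```python
# Reproduce the label-to-index mapping a classification vocab implies.
labels = [b"3", b"3", b"1", b"3", b"3"]
vocab = sorted(set(labels))                     # [b'1', b'3'] for this toy sample
label2idx = {l: i for i, l in enumerate(vocab)}
encoded = [label2idx[l] for l in labels]
print(encoded)  # [1, 1, 0, 1, 1]
```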
dls.show_batch()
inception = create_inception(1, len(dls.vocab))
learn = Learner(dls, inception, metrics=[accuracy])
learn.fit_one_cycle(5, 1e-3)
epoch | train_loss | valid_loss | accuracy | time |
---|---|---|---|---|
0 | 0.536860 | 0.263519 | 0.901773 | 00:15 |
1 | 0.325206 | 0.252334 | 0.894488 | 00:15 |
2 | 0.214342 | 0.145046 | 0.954832 | 00:15 |
3 | 0.148120 | 0.120016 | 0.970495 | 00:15 |
4 | 0.114034 | 0.116838 | 0.970860 | 00:16 |
interp = ClassificationInterpretation.from_learner(learn)
interp.plot_confusion_matrix()
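`plot_confusion_matrix` renders the matrix graphically; the underlying computation is just a count over (actual, predicted) pairs, which can be sketched in a few lines of NumPy:

```python
import numpy as np

# Minimal confusion-matrix computation for 3 classes (a sketch of what
# ClassificationInterpretation tabulates before plotting). Rows are the
# true classes, columns the predicted classes.
targets = np.array([0, 0, 1, 2, 2, 2])
preds   = np.array([0, 1, 1, 2, 2, 0])
n_classes = 3
cm = np.zeros((n_classes, n_classes), dtype=int)
np.add.at(cm, (targets, preds), 1)  # increment cell (true, pred) per sample
print(cm)
```

The diagonal holds the correctly classified counts, so overall accuracy is `np.trace(cm) / cm.sum()`.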