Library for preprocessing data, e.g. as inputs for prediction models.

mydatapreprocessing

The library contains three modules: preprocessing, inputs and generatedata.

Installation

Python >=3.6 (Python 2 is not supported).

Install with

pip install mydatapreprocessing

Some dependencies are needed only for certain data inputs, so not every user needs them. If you want to be sure to have all of them, download requirements_advanced.txt and install the advanced requirements with pip install -r requirements_advanced.txt.

Preprocessing

Load data from a web link or a local file (JSON, CSV, Excel, parquet, h5...), consolidate it into a pandas DataFrame and do preprocessing like resampling, standardization, string embedding and derivation of new columns. If you want to see how the functions work, working examples with printed results are in tests/visual.py.

There are many small helper functions, but they are called automatically by the main functions:

- load_data
- data_consolidation
- preprocess_data
- preprocess_data_inverse

Note: During data consolidation, the predicted column is moved to index 0!

Example

import mydatapreprocessing.preprocessing as mdpp

You can use local files as well as web URLs:

# data_from_file = mdpp.load_data("PATH_TO_FILE.csv")

data_from_url = mdpp.load_data(
    "https://blockchain.info/unconfirmed-transactions?format=json",
    request_datatype_suffix=".json",
    data_orientation="index",
    predicted_table="txs",
)  

data_consolidation transforms various data into the defined format (a pandas DataFrame): it converts values to numeric where possible, keeps only numeric data and resamples if configured.

data_consolidated = mdpp.data_consolidation(
    data_from_url, predicted_column="weight", remove_nans_threshold=0.9, remove_nans_or_replace="interpolate"
)
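
Because data_consolidation moves the predicted column to index 0 (see the note above), you can verify the result like this (a minimal sanity check, assuming the returned object is a pandas DataFrame):

print(data_consolidated.columns[0])  # Expected: "weight" - the predicted column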

preprocess_data returns the preprocessed data, but also the last undifferenced value and the scaler for inverse transformation, so unpack the unused values with _

data_preprocessed, _, _ = mdpp.preprocess_data(
    data_consolidated,
    remove_outliers=True,
    smoothit=False,
    correlation_threshold=False,
    data_transform=False,
    standardizeit="standardize",
)
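
If you plan to invert the transformation later, keep the extra return values instead of discarding them and pass them to preprocess_data_inverse. A hedged sketch follows; the parameter names final_scaler and last_undiff_value are assumptions, so check the signature in your installed version:

data_preprocessed, last_undiff_value, final_scaler = mdpp.preprocess_data(
    data_consolidated, standardizeit="standardize"
)
# Hypothetical inverse call - argument names are illustrative only:
# predictions_rescaled = mdpp.preprocess_data_inverse(
#     predictions, final_scaler=final_scaler, last_undiff_value=last_undiff_value
# )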

Examples of allowed data formats for load_data:

myarray_or_dataframe  # Numpy array or pandas DataFrame
r"/home/user/my.json"  # Local file. The same with .parquet, .h5, .json or .xlsx.
"https://yoururl/your.csv"  # Web url (with suffix). Same with json.
"https://blockchain.info/unconfirmed-transactions?format=json"  # In this case you also have to
    # specify request_datatype_suffix=".json", data_orientation="index", predicted_table="txs"
[{'col_1': 3, 'col_2': 'a'}, {'col_1': 0, 'col_2': 'd'}]  # List of records
{'col_1': [3, 2, 1, 0], 'col_2': ['a', 'b', 'c', 'd']}  # Dict with columns or rows (index) - then it's
    # necessary to set data_orientation!

You can pass a list of multiple files and the data will be concatenated. It can be a list of paths or a list of Python objects; a runnable sketch follows the list. For example:

[{'col_1': 3, 'col_2': 'a'}, {'col_1': 0, 'col_2': 'd'}]  # List of records
[np.random.randn(20, 3), np.random.randn(25, 3)]  # List of numpy arrays (DataFrames work the same way)
["https://raw.githubusercontent.com/jbrownlee/Datasets/master/daily-min-temperatures.csv",
    "https://raw.githubusercontent.com/jbrownlee/Datasets/master/daily-min-temperatures.csv"]  # List of URLs
["path/to/my1.csv", "path/to/my1.csv"]  # List of local paths

On Windows it's necessary to use a raw string (prefix the string with r) because of the backslash escape character \.

Inputs

It takes tabular time series data and puts it into a format (input vector X, output vector y and an input for the predicted value x_input) that can be fed into machine learning models, for example in sklearn or TensorFlow. It contains the functions make_sequences, create_inputs and create_tests_outputs.

Example

import numpy as np

import mydatapreprocessing.inputs as mdpi

data = np.array([[1, 3, 5, 2, 3, 4, 5, 66, 3]]).T
seqs, Y, x_input, test_inputs = mdpi.make_sequences(
    data, predicts=7, repeatit=3, n_steps_in=6, n_steps_out=1, constant=1
)
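
What the returned values mean in terms of the parameters above (a hedged summary based on this example, assuming numpy arrays are returned, not a full API reference):

# seqs: input matrix X built from sliding windows of n_steps_in values
# Y: the corresponding outputs (n_steps_out values per window)
# x_input: the last window, used as input for predicting the next 'predicts' values
# test_inputs: extra windows reserved for repeated test evaluation (repeatit=3)
print(seqs.shape, Y.shape)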

generatedata

This module generates data that can be used, for example, for validating machine learning time series prediction results. It can generate signals such as sine, sign and ramp, or download an ECG heart signal.

Example

import mydatapreprocessing as mdp

data = mdp.generatedata.gen_sin(1000)
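
The generated signal can be fed straight into the other modules. A minimal sketch, assuming gen_sin returns a flat numpy array (reshaped to one column here to match the inputs example above):

import mydatapreprocessing as mdp
import mydatapreprocessing.inputs as mdpi

# Generate a sine wave and build model-ready sequences from it.
data = mdp.generatedata.gen_sin(1000).reshape(-1, 1)
seqs, Y, x_input, test_inputs = mdpi.make_sequences(
    data, predicts=7, repeatit=3, n_steps_in=6, n_steps_out=1, constant=1
)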
