Python support for 'The Art and Science of Data Analytics'

Project description

AdvancedAnalytics

A collection of Python modules, classes, and methods that simplify the use of machine learning solutions. AdvancedAnalytics provides easy access to advanced tools in scikit-learn, NLTK, and other machine learning packages. AdvancedAnalytics was developed to simplify learning Python from the book The Art and Science of Data Analytics.

Description

From a high level view, building machine learning applications typically proceeds through three stages:

  1. Data Preprocessing

  2. Modeling or Analytics

  3. Postprocessing

The classes and methods in AdvancedAnalytics primarily support the first and last stages of machine learning applications.

Data scientists report that they spend 80% of their total effort in the first and last stages. The first stage, data preprocessing, is concerned with preparing the data for analysis. This includes:

  1. identifying and correcting outliers,

  2. imputing missing values, and

  3. encoding data.

The last stage, solution postprocessing, involves developing graphic summaries of the solution and metrics for evaluating its quality.
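As a package-independent sketch (plain Python, not the AdvancedAnalytics API), the three preprocessing steps might look like this; the helper names preprocess and one_hot are hypothetical:

```python
# Plain-Python sketch of the three preprocessing steps; AdvancedAnalytics
# wraps equivalent logic in its ReplaceImputeEncode class.

def preprocess(values, low, high):
    """Steps 1 and 2: correct outliers and impute missing values (None)."""
    # 1. Identify outliers: treat out-of-range values as missing
    cleaned = [v if v is not None and low <= v <= high else None
               for v in values]
    # 2. Impute missing values with the mean of the valid observations
    observed = [v for v in cleaned if v is not None]
    mean = sum(observed) / len(observed)
    return [v if v is not None else mean for v in cleaned]

def one_hot(categories, levels):
    """Step 3: encode a nominal column as one-hot indicator lists."""
    return [[1 if c == level else 0 for level in levels] for c in categories]

# Age 200 is outside the valid range [18, 60], so it is imputed
ages = preprocess([25, None, 47, 200, 33], low=18, high=60)  # [25, 35.0, 47, 35.0, 33]
depts = one_hot(["HR", "Sales", "HR"], ["HR", "Sales", "Marketing"])
```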

Documentation and Examples

The API and documentation for all classes and examples are available at https://github.com/tandonneur/AdvancedAnalytics.

Usage

Currently, AdvancedAnalytics is most often used to support solutions developed with these machine learning packages:

  • scikit-learn

  • StatsModels

  • NLTK

The intention is to expand this list to other packages. Below is a simple decision tree regression example that uses the data map structure to preprocess data:

from AdvancedAnalytics.ReplaceImputeEncode import DT
from AdvancedAnalytics.ReplaceImputeEncode import ReplaceImputeEncode
from AdvancedAnalytics.Tree import tree_regressor
from sklearn.tree import DecisionTreeRegressor
# Data map using DT (data types)
data_map = {
    "Salary":         [DT.Interval, (20000.0, 2000000.0)],
    "Department":     [DT.Nominal, ("HR", "Sales", "Marketing")],
    "Classification": [DT.Nominal, (1, 2, 3, 4, 5)],
    "Years":          [DT.Interval, (18, 60)]}
# Preprocess data from data frame df
rie = ReplaceImputeEncode(data_map=data_map, interval_scaling=None,
                          nominal_encoding="SAS", drop=True)
encoded_df = rie.fit_transform(df)
y = encoded_df["Salary"]
X = encoded_df.drop("Salary", axis=1)
# Note: "gini" is a classification criterion; the regressor uses its
# default squared-error criterion here
dt = DecisionTreeRegressor(max_depth=4, min_samples_split=5,
                           min_samples_leaf=5)
dt = dt.fit(X, y)
tree_regressor.display_importance(dt, encoded_df.columns)
tree_regressor.display_metrics(dt, X, y)

Current Modules and Classes

ReplaceImputeEncode
Classes for Data Preprocessing
  • DT defines new data types used in the data dictionary

  • ReplaceImputeEncode a class for data preprocessing

Regression
Classes for Linear and Logistic Regression
  • linreg support for linear regression

  • logreg support for logistic regression

  • stepwise a variable selection class

Tree
Classes for Decision Tree Solutions
  • tree_regressor support for regressor decision trees

  • tree_classifier support for classification decision trees

Forest
Classes for Random Forests
  • forest_regressor support for regressor random forests

  • forest_classifier support for classification random forests

NeuralNetwork
Classes for Neural Networks
  • nn_regressor support for regressor neural networks

  • nn_classifier support for classification neural networks

TextAnalytics
Classes for Text Analytics
  • text_analysis support for topic analysis

  • sentiment_analysis support for sentiment analysis

Internet
Classes for Internet Applications
  • scrape support for web scraping

  • metrics a class for solution metrics

Installation and Dependencies

AdvancedAnalytics is designed to work on any operating system running Python 3. It can be installed using pip or conda.

pip install AdvancedAnalytics
# or
conda install -c conda-forge AdvancedAnalytics
General Dependencies

Most classes import one or more modules from scikit-learn (referenced as sklearn in module imports) and StatsModels. Both are installed with current versions of Anaconda, a popular distribution for Python development.
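A quick way to confirm these dependencies are importable before running the examples, using only the standard library (the helper name missing_dependencies is hypothetical, not part of the package):

```python
# Check which dependencies are importable without actually loading them;
# importlib.util.find_spec returns None for packages that are not installed.
import importlib.util

def missing_dependencies(names=("sklearn", "statsmodels", "nltk")):
    """Return the subset of package names that cannot be imported."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# An empty list means all listed packages are available
print(missing_dependencies())
```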

Decision Tree and Random Forest Dependencies

The Tree and Forest modules plot decision trees and importance metrics using the pydotplus and graphviz packages. If they are not installed and you plan to use the Tree or Forest modules, install them with the following commands.

conda install -c conda-forge pydotplus
conda install -c conda-forge graphviz
pip install graphviz

Note that the second conda install does not complete the installation of the graphviz package. To finish installing graphviz, it is necessary to run the pip install after the conda graphviz install.
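Because a completed install still does not guarantee the graphviz dot executable is on your PATH, a small check like the following can save debugging time (the helper name graphviz_available is illustrative, not part of the package):

```python
# Verify the graphviz 'dot' executable is reachable before using the
# Tree or Forest plotting functions.
import shutil

def graphviz_available():
    """Return True if the graphviz 'dot' binary is on the PATH."""
    return shutil.which("dot") is not None

if not graphviz_available():
    print("graphviz 'dot' not found; run the pip install after the "
          "conda graphviz install")
```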

Text Analytics Dependencies

The TextAnalytics module is based on the NLTK and scikit-learn text analytics packages. Both are installed with the current version of Anaconda.

However, TextAnalytics includes options to produce word clouds, which are graphic displays of the word collections associated with topics or data clusters. The wordcloud package is used to produce these graphs. If you use the TextAnalytics module, you can install the wordcloud package with the following command.

conda install -c conda-forge wordcloud

In addition, the data used by the NLTK package are not automatically installed with it. These data include the text dictionary and other data tables.

The following nltk.download commands should be run before using TextAnalytics. They only need to be run once to download and install the data NLTK uses for text analytics.

# The following NLTK commands need to be run only once to
# download and install NLTK data.
import nltk
nltk.download("punkt")
nltk.download("averaged_perceptron_tagger")
nltk.download("stopwords")
nltk.download("wordnet")
Internet Dependencies

The Internet module contains a class, scrape, with functions for scraping newsfeeds. Some of these are based on the newspaper3k package, which can be installed using:

conda install -c conda-forge newspaper3k
# or
pip install newspaper3k

Code of Conduct

Everyone interacting in the AdvancedAnalytics project’s codebases, issue trackers, chat rooms, and mailing lists is expected to follow the PyPA Code of Conduct: https://www.pypa.io/en/latest/code-of-conduct/ .


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

AdvancedAnalytics-0.7.0.tar.gz (54.2 kB)

Uploaded Source

Built Distribution

AdvancedAnalytics-0.7.0-py3-none-any.whl (110.6 kB)

Uploaded Python 3

File details

Details for the file AdvancedAnalytics-0.7.0.tar.gz.

File metadata

  • Download URL: AdvancedAnalytics-0.7.0.tar.gz
  • Upload date:
  • Size: 54.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.3

File hashes

Hashes for AdvancedAnalytics-0.7.0.tar.gz

  • SHA256: 6f61ac1c7c900b9358bb01ca8f78fb0c2e594e2987dc4a6a122c6df31468880c
  • MD5: 8146c9a59558771e898d10c669b6aa3a
  • BLAKE2b-256: cc0a5adf35a39eebf4b45096c33184a29ac627519c9392b9c7b7374e068d0d04

See more details on using hashes here.

File details

Details for the file AdvancedAnalytics-0.7.0-py3-none-any.whl.

File metadata

  • Download URL: AdvancedAnalytics-0.7.0-py3-none-any.whl
  • Upload date:
  • Size: 110.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.13.0 pkginfo/1.5.0.1 requests/2.21.0 setuptools/41.2.0 requests-toolbelt/0.9.1 tqdm/4.31.1 CPython/3.7.3

File hashes

Hashes for AdvancedAnalytics-0.7.0-py3-none-any.whl

  • SHA256: 7ee94813a5dfbede9c7e862b9b55afde908703350b9f05f76b143dace6ab46e9
  • MD5: 6c25bfe382395d5e4a65463de0329cf4
  • BLAKE2b-256: 411eb78ecf0c2756afcd71f319a3506c6ac2c8d8e5abc7ed0c61b3311232be6b

See more details on using hashes here.
