
Perform Efficient ML/DL Modelling

Project description


Cerbo

Cerbo means "brain" in Esperanto.

It is a high-level API wrapping Scikit-Learn, TensorFlow, and TensorFlow Keras that allows for efficient machine learning and deep learning modelling and data preprocessing behind a large layer of abstraction. Cerbo was originally developed to help teach students the fundamental elements of machine learning and deep learning without requiring prerequisite knowledge of Python. It also allows students to train machine learning and deep learning models easily thanks to built-in error proofing.

Install

There are two simple ways of installing Cerbo.

First, you can try:

pip install cerbo

or

python -m pip install cerbo

Note that several packages must already be installed before Cerbo can be installed. The full list and versions can be found in requirements.txt, and nearly all of them can be installed through pip. If you have trouble installing any of the prerequisite packages, a quick web search or a forum such as Stack Overflow should explain how to install them correctly.
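If you have cloned the repository, the prerequisites listed in requirements.txt can typically be installed in one step (this assumes the command is run from the repository root):

pip install -r requirements.txt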

Writing your first program!

Currently, Cerbo performs efficient ML/DL modelling in a couple of lines with limited preprocessing capabilities, and we are adding new ones daily. To train a model from a CSV file, all you have to do is call:

import cerbo.preprocessing as cp
import cerbo.ML as cml

data, col_names = cp.load_custom_data("path_to_csv", "column_you_want_to_predict", num_features=4, id=False)

data is a dictionary containing the X and y values for training.

col_names is a list of the feature (column) names.

Note: set id to True when there is an Id column in the CSV file, and set num_features to any value (as long as it is within the number of columns in the file).
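For instance, on a hypothetical housing.csv with an Id column and a Price column you want to predict (both names are placeholders, not files or columns shipped with Cerbo), the call would become:

data, col_names = cp.load_custom_data("housing.csv", "Price", num_features=4, id=True)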

After running this you will get two .png files, correlation.png and feature.png:

  • correlation.png
    • Shows a correlation matrix of all of the features in the CSV file.
  • feature.png
    • Shows a Pandas scatter matrix on an N x N grid, with N being num_features.
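A quick way to inspect both plots without leaving Python is to load the PNG files with matplotlib; this is plain matplotlib usage, not part of the Cerbo API:

import matplotlib.pyplot as plt
import matplotlib.image as mpimg

# Display the two plots generated by load_custom_data
for fname in ["correlation.png", "feature.png"]:
    img = mpimg.imread(fname)
    plt.figure()
    plt.imshow(img)
    plt.axis("off")
    plt.title(fname)
plt.show()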

To train a model on this data, just do:

gb = cml.Boosting(task="r", data=data, algo="gb", seed=42)
cml.save_model(gb) 

This quickly trains and saves a Gradient Boosting Regressor on the data.

You can also do

dt = cml.DecisionTree(task="c", data=data, seed=42)
cml.save_model(dt)

to train and save a Decision Tree Classifier.
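Because Cerbo wraps Scikit-Learn, the two one-liners above roughly correspond to training and persisting plain scikit-learn estimators. The sketch below illustrates this on synthetic data; the exact estimator settings and the joblib persistence are assumptions for illustration, not Cerbo's actual internals:

import numpy as np
import joblib
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.tree import DecisionTreeClassifier

# Tiny synthetic dataset standing in for the X/y values in Cerbo's data dictionary
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 4))
y_reg = X @ np.array([1.0, -2.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=100)
y_clf = (y_reg > y_reg.mean()).astype(int)

# Roughly what cml.Boosting(task="r", algo="gb", seed=42) trains
gb = GradientBoostingRegressor(random_state=42).fit(X, y_reg)

# Roughly what cml.DecisionTree(task="c", seed=42) trains
dt = DecisionTreeClassifier(random_state=42).fit(X, y_clf)

# One common way to persist fitted estimators, comparable in spirit to cml.save_model
joblib.dump(gb, "gradient_boosting.joblib")
joblib.dump(dt, "decision_tree.joblib")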

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

cerbo-0.3.3.tar.gz (16.4 kB)

Uploaded Source

Built Distribution

cerbo-0.3.3-py3-none-any.whl (15.9 kB)

Uploaded Python 3

File details

Details for the file cerbo-0.3.3.tar.gz.

File metadata

  • Download URL: cerbo-0.3.3.tar.gz
  • Upload date:
  • Size: 16.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.3 pkginfo/1.7.1 requests/2.22.0 requests-toolbelt/0.9.1 tqdm/4.62.0 CPython/3.8.10

File hashes

Hashes for cerbo-0.3.3.tar.gz

  • SHA256: 0a54b165bcdfabe35d0a586b2f54af94605a6090bbc7532ef2fed0f204952365
  • MD5: effa4b15f1d6ea872786446ec2895fdc
  • BLAKE2b-256: 620b8ee4a851f337faf5b8023c29a537ec3329877e2106318d01883d6e61423d

See more details on using hashes here.
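For example, a downloaded copy of the source distribution can be checked against the SHA256 digest above with Python's hashlib (this assumes cerbo-0.3.3.tar.gz is in the current directory):

import hashlib

# SHA256 digest listed above for cerbo-0.3.3.tar.gz
expected = "0a54b165bcdfabe35d0a586b2f54af94605a6090bbc7532ef2fed0f204952365"

with open("cerbo-0.3.3.tar.gz", "rb") as f:
    digest = hashlib.sha256(f.read()).hexdigest()

print("OK" if digest == expected else "hash mismatch")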

File details

Details for the file cerbo-0.3.3-py3-none-any.whl.

File metadata

  • Download URL: cerbo-0.3.3-py3-none-any.whl
  • Upload date:
  • Size: 15.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.2 importlib_metadata/4.6.3 pkginfo/1.7.1 requests/2.22.0 requests-toolbelt/0.9.1 tqdm/4.62.0 CPython/3.8.10

File hashes

Hashes for cerbo-0.3.3-py3-none-any.whl

  • SHA256: 46b017704c9a8d55e2d1f7390a6de7c672d638e8569ca66ba1332d54caeda8cb
  • MD5: e8120e8f76e74ea8954f0bce2793d3a3
  • BLAKE2b-256: e4285105a6e3a78ca7266f5c187261d2c9e64edf660c2b1484d4c82687fce1d8

See more details on using hashes here.
