Traditional Machine Learning Models in PyTorch.
Project description
PyCave
PyCave allows you to run traditional machine learning models on CPU, GPU, and even on multiple
nodes. All models are implemented in PyTorch and provide an Estimator
API
that is fully compatible with scikit-learn.
For Gaussian mixture models, PyCave provides speedups of up to 100x when using a GPU and enables training on markedly larger datasets via mini-batch training. The full suite of benchmarks comparing PyCave models against scikit-learn models is available on the documentation website.
PyCave version 3 is a complete rewrite of PyCave: it is tested much more rigorously, depends on well-maintained libraries, and is tuned for better performance. You are therefore highly encouraged to upgrade; documentation for PyCave 2 remains available at pycave-v2.borchero.com.
Features
- Support for GPU and multi-node training by implementing models in PyTorch and relying on PyTorch Lightning
- Mini-batch training for all models such that they can be used on huge datasets
- Well-structured implementation of models (see the sketch after this list)
  - High-level Estimator API allows for easy usage such that models feel and behave like in scikit-learn
  - Medium-level LightningModule implements the training algorithm
  - Low-level PyTorch Module manages the model parameters
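As a minimal sketch of how these layers fit together (assuming that the model_ property shown in the Usage section below is a regular PyTorch Module, so that state_dict applies; that assumption is not confirmed by this page):
import torch
from pycave.clustering import KMeans

# High-level: the Estimator API feels like scikit-learn.
estimator = KMeans(3)
# Medium-level: fit() internally runs a PyTorch Lightning LightningModule that
# implements the training algorithm; you never interact with it directly.
estimator.fit(torch.randn(1000, 8))
# Low-level: the fitted parameters live on a PyTorch Module, so standard PyTorch
# tooling applies (assumption: model_ behaves like a torch.nn.Module).
torch.save(estimator.model_.state_dict(), "kmeans.pt")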
Installation
PyCave is available via pip:
pip install pycave
If you are using Poetry:
poetry add pycave
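If you want to pin the exact release described on this page, specify the version explicitly:
pip install pycave==3.2.1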
Usage
If you've ever used scikit-learn, you'll feel right at home when using PyCave. First, let's create some artificial data to work with:
import torch

X = torch.cat([
    torch.randn(10000, 8) - 5,
    torch.randn(10000, 8),
    torch.randn(10000, 8) + 5,
])
This dataset consists of three clusters of 8-dimensional datapoints. If you want to fit a K-Means model to find the clusters' centroids, it's as easy as:
from pycave.clustering import KMeans
estimator = KMeans(3)
estimator.fit(X)
# Once the estimator is fitted, it provides various properties. One of them is
# the `model_` property which yields the PyTorch module with the fitted parameters.
print("Centroids are:")
print(estimator.model_.centroids)
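Since the Estimator API mirrors scikit-learn, cluster assignments for datapoints can be obtained via a predict method; the call below is a sketch that assumes the method follows the scikit-learn naming convention, which this page does not show explicitly:
# Assign each datapoint to its closest fitted centroid (predict() is assumed
# to follow the scikit-learn convention).
labels = estimator.predict(X)
print(labels.shape)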
Due to the high-level estimator API, the usage for all machine learning models is similar. The API documentation provides more detailed information about parameters that can be passed to estimators and which methods are available.
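For example, fitting the Gaussian mixture model mentioned above looks essentially the same; in the sketch below, the import path, the num_components argument, the batch_size option for mini-batch training, and the means attribute are assumptions rather than details confirmed by this page:
# Same workflow as KMeans, different estimator. Import path, parameter names,
# and the means attribute on the fitted module are assumptions.
from pycave.bayes import GaussianMixture

estimator = GaussianMixture(num_components=3, batch_size=1000)
estimator.fit(X)
print(estimator.model_.means)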
GPU and Multi-Node training
For GPU- and multi-node training, PyCave leverages PyTorch Lightning. The hardware that training runs on is determined by PyTorch Lightning's Trainer class, whose init method provides various configuration options.
If you want to run K-Means with a GPU, you can pass the options accelerator='gpu' and devices=1 to the estimator's initializer:
estimator = KMeans(3, trainer_params=dict(accelerator='gpu', devices=1))
Similarly, if you want to train on 4 nodes simultaneously where each node has one GPU available, you can specify this as follows:
estimator = KMeans(3, trainer_params=dict(num_nodes=4, accelerator='gpu', devices=1))
In fact, you do not need to change anything else in your code.
Implemented Models
Currently, PyCave implements three different models:
License
PyCave is licensed under the MIT License.
Project details
Release history
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
Source Distribution
Built Distribution
File details
Details for the file pycave-3.2.1.tar.gz.
File metadata
- Download URL: pycave-3.2.1.tar.gz
- Upload date:
- Size: 28.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.1 CPython/3.8.16 Linux/5.15.0-1024-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 2d4010289035ff047abaf423532ee87afb329e92d6f21b6b777daf468a458a54 |
| MD5 | 73f0bd3bcd1d73ab8309cce33c92bcf6 |
| BLAKE2b-256 | b3f8b60008b98a741c3576d2ca990848085d59defe6bca9bc71373fdb864ad1f |
File details
Details for the file pycave-3.2.1-py3-none-any.whl.
File metadata
- Download URL: pycave-3.2.1-py3-none-any.whl
- Upload date:
- Size: 37.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: poetry/1.3.1 CPython/3.8.16 Linux/5.15.0-1024-azure
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 48a2fd69bcd2ec04833f709c8fa9651d41c4be5a64835f0af5b8637f9932090a |
| MD5 | a49559b2d8527f5806642cc5f292fab5 |
| BLAKE2b-256 | d022176d0dca19a4ad99e363f00a30402deea45b258b6042f8c09741f3566320 |