
An open source DataOps, MLOps platform for humans

Project description

omega|ml - MLOps for humans

With just a single line of code you can:

  • deploy machine learning models straight from Jupyter Notebook (or any other code)

  • implement data pipelines quickly, without memory limitations, all from a Pandas-like API

  • serve models and data from an easy to use REST API

Further, omega|ml is the fastest way to:

  • scale model training on the included pure-Python compute cluster, on Spark, or on any other cloud

  • collaborate on data science projects easily, sharing Jupyter Notebooks

  • deploy beautiful dashboards right from your Jupyter Notebook, using dashserve

Quick start

Start the omega|ml server right from your laptop or a virtual machine:

$ wget https://raw.githubusercontent.com/omegaml/omegaml/master/docker-compose.yml
$ docker-compose up -d

Jupyter Notebook is immediately available at http://localhost:8899 (use omegamlisfun to log in). Any notebook you create is automatically stored in the integrated omega|ml database, making collaboration a breeze. The REST API is available at http://localhost:5000.

Already have a Python environment (e.g. Jupyter Notebook)? Leverage the power of omega|ml by installing as follows:

# assuming you have started the server as per above
$ pip install omegaml
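
Once installed, the client connects to the local server started above using its default settings. The following minimal sketch verifies the connection by storing and reading back a small DataFrame; it only uses the om.datasets calls shown in the Examples below.

# a minimal sketch to verify the client can reach the quick-start server
# (assumes the docker-compose services above are running with default settings)
import pandas as pd
import omegaml as om

df = pd.DataFrame({'sales': [50, 150, 300]})

# store the DataFrame in the integrated omega|ml database ...
om.datasets.put(df, 'stats')

# ... and read it back, filtering server-side with the Pandas-like query syntax
print(om.datasets.get('stats', sales__gte=100))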

Further information

Examples

import omegaml as om

# transparently store Pandas Series and DataFrames or any Python object
om.datasets.put(df, 'stats')
om.datasets.get('stats', sales__gte=100)

# transparently store and get models
from sklearn.linear_model import LogisticRegression

clf = LogisticRegression()
om.models.put(clf, 'forecast')
clf = om.models.get('forecast')

# run and scale models directly on the integrated Python or Spark compute cluster
# ('stats[^sales]' selects the feature columns, 'stats[sales]' the target column)
om.runtime.model('forecast').fit('stats[^sales]', 'stats[sales]')
om.runtime.model('forecast').predict('stats')
om.runtime.model('forecast').gridsearch(X, Y)  # X, Y are feature/target arrays

# use the REST API to store and retrieve data, run predictions
# (the quick-start REST API is served at http://localhost:5000)
import requests

requests.put('/v1/dataset/stats', json={...})
requests.get('/v1/dataset/stats?sales__gte=100')
requests.put('/v1/model/forecast', json={...})

Use Cases

omega|ml currently supports scikit-learn, Keras and TensorFlow out of the box. Need to deploy a model from another framework? Open an issue at https://github.com/omegaml/omegaml/issues or drop us a line at support@omegaml.io
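
For example, a Keras model is stored and retrieved through the same om.models calls as the scikit-learn example above. The sketch below is illustrative: the model definition is arbitrary, and only the om.models.put/get calls are taken from the examples.

# a minimal sketch: Keras models go through the same om.models API as
# scikit-learn models (the model definition below is illustrative only)
import omegaml as om
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(8, activation='relu', input_shape=(4,)),
    keras.layers.Dense(1),
])
model.compile(optimizer='adam', loss='mse')

om.models.put(model, 'keras-forecast')   # store the model
model = om.models.get('keras-forecast')  # retrieve it anywhere else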

Machine Learning Deployment

  • deploy models to production with a single line of code

  • serve and use models or datasets from a REST API (see the sketch below)
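
As a sketch of what serving looks like, the calls below combine the REST endpoint paths from the Examples above with the quick-start server at http://localhost:5000; the JSON payload shape is an assumption, not the definitive API schema.

# a minimal sketch, assuming the quick-start REST API at http://localhost:5000;
# the endpoint paths are those shown in the Examples, while the payload shape
# is an assumption and may differ from the actual API schema
import requests

base_url = 'http://localhost:5000'

# store a dataset through the REST API
requests.put(base_url + '/v1/dataset/stats',
             json={'data': [{'sales': 150}, {'sales': 300}]})

# query it back using the Pandas-like filter syntax in the query string
resp = requests.get(base_url + '/v1/dataset/stats', params={'sales__gte': 100})
print(resp.json())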

Data Science Collaboration

  • get a fully integrated data science workplace within minutes

  • easily share models, data, Jupyter notebooks and reports with your collaborators

Centralized Data & Compute Cluster

  • perform out-of-core computations on a pure-Python or Apache Spark compute cluster

  • have a shared NoSQL database (MongoDB), out of the box, working like a Pandas DataFrame (see the sketch after this list)

  • use a compute cluster to train your models with no additional setup
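
A sketch of what the Pandas-like access to the shared MongoDB looks like: data is written in chunks and filtered in the database, so the full dataset never has to fit in client memory. Only the om.datasets.put/get calls from the Examples are used; the file name and the append flag are assumptions.

# a minimal sketch: data lives in the shared MongoDB and is filtered in the
# database, so the client never holds the full dataset in memory
# (sales.csv is a hypothetical file; append=True is assumed to add each chunk)
import pandas as pd
import omegaml as om

for chunk in pd.read_csv('sales.csv', chunksize=100_000):
    om.datasets.put(chunk, 'stats', append=True)

# retrieve only the rows needed, using the Pandas-like filter syntax
top_sales = om.datasets.get('stats', sales__gte=100)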

Scalability and Extensibility

  • scale your data science work from your laptop to your team to production with no code changes

  • integrate any machine learning framework or third party data science platform with a common API

Towards Data Science recently published an article on omega|ml: https://towardsdatascience.com/omega-ml-deploying-data-machine-learning-pipelines-the-easy-way-a3d281569666

In addition, omega|ml provides an easy-to-use extensions API to support any kind of model, compute cluster, database and data source.

Commercial Edition & Support

https://omegaml.io

omega|ml Commercial Edition provides security on every level and is ready-made for Kubernetes deployment. It is licensed separately for on-premises, private or hybrid cloud deployments. Sign up at https://omegaml.io


Download files

Download the file for your platform.

Source Distribution

omegaml-0.16.3.tar.gz (4.9 MB, Source)

Built Distributions

omegaml-0.16.3-py311-none-any.whl (3.7 MB, Python 3.11)
omegaml-0.16.3-py310-none-any.whl (3.7 MB, Python 3.10)

File details

Details for the file omegaml-0.16.3.tar.gz.

File metadata

  • Download URL: omegaml-0.16.3.tar.gz
  • Size: 4.9 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.14

File hashes

Hashes for omegaml-0.16.3.tar.gz:

  • SHA256: c2ba984982742f9f69156994ef0e25d549d2b14437275564d3b6cfb6f351a0b3
  • MD5: 9a376f0908984750759b3c9bfc85a413
  • BLAKE2b-256: b42888a18a78eb0b213948330cea78ddc8078e957850713fc9bc19c556a508ea


File details

Details for the file omegaml-0.16.3-py311-none-any.whl.

File metadata

  • Download URL: omegaml-0.16.3-py311-none-any.whl
  • Size: 3.7 MB
  • Tags: Python 3.11
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.9

File hashes

Hashes for omegaml-0.16.3-py311-none-any.whl:

  • SHA256: fe609acab32df47a4d9aff87f53a17fb5757444a9bd9d2d72cd1c48727d57b38
  • MD5: d5ec5f01f84399ec2c7ebcc33ede92b2
  • BLAKE2b-256: 4627ea246e696582c2cb3864783d25ba473e5872125d688f883c5ec7927b7d04


File details

Details for the file omegaml-0.16.3-py310-none-any.whl.

File metadata

  • Download URL: omegaml-0.16.3-py310-none-any.whl
  • Size: 3.7 MB
  • Tags: Python 3.10
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.10.14

File hashes

Hashes for omegaml-0.16.3-py310-none-any.whl:

  • SHA256: 6a22beb05569cfc2becdfdb10cee7a9eaeee9aece0f7a0154681959e35feda2b
  • MD5: f0d9aff1fc352cb5a53cf7fa17dde399
  • BLAKE2b-256: 01eddf45b2629173f0b967dad04487e5e8636323318686303f8b89bc4be34df2

