A system for parallel and distributed Python that unifies the ML ecosystem.

Ray is a fast and simple framework for building and running distributed applications.

Ray is packaged with the following libraries for accelerating machine learning workloads:

  • Tune: Scalable Hyperparameter Tuning

  • RLlib: Scalable Reinforcement Learning

  • RaySGD: Distributed Training Wrappers

  • Ray Serve: Scalable and Programmable Serving

Install Ray with: pip install ray. For nightly wheels, see the Installation page.

NOTE: As of Ray 0.8.1, Python 2 is no longer supported.

Quick Start

Execute Python functions in parallel.

import ray
ray.init()

@ray.remote
def f(x):
    return x * x

futures = [f.remote(i) for i in range(4)]
print(ray.get(futures))  # [0, 1, 4, 9]
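Conceptually, f.remote returns a future immediately and ray.get blocks until the results are ready. As a rough single-machine analogy using only the standard library (this illustrates the futures pattern, not how Ray actually schedules tasks across processes and machines):

```python
from concurrent.futures import ThreadPoolExecutor

def f(x):
    return x * x

# Submit the calls, collect futures, then block on the results,
# mirroring the f.remote(i) / ray.get(futures) pattern above.
with ThreadPoolExecutor() as pool:
    futures = [pool.submit(f, i) for i in range(4)]
    results = [fut.result() for fut in futures]

print(results)  # [0, 1, 4, 9]
```

Unlike a thread pool, Ray runs tasks in separate worker processes and can spread them across a cluster.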

To use Ray’s actor model:

import ray
ray.init()

@ray.remote
class Counter(object):
    def __init__(self):
        self.n = 0

    def increment(self):
        self.n += 1

    def read(self):
        return self.n

counters = [Counter.remote() for _ in range(4)]
[c.increment.remote() for c in counters]
futures = [c.read.remote() for c in counters]
print(ray.get(futures))  # [1, 1, 1, 1]

Ray programs can run on a single machine and seamlessly scale to large clusters. To execute the above Ray script in the cloud, download this configuration file and run:

ray submit [CLUSTER.YAML] example.py --start

Read more about launching clusters.

Tune Quick Start

https://github.com/ray-project/ray/raw/master/doc/source/images/tune-wide.png

Tune is a library for hyperparameter tuning at any scale.

To run this example, you will need to install the following:

$ pip install ray[tune]

This example runs a parallel grid search to optimize an example objective function.

from ray import tune


def objective(step, alpha, beta):
    return (0.1 + alpha * step / 100)**(-1) + beta * 0.1


def training_function(config):
    # Hyperparameters
    alpha, beta = config["alpha"], config["beta"]
    for step in range(10):
        # Iterative training function - can be any arbitrary training procedure.
        intermediate_score = objective(step, alpha, beta)
        # Feed the score back to Tune.
        tune.report(mean_loss=intermediate_score)


analysis = tune.run(
    training_function,
    config={
        "alpha": tune.grid_search([0.001, 0.01, 0.1]),
        "beta": tune.choice([1, 2, 3])
    })

print("Best config: ", analysis.get_best_config(metric="mean_loss"))

# Get a dataframe for analyzing trial results.
df = analysis.dataframe()
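For intuition about what this grid search explores, the objective can be evaluated directly. A quick sketch, fixing beta at 1 and looking at the final step (step 9); larger alpha shrinks the first term faster, so it yields the lowest loss:

```python
def objective(step, alpha, beta):
    return (0.1 + alpha * step / 100) ** (-1) + beta * 0.1

# Final-step loss for each alpha value in the grid, with beta fixed at 1.
losses = {alpha: objective(9, alpha, 1) for alpha in [0.001, 0.01, 0.1]}
best_alpha = min(losses, key=losses.get)
print(best_alpha)  # 0.1
```

Tune runs the same evaluations, but in parallel and with beta sampled per trial via tune.choice.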

If TensorBoard is installed, automatically visualize all trial results:

tensorboard --logdir ~/ray_results

RLlib Quick Start

https://github.com/ray-project/ray/raw/master/doc/source/images/rllib-wide.jpg

RLlib is an open-source library for reinforcement learning built on top of Ray that offers both high scalability and a unified API for a variety of applications.

To run this example, you will need to install the following:

$ pip install tensorflow  # or tensorflow-gpu
$ pip install "ray[rllib]"  # also recommended: ray[debug]

This example trains a PPO agent on a small custom gym environment.

import gym
from gym.spaces import Discrete, Box
from ray import tune

class SimpleCorridor(gym.Env):
    def __init__(self, config):
        self.end_pos = config["corridor_length"]
        self.cur_pos = 0
        self.action_space = Discrete(2)
        self.observation_space = Box(0.0, self.end_pos, shape=(1, ))

    def reset(self):
        self.cur_pos = 0
        return [self.cur_pos]

    def step(self, action):
        if action == 0 and self.cur_pos > 0:
            self.cur_pos -= 1
        elif action == 1:
            self.cur_pos += 1
        done = self.cur_pos >= self.end_pos
        return [self.cur_pos], 1 if done else 0, done, {}

tune.run(
    "PPO",
    config={
        "env": SimpleCorridor,
        "num_workers": 4,
        "env_config": {"corridor_length": 5}})

Ray Serve Quick Start

https://raw.githubusercontent.com/ray-project/ray/master/doc/source/serve/logo.svg

Ray Serve is a scalable model-serving library built on Ray. It is:

  • Framework Agnostic: Use the same toolkit to serve everything from deep learning models built with frameworks like PyTorch, TensorFlow, and Keras to scikit-learn models or arbitrary business logic.

  • Python First: Configure your model serving with pure Python code - no more YAMLs or JSON configs.

  • Performance Oriented: Turn on batching, pipelining, and GPU acceleration to increase the throughput of your model.

  • Composition Native: Create “model pipelines” by composing multiple models together to drive a single prediction.

  • Horizontally Scalable: Serve scales linearly as you add more machines, letting your ML-powered service handle growing traffic.

To run this example, you will need to install the following:

$ pip install scikit-learn
$ pip install "ray[serve]"

This example serves a scikit-learn gradient boosting classifier.

from ray import serve
import pickle
import requests
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier

# Train model
iris_dataset = load_iris()
model = GradientBoostingClassifier()
model.fit(iris_dataset["data"], iris_dataset["target"])

# Define the Ray Serve model.
class BoostingModel:
    def __init__(self):
        self.model = model
        self.label_list = iris_dataset["target_names"].tolist()

    def __call__(self, flask_request):
        payload = flask_request.json["vector"]
        print("Worker: received flask request with data", payload)

        prediction = self.model.predict([payload])[0]
        human_name = self.label_list[prediction]
        return {"result": human_name}


# Deploy model
serve.init()
serve.create_backend("iris:v1", BoostingModel)
serve.create_endpoint("iris_classifier", backend="iris:v1", route="/iris")

# Query it!
sample_request_input = {"vector": [1.2, 1.0, 1.1, 0.9]}
response = requests.get("http://localhost:8000/iris", json=sample_request_input)
print(response.text)
# Result:
# {
#  "result": "versicolor"
# }
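The request-handling logic in __call__ can also be exercised on its own. A sketch with a stub predictor (StubModel and the hard-coded label list below are illustrative stand-ins for the trained classifier and iris_dataset["target_names"]):

```python
class StubModel:
    """Stand-in for the trained classifier: always predicts class index 1."""

    def predict(self, batch):
        return [1 for _ in batch]

label_list = ["setosa", "versicolor", "virginica"]  # iris target names
model = StubModel()

def handle(payload):
    # Same shape as BoostingModel.__call__: predict on a one-row batch,
    # then map the class index to a human-readable label.
    prediction = model.predict([payload])[0]
    return {"result": label_list[prediction]}

print(handle([1.2, 1.0, 1.1, 0.9]))  # {'result': 'versicolor'}
```

Serve wraps exactly this kind of callable in a backend and routes HTTP requests to it.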

More Information

Getting Involved

Project details


Download files

Source Distributions

No source distribution files are available for this release.

Built Distributions


  • ray-0.8.7-cp38-cp38-win_amd64.whl (14.1 MB): CPython 3.8, Windows x86-64

  • ray-0.8.7-cp38-cp38-manylinux1_x86_64.whl (22.1 MB): CPython 3.8, manylinux1 x86-64

  • ray-0.8.7-cp38-cp38-macosx_10_13_x86_64.whl (55.3 MB): CPython 3.8, macOS 10.13+ x86-64

  • ray-0.8.7-cp37-cp37m-win_amd64.whl (14.2 MB): CPython 3.7m, Windows x86-64

  • ray-0.8.7-cp37-cp37m-manylinux1_x86_64.whl (22.0 MB): CPython 3.7m, manylinux1 x86-64

  • ray-0.8.7-cp37-cp37m-macosx_10_13_intel.whl (55.4 MB): CPython 3.7m, macOS 10.13+ Intel (x86-64, i386)

  • ray-0.8.7-cp36-cp36m-win_amd64.whl (14.2 MB): CPython 3.6m, Windows x86-64

  • ray-0.8.7-cp36-cp36m-manylinux1_x86_64.whl (22.0 MB): CPython 3.6m, manylinux1 x86-64

  • ray-0.8.7-cp36-cp36m-macosx_10_13_intel.whl (55.5 MB): CPython 3.6m, macOS 10.13+ Intel (x86-64, i386)

File details

Details for the file ray-0.8.7-cp38-cp38-win_amd64.whl.

File metadata

  • Download URL: ray-0.8.7-cp38-cp38-win_amd64.whl
  • Upload date:
  • Size: 14.1 MB
  • Tags: CPython 3.8, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.1.1 pkginfo/1.5.0.1 requests/2.23.0 setuptools/41.0.1 requests-toolbelt/0.9.1 tqdm/4.38.0 CPython/3.7.6

File hashes

Hashes for ray-0.8.7-cp38-cp38-win_amd64.whl
Algorithm Hash digest
SHA256 7df38c686e96c6782d9f582b901070721768318adb1eb2e0f8b537742dca78e6
MD5 1c08ad6d6dc750b5b83cacf54e4ccc92
BLAKE2b-256 87c6b2b1d61e091f8d6585f1c4f9d5a222d220212579c99c72f3ffdc7ceaed2d

See more details on using hashes here.

