A system for parallel and distributed Python that unifies the ML ecosystem.

Project description

Ray provides a simple, universal API for building distributed applications.

Ray is packaged with the following libraries for accelerating machine learning workloads:

  • Tune: Scalable Hyperparameter Tuning

  • RLlib: Scalable Reinforcement Learning

  • RaySGD: Distributed Training Wrappers

  • Ray Serve: Scalable and Programmable Serving

There are also many community integrations with Ray, including Dask, MARS, Modin, Horovod, Hugging Face, Scikit-learn, and others. Check out the full list of Ray distributed libraries here.

Install Ray with: pip install ray. For nightly wheels, see the Installation page.

Quick Start

Execute Python functions in parallel.

import ray
ray.init()

@ray.remote
def f(x):
    return x * x

futures = [f.remote(i) for i in range(4)]
print(ray.get(futures))  # [0, 1, 4, 9]
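The pattern above, submit work, collect futures, then gather results, also exists in the standard library's concurrent.futures, which can serve as a local, single-machine mental model for what Ray does across a cluster (this analogue is ours, not part of Ray's API):

```python
# Local, single-process analogue of the Ray snippet above, using only the
# standard library: submit() returns a future immediately, and result()
# blocks until the value is ready, the same submit-then-gather pattern
# that f.remote(i) / ray.get(futures) expresses across a cluster.
from concurrent.futures import ThreadPoolExecutor

def f(x):
    return x * x

with ThreadPoolExecutor() as pool:
    futures = [pool.submit(f, i) for i in range(4)]
    results = [fut.result() for fut in futures]

print(results)  # [0, 1, 4, 9]
```

Unlike this thread-pool sketch, Ray's futures can refer to objects on other machines and can be passed to further remote calls before they resolve.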

To use Ray’s actor model:

import ray
ray.init()

@ray.remote
class Counter(object):
    def __init__(self):
        self.n = 0

    def increment(self):
        self.n += 1

    def read(self):
        return self.n

counters = [Counter.remote() for _ in range(4)]
[c.increment.remote() for c in counters]  # one increment per actor
futures = [c.read.remote() for c in counters]
print(ray.get(futures))  # [1, 1, 1, 1]

Ray programs can run on a single machine and seamlessly scale to large clusters. To execute the above Ray script in the cloud, download this configuration file and run:

ray submit [CLUSTER.YAML] example.py --start

Read more about launching clusters.

Tune Quick Start

https://github.com/ray-project/ray/raw/master/doc/source/images/tune-wide.png

Tune is a library for hyperparameter tuning at any scale.

To run this example, you will need to install the following:

$ pip install "ray[tune]"

This example runs a parallel grid search to optimize an example objective function.

from ray import tune


def objective(step, alpha, beta):
    return (0.1 + alpha * step / 100)**(-1) + beta * 0.1


def training_function(config):
    # Hyperparameters
    alpha, beta = config["alpha"], config["beta"]
    for step in range(10):
        # Iterative training function - can be any arbitrary training procedure.
        intermediate_score = objective(step, alpha, beta)
        # Feed the score back to Tune.
        tune.report(mean_loss=intermediate_score)


analysis = tune.run(
    training_function,
    config={
        "alpha": tune.grid_search([0.001, 0.01, 0.1]),
        "beta": tune.choice([1, 2, 3])
    })

print("Best config: ", analysis.get_best_config(metric="mean_loss", mode="min"))

# Get a dataframe for analyzing trial results.
df = analysis.results_df
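To see what the grid search is minimizing, the objective can be evaluated by hand in plain Python, no Ray needed (this check is illustrative, not part of the Tune API):

```python
# The Tune example's objective, evaluated directly:
#   loss = 1 / (0.1 + alpha * step / 100) + beta * 0.1
# Larger alpha shrinks the first term faster, so within the grid above,
# alpha=0.1 yields the lowest final loss; beta only adds a constant offset.
def objective(step, alpha, beta):
    return (0.1 + alpha * step / 100) ** (-1) + beta * 0.1

# Final loss after the last of the 10 training steps (step=9), beta fixed at 1.
final_losses = {alpha: objective(9, alpha, beta=1) for alpha in (0.001, 0.01, 0.1)}
for alpha, loss in final_losses.items():
    print(f"alpha={alpha}: final loss {loss:.3f}")

best_alpha = min(final_losses, key=final_losses.get)
print("lowest loss at alpha =", best_alpha)  # 0.1
```

This is why `mean_loss` is a quantity to minimize: Tune itself treats the reported metric as opaque and only ranks trials by it.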

If TensorBoard is installed, automatically visualize all trial results:

tensorboard --logdir ~/ray_results

RLlib Quick Start

https://github.com/ray-project/ray/raw/master/doc/source/images/rllib-wide.jpg

RLlib is an open-source library for reinforcement learning built on top of Ray that offers both high scalability and a unified API for a variety of applications.

$ pip install tensorflow  # or tensorflow-gpu
$ pip install "ray[rllib]"  # also recommended: ray[debug]

import gym
from gym.spaces import Discrete, Box
from ray import tune

class SimpleCorridor(gym.Env):
    """Corridor environment: the agent must walk right to reach the exit."""

    def __init__(self, config):
        self.end_pos = config["corridor_length"]
        self.cur_pos = 0
        self.action_space = Discrete(2)
        self.observation_space = Box(0.0, self.end_pos, shape=(1, ))

    def reset(self):
        self.cur_pos = 0
        return [self.cur_pos]

    def step(self, action):
        if action == 0 and self.cur_pos > 0:
            self.cur_pos -= 1
        elif action == 1:
            self.cur_pos += 1
        done = self.cur_pos >= self.end_pos
        return [self.cur_pos], 1 if done else 0, done, {}

tune.run(
    "PPO",
    config={
        "env": SimpleCorridor,
        "num_workers": 4,
        "env_config": {"corridor_length": 5}})
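The environment's reward structure can be sanity-checked without gym, Ray, or a trained policy; the sketch below re-implements the step logic from SimpleCorridor (`run_episode` is our helper, not an RLlib API):

```python
# The corridor dynamics from SimpleCorridor.step, reproduced without gym or
# Ray: the optimal policy always picks action 1 (move right), so a corridor
# of length 5 is solved in exactly 5 steps with a total reward of 1.
def run_episode(corridor_length, policy):
    cur_pos, total_reward, steps = 0, 0, 0
    done = False
    while not done:
        action = policy(cur_pos)
        if action == 0 and cur_pos > 0:   # move left, clamped at the start
            cur_pos -= 1
        elif action == 1:                 # move right
            cur_pos += 1
        done = cur_pos >= corridor_length
        total_reward += 1 if done else 0  # reward only on reaching the exit
        steps += 1
    return steps, total_reward

steps, reward = run_episode(5, policy=lambda pos: 1)  # always move right
print(steps, reward)  # 5 1
```

Because the only reward arrives at the exit, PPO has to learn delayed-credit assignment even in this tiny environment, which is what makes it a useful smoke test.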

Ray Serve Quick Start

https://raw.githubusercontent.com/ray-project/ray/master/doc/source/serve/logo.svg

Ray Serve is a scalable model-serving library built on Ray. It is:

  • Framework Agnostic: Use the same toolkit to serve everything from deep learning models built with frameworks like PyTorch or Tensorflow & Keras to Scikit-Learn models or arbitrary business logic.

  • Python First: Configure your model serving with pure Python code - no more YAMLs or JSON configs.

  • Performance Oriented: Turn on batching, pipelining, and GPU acceleration to increase the throughput of your model.

  • Composition Native: Allow you to create “model pipelines” by composing multiple models together to drive a single prediction.

  • Horizontally Scalable: Serve can linearly scale as you add more machines. Enable your ML-powered service to handle growing traffic.

To run this example, you will need to install the following:

$ pip install scikit-learn
$ pip install "ray[serve]"

This example serves a scikit-learn gradient boosting classifier.

from ray import serve
import pickle
import requests
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier

# Train model
iris_dataset = load_iris()
model = GradientBoostingClassifier()
model.fit(iris_dataset["data"], iris_dataset["target"])

# Define the Ray Serve model.
class BoostingModel:
    def __init__(self):
        self.model = model
        self.label_list = iris_dataset["target_names"].tolist()

    def __call__(self, flask_request):
        payload = flask_request.json["vector"]
        print("Worker: received flask request with data", payload)

        prediction = self.model.predict([payload])[0]
        human_name = self.label_list[prediction]
        return {"result": human_name}


# Deploy model
serve.init()
serve.create_backend("iris:v1", BoostingModel)
serve.create_endpoint("iris_classifier", backend="iris:v1", route="/iris")

# Query it!
sample_request_input = {"vector": [1.2, 1.0, 1.1, 0.9]}
response = requests.get("http://localhost:8000/iris", json=sample_request_input)
print(response.text)
# Result:
# {
#  "result": "versicolor"
# }
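The handler's prediction-to-label step can be exercised in isolation, with no scikit-learn model or running Serve process (`StubModel` is our stand-in, not a Serve API):

```python
# The label lookup inside BoostingModel.__call__, driven by a stub model
# instead of a trained classifier. The iris target names are
# ["setosa", "versicolor", "virginica"], so a model that predicts class
# index 1 yields "versicolor", matching the sample response above.
class StubModel:
    def predict(self, vectors):
        return [1]  # pretend every input is class index 1

label_list = ["setosa", "versicolor", "virginica"]
model = StubModel()

prediction = model.predict([[1.2, 1.0, 1.1, 0.9]])[0]
result = {"result": label_list[prediction]}
print(result)  # {'result': 'versicolor'}
```

Keeping the model lookup and label translation inside `__call__` is what lets Serve swap in a retrained model simply by registering a new backend version.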


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

  • ray-1.0.1.post1-cp38-cp38-win_amd64.whl (14.8 MB): CPython 3.8, Windows x86-64

  • ray-1.0.1.post1-cp38-cp38-manylinux1_x86_64.whl (23.1 MB): CPython 3.8, manylinux1 x86-64

  • ray-1.0.1.post1-cp38-cp38-macosx_10_13_x86_64.whl (48.1 MB): CPython 3.8, macOS 10.13+ x86-64

  • ray-1.0.1.post1-cp37-cp37m-win_amd64.whl (14.9 MB): CPython 3.7m, Windows x86-64

  • ray-1.0.1.post1-cp37-cp37m-manylinux1_x86_64.whl (23.1 MB): CPython 3.7m, manylinux1 x86-64

  • ray-1.0.1.post1-cp37-cp37m-macosx_10_13_intel.whl (48.3 MB): CPython 3.7m, macOS 10.13+ Intel (x86-64, i386)

  • ray-1.0.1.post1-cp36-cp36m-win_amd64.whl (14.9 MB): CPython 3.6m, Windows x86-64

  • ray-1.0.1.post1-cp36-cp36m-manylinux1_x86_64.whl (23.1 MB): CPython 3.6m, manylinux1 x86-64

  • ray-1.0.1.post1-cp36-cp36m-macosx_10_13_intel.whl (48.3 MB): CPython 3.6m, macOS 10.13+ Intel (x86-64, i386)

File details

All nine wheels were uploaded via twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.2.0.post20200714 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.8.3, without Trusted Publishing.

Hashes for ray-1.0.1.post1-cp38-cp38-win_amd64.whl

  • SHA256: 105374a6551a18058cfe77dc09042b8f3df1d56d801db4b6a93ee7b473fc0e94
  • MD5: 924b675b055c1b473e23a7d079d782c9
  • BLAKE2b-256: c6234e92ff0f09af1f5a63f2af3d08a7dc1162ff0629a98febadfbfaa61ad16c

Hashes for ray-1.0.1.post1-cp38-cp38-manylinux1_x86_64.whl

  • SHA256: 8323a41ff2b6abdebc7220226c6267e0a4b68c3e29f6cd4797d2046143a9cfc7
  • MD5: a573b1a88686b0b3abe9f1cd57eb977c
  • BLAKE2b-256: 107c9b4fccf414e45fa80a073c92a57e7e8ac1df41702c70b84d02fff1da0f6c

Hashes for ray-1.0.1.post1-cp38-cp38-macosx_10_13_x86_64.whl

  • SHA256: e1ef2a45143e820766c2600420a7125929939127f830da353afd04f2348f9147
  • MD5: 18de71545b9613b459e34ef68c5c532a
  • BLAKE2b-256: fa83f708e2195ecfbecf7e830af1336e9086155df385a8be12f71dac7767dddd

Hashes for ray-1.0.1.post1-cp37-cp37m-win_amd64.whl

  • SHA256: 581adb37ee0f88b662f672b06a2ac91b5736087b38b84d64f2191a90a6e177c7
  • MD5: 788c75aeb386356fe30bb5e76d34cc8b
  • BLAKE2b-256: 6459a44a351857faba9c815c3afaca7df356155b77c33da604987c8fde1422ab

Hashes for ray-1.0.1.post1-cp37-cp37m-manylinux1_x86_64.whl

  • SHA256: a4f7a7385848d8e747e6b5bb9d7eaf469854ec341dfe81c88858703af7d16516
  • MD5: 88405cbb405f061cdfa1986c2158bbb9
  • BLAKE2b-256: d08bc62b46530bab576634277e798dfc6aa1ed441a36f3a16cfd259bee27e8f3

Hashes for ray-1.0.1.post1-cp37-cp37m-macosx_10_13_intel.whl

  • SHA256: f482528b4c79fe4eb255ccd802a00902ce711016ee3bd32a7d9d89ca9cf8b42d
  • MD5: 06796ac498e89e51cbdd6769c45b8711
  • BLAKE2b-256: b9fb14afe04a4fa92a1ea460f9d12f7ec491666e497133411aef9b9043aa5a76

Hashes for ray-1.0.1.post1-cp36-cp36m-win_amd64.whl

  • SHA256: 61fddc7bd37b17104f0bb88a772c157bb14e2dfc82a9a79a9173b784fa689f6f
  • MD5: 36cef934da320a4a7615046eee83b01d
  • BLAKE2b-256: efde39eeac43f6d656a1c2b0052f0f2af77070945e9908818f172ba1e99d03ef

Hashes for ray-1.0.1.post1-cp36-cp36m-manylinux1_x86_64.whl

  • SHA256: 1f2505ada0070054378f91fd6092d4cf9f93f893865dcc7a81fa8a34fed0eb5c
  • MD5: f51d7e66c82c069f4fb85a26c34c9db6
  • BLAKE2b-256: 128744476ad712acc1f7957cbf88d307d4a0283a740487cf85d710d0211d0135

Hashes for ray-1.0.1.post1-cp36-cp36m-macosx_10_13_intel.whl

  • SHA256: b8fbab803e7b35b1924b80cbf93f6bd680dfa08954dcf24fd23079d5f43561e2
  • MD5: d918b43ce322bb2c570bc778b5bcef88
  • BLAKE2b-256: 5dce7c43d6239f4230f51d1dba2ea69554cc5754bf591dd474a4e2821d290fbe

See more details on using hashes here.
