

Project description


Ray provides a simple, universal API for building distributed applications.

Ray is packaged with the following libraries for accelerating machine learning workloads:

  • Tune: Scalable Hyperparameter Tuning

  • RLlib: Scalable Reinforcement Learning

  • RaySGD: Distributed Training Wrappers

  • Ray Serve: Scalable and Programmable Serving

There are also many community integrations with Ray, including Dask, MARS, Modin, Horovod, Hugging Face, Scikit-learn, and others. Check out the full list of Ray distributed libraries here.

Install Ray with: pip install ray. For nightly wheels, see the Installation page.

Quick Start

Execute Python functions in parallel.

import ray
ray.init()

# A function decorated with @ray.remote becomes a task that runs asynchronously.
@ray.remote
def f(x):
    return x * x

# f.remote() returns a future immediately; ray.get() blocks until results arrive.
futures = [f.remote(i) for i in range(4)]
print(ray.get(futures))  # [0, 1, 4, 9]

To use Ray’s actor model:

import ray
ray.init()

@ray.remote
class Counter:
    def __init__(self):
        self.n = 0

    def increment(self):
        self.n += 1

    def read(self):
        return self.n

# Each actor runs in its own process and holds its own state; method calls
# on the same actor execute in submission order.
counters = [Counter.remote() for _ in range(4)]
[c.increment.remote() for c in counters]
futures = [c.read.remote() for c in counters]
print(ray.get(futures))  # [1, 1, 1, 1]

Ray programs can run on a single machine and seamlessly scale to large clusters. To execute the above Ray script in the cloud, download this configuration file and run:

ray submit [CLUSTER.YAML] example.py --start

Read more about launching clusters.
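The cluster launcher is driven by a YAML file describing the cloud provider and node settings. The sketch below is illustrative only: the provider, region, and SSH user are assumed values, and the authoritative field list is in Ray's example cluster configuration files.

```yaml
# Illustrative cluster config sketch (assumed values, not a tested setup).
cluster_name: quickstart

# Autoscaler bounds on the number of worker nodes.
min_workers: 0
max_workers: 2

provider:
    type: aws            # assumed provider; other clouds are also supported
    region: us-west-2    # assumed region

auth:
    ssh_user: ubuntu     # assumed SSH user for the chosen machine image
```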

Tune Quick Start


Tune is a library for hyperparameter tuning at any scale.

To run this example, you will need to install the following:

$ pip install "ray[tune]"

This example runs a parallel grid search to optimize an example objective function.

from ray import tune


def objective(step, alpha, beta):
    return (0.1 + alpha * step / 100)**(-1) + beta * 0.1


def training_function(config):
    # Hyperparameters
    alpha, beta = config["alpha"], config["beta"]
    for step in range(10):
        # Iterative training function - can be any arbitrary training procedure.
        intermediate_score = objective(step, alpha, beta)
        # Report the score back to Tune.
        tune.report(mean_loss=intermediate_score)


analysis = tune.run(
    training_function,
    config={
        "alpha": tune.grid_search([0.001, 0.01, 0.1]),
        "beta": tune.choice([1, 2, 3])
    })

print("Best config: ", analysis.get_best_config(metric="mean_loss", mode="min"))

# Get a dataframe for analyzing trial results.
df = analysis.results_df
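Because the objective is a plain function, the grid search above can be sanity-checked without Tune by evaluating the final-step loss directly. The sketch below fixes beta to 1 purely for illustration:

```python
# Standalone re-implementation of the objective from the example above.
def objective(step, alpha, beta):
    return (0.1 + alpha * step / 100) ** (-1) + beta * 0.1

# Evaluate the loss at the last training step (step=9) for each grid
# value of alpha, holding beta fixed at 1.
losses = {alpha: objective(9, alpha, beta=1) for alpha in [0.001, 0.01, 0.1]}
best_alpha = min(losses, key=losses.get)
print(best_alpha)  # 0.1 -- the largest alpha gives the smallest loss here
```

This mirrors what `analysis.get_best_config` reports for the final iteration, up to Tune's bookkeeping.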

If TensorBoard is installed, you can automatically visualize all trial results:

tensorboard --logdir ~/ray_results

RLlib Quick Start


RLlib is an open-source library for reinforcement learning built on top of Ray that offers both high scalability and a unified API for a variety of applications.

To run this example, you will need to install the following:

$ pip install tensorflow  # or tensorflow-gpu
$ pip install "ray[rllib]"

This example trains a PPO agent on a user-defined corridor environment.

import gym
from gym.spaces import Discrete, Box
from ray import tune

class SimpleCorridor(gym.Env):
    def __init__(self, config):
        self.end_pos = config["corridor_length"]
        self.cur_pos = 0
        self.action_space = Discrete(2)
        self.observation_space = Box(0.0, self.end_pos, shape=(1,))

    def reset(self):
        self.cur_pos = 0
        return [self.cur_pos]

    def step(self, action):
        if action == 0 and self.cur_pos > 0:
            self.cur_pos -= 1
        elif action == 1:
            self.cur_pos += 1
        done = self.cur_pos >= self.end_pos
        return [self.cur_pos], 1 if done else 0, done, {}

tune.run(
    "PPO",
    config={
        "env": SimpleCorridor,
        "num_workers": 4,
        "env_config": {"corridor_length": 5}})
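The environment's reward logic can be exercised in isolation before handing it to RLlib. The sketch below strips the gym dependency and replays an always-move-right policy; `MiniCorridor` is a hypothetical standalone restatement, not part of the RLlib API:

```python
# Dependency-free restatement of SimpleCorridor's step logic.
class MiniCorridor:
    def __init__(self, corridor_length):
        self.end_pos = corridor_length
        self.cur_pos = 0

    def step(self, action):
        # Action 0 moves left (clamped at position 0), action 1 moves right.
        if action == 0 and self.cur_pos > 0:
            self.cur_pos -= 1
        elif action == 1:
            self.cur_pos += 1
        done = self.cur_pos >= self.end_pos
        return self.cur_pos, 1 if done else 0, done

# An agent that always moves right reaches the goal in exactly
# corridor_length steps and collects a single terminal reward.
env = MiniCorridor(corridor_length=5)
total_reward, steps, done = 0, 0, False
while not done:
    _, reward, done = env.step(1)
    total_reward += reward
    steps += 1
print(steps, total_reward)  # 5 1
```

This sparse terminal reward is what makes the corridor a useful minimal sanity check for PPO.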

Ray Serve Quick Start


Ray Serve is a scalable model-serving library built on Ray. It is:

  • Framework Agnostic: Use the same toolkit to serve everything from deep learning models built with frameworks like PyTorch or Tensorflow & Keras to Scikit-Learn models or arbitrary business logic.

  • Python First: Configure your model serving with pure Python code - no more YAMLs or JSON configs.

  • Performance Oriented: Turn on batching, pipelining, and GPU acceleration to increase the throughput of your model.

  • Composition Native: Allows you to create “model pipelines” by composing multiple models together to drive a single prediction.

  • Horizontally Scalable: Serve can linearly scale as you add more machines. Enable your ML-powered service to handle growing traffic.

To run this example, you will need to install the following:

$ pip install scikit-learn
$ pip install "ray[serve]"

This example serves a scikit-learn gradient boosting classifier.

from ray import serve
import pickle
import requests
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier

# Train model
iris_dataset = load_iris()
model = GradientBoostingClassifier()
model.fit(iris_dataset["data"], iris_dataset["target"])

# Define the Ray Serve model.
class BoostingModel:
    def __init__(self):
        self.model = model
        self.label_list = iris_dataset["target_names"].tolist()

    def __call__(self, flask_request):
        payload = flask_request.json["vector"]
        print("Worker: received flask request with data", payload)

        prediction = self.model.predict([payload])[0]
        human_name = self.label_list[prediction]
        return {"result": human_name}


# Deploy model
client = serve.start()
client.create_backend("iris:v1", BoostingModel)
client.create_endpoint("iris_classifier", backend="iris:v1", route="/iris")

# Query it!
sample_request_input = {"vector": [1.2, 1.0, 1.1, 0.9]}
response = requests.get("http://localhost:8000/iris", json=sample_request_input)
print(response.text)
# Result:
# {
#  "result": "versicolor"
# }
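Before wiring the model into Serve, the classifier can be smoke-tested locally by calling it directly, which mirrors what the backend's __call__ does with the request payload. This is a standalone check using only scikit-learn; the sample vector is the same illustrative one as above:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier

# Train the same classifier used by the Serve backend above.
iris_dataset = load_iris()
model = GradientBoostingClassifier()
model.fit(iris_dataset["data"], iris_dataset["target"])

# Call the model directly, bypassing HTTP.
label_list = iris_dataset["target_names"].tolist()
prediction = model.predict([[1.2, 1.0, 1.1, 0.9]])[0]
print(label_list[prediction])
```

If this prints a valid class name, any remaining issues are in the serving layer, not the model.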

More Information


Getting Involved

Project details


Release history

This version

1.4.0

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

If you're not sure about the file name format, learn more about wheel file names.

  • ray-1.4.0-cp38-cp38-win_amd64.whl (15.7 MB): CPython 3.8, Windows x86-64

  • ray-1.4.0-cp38-cp38-manylinux2014_x86_64.whl (49.2 MB): CPython 3.8, manylinux2014 x86-64

  • ray-1.4.0-cp38-cp38-macosx_10_13_x86_64.whl (50.1 MB): CPython 3.8, macOS 10.13+ x86-64

  • ray-1.4.0-cp37-cp37m-win_amd64.whl (15.8 MB): CPython 3.7m, Windows x86-64

  • ray-1.4.0-cp37-cp37m-manylinux2014_x86_64.whl (49.4 MB): CPython 3.7m, manylinux2014 x86-64

  • ray-1.4.0-cp37-cp37m-macosx_10_13_intel.whl (50.2 MB): CPython 3.7m, macOS 10.13+ Intel (x86-64, i386)

  • ray-1.4.0-cp36-cp36m-win_amd64.whl (15.8 MB): CPython 3.6m, Windows x86-64

  • ray-1.4.0-cp36-cp36m-manylinux2014_x86_64.whl (49.4 MB): CPython 3.6m, manylinux2014 x86-64

  • ray-1.4.0-cp36-cp36m-macosx_10_13_intel.whl (50.3 MB): CPython 3.6m, macOS 10.13+ Intel (x86-64, i386)

File details

All wheels were uploaded via twine/3.4.1 (importlib_metadata/4.0.1, pkginfo/1.7.0, requests/2.25.1, requests-toolbelt/0.9.1, tqdm/4.60.0, CPython/3.8.9), without Trusted Publishing.

SHA256 hashes:

  • ray-1.4.0-cp38-cp38-win_amd64.whl: 8b6acd370c27cfd8c8663db70a73371b6d59befaccf52c2684ec69ca3c2bb554

  • ray-1.4.0-cp38-cp38-manylinux2014_x86_64.whl: 63417a9417651018ce08ef3741a7d8a702fd04d8ac07cedfdab0a42de9b760d6

  • ray-1.4.0-cp38-cp38-macosx_10_13_x86_64.whl: 44d6d9e1e1defda73c58c2ec111c9c179f647559cc3f3775c694f4a7de0bdd30

  • ray-1.4.0-cp37-cp37m-win_amd64.whl: cf5ea375f0f5e47b89398c10a6bd857f4ee635414d781312439c878dbc097e54

  • ray-1.4.0-cp37-cp37m-manylinux2014_x86_64.whl: 4ceaf52224c090f2a1ddcae9daa61fc08b7473f7e3e3a47c7954491fe0c05071

  • ray-1.4.0-cp37-cp37m-macosx_10_13_intel.whl: a8696e6aa6b14ab1e0282cb8e3a84c4496ea47a4008105f731913b3364f5a1a0

  • ray-1.4.0-cp36-cp36m-win_amd64.whl: 2b639565356e87b22e9158a1ef91fb492d9b7ff89f7ca506036965f5c4f71b90

  • ray-1.4.0-cp36-cp36m-manylinux2014_x86_64.whl: ff1e1aa6921c78ffa8e0c0eba49c8207e16a1f79bff1250c03fd2890f91108c4

  • ray-1.4.0-cp36-cp36m-macosx_10_13_intel.whl: 34532ecd8867751f880739324ba698d9ab7aa5bd7ff41677b879aaa6099b4276

See more details on using hashes here.
