Keras 3: Deep Learning for Humans
Keras 3 is a multi-backend deep learning framework, with support for JAX, TensorFlow, and PyTorch. Effortlessly build and train models for computer vision, natural language processing, audio processing, timeseries forecasting, recommender systems, etc.
- Accelerated model development: Ship deep learning solutions faster thanks to the high-level UX of Keras and the availability of easy-to-debug runtimes like PyTorch or JAX eager execution.
- State-of-the-art performance: By picking the backend that is the fastest for your model architecture (often JAX!), leverage speedups ranging from 20% to 350% compared to other frameworks. See the Keras benchmarks for details.
- Datacenter-scale training: Scale confidently from your laptop to large clusters of GPUs or TPUs.
Join nearly three million developers, from burgeoning startups to global enterprises, in harnessing the power of Keras 3.
Installation
Install with pip
Keras 3 is available on PyPI as keras. Note that Keras 2 remains available as the tf-keras package.
- Install keras:
pip install keras --upgrade
- Install backend package(s).
To use keras, you should also install the backend of choice: tensorflow, jax, or torch.
Note that tensorflow is required for using certain Keras 3 features: certain preprocessing layers as well as tf.data pipelines.
Local installation
Minimal installation
Keras 3 is compatible with Linux and macOS systems. For Windows users, we recommend using WSL2 to run Keras. To install a local development version:
- Install dependencies:
pip install -r requirements.txt
- Run the installation command from the root directory.
python pip_build.py --install
- Run the API generation script when creating PRs that update keras_export public APIs:
./shell/api_gen.sh
Adding GPU support
The requirements.txt file will install a CPU-only version of TensorFlow, JAX, and PyTorch. For GPU support, we also provide a separate requirements-{backend}-cuda.txt file for TensorFlow, JAX, and PyTorch. These install all CUDA dependencies via pip and expect an NVIDIA driver to be pre-installed. We recommend a clean Python environment for each backend to avoid CUDA version mismatches. As an example, here is how to create a JAX GPU environment with conda:
conda create -y -n keras-jax python=3.10
conda activate keras-jax
pip install -r requirements-jax-cuda.txt
python pip_build.py --install
Configuring your backend
You can export the environment variable KERAS_BACKEND or you can edit your local config file at ~/.keras/keras.json to configure your backend. Available backend options are: "tensorflow", "jax", "torch". Example:
export KERAS_BACKEND="jax"
In Colab, you can do:
import os
os.environ["KERAS_BACKEND"] = "jax"
import keras
Note: The backend must be configured before importing keras, and the backend cannot be changed after the package has been imported.
Backwards compatibility
Keras 3 is intended to work as a drop-in replacement for tf.keras (when using the TensorFlow backend). Just take your existing tf.keras code, make sure that your calls to model.save() are using the up-to-date .keras format, and you're done.
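As a minimal sketch (the model and file name below are purely illustrative), saving and reloading in the native .keras format looks like this:

import keras

# A tiny illustrative model; in practice this would be your existing tf.keras model.
model = keras.Sequential([keras.Input(shape=(4,)), keras.layers.Dense(1)])

model.save("my_model.keras")                       # save in the native .keras format
restored = keras.saving.load_model("my_model.keras")  # reload it under Keras 3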
If your tf.keras model does not include custom components, you can start running it on top of JAX or PyTorch immediately. If it does include custom components (e.g. custom layers or a custom train_step()), it is usually possible to convert it to a backend-agnostic implementation in just a few minutes.
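As a rough sketch of what that conversion looks like (the layer below is illustrative, not from the Keras codebase), a backend-agnostic custom layer replaces tf.* ops with keras.ops, which dispatches to whichever backend is active:

import keras
from keras import ops

class Scale(keras.layers.Layer):
    """Multiplies its input by a learned scalar, written with keras.ops only."""

    def build(self, input_shape):
        # A single trainable scalar weight.
        self.scale = self.add_weight(shape=(), initializer="ones", trainable=True)

    def call(self, inputs):
        # ops.multiply works on tensors from any backend (TensorFlow, JAX, PyTorch).
        return ops.multiply(inputs, self.scale)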
In addition, Keras models can consume datasets in any format, regardless of the backend you're using:
you can train your models with your existing tf.data.Dataset pipelines or PyTorch DataLoaders.
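For instance, here is a hedged sketch of fitting a Keras model directly on a PyTorch DataLoader (the toy data and model are made up for illustration):

import numpy as np
import torch
import keras

x = np.random.rand(256, 8).astype("float32")
y = np.random.rand(256, 1).astype("float32")

model = keras.Sequential([keras.Input(shape=(8,)), keras.layers.Dense(1)])
model.compile(optimizer="adam", loss="mse")

# A PyTorch DataLoader can be passed to fit() as-is, regardless of the active backend.
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(torch.from_numpy(x), torch.from_numpy(y)),
    batch_size=32,
)
model.fit(loader, epochs=1)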
Why use Keras 3?
- Run your high-level Keras workflows on top of any framework -- benefiting at will from the advantages of each framework, e.g. the scalability and performance of JAX or the production ecosystem options of TensorFlow.
- Write custom components (e.g. layers, models, metrics) that you can use in low-level workflows in any framework.
- You can take a Keras model and train it in a training loop written from scratch in native TF, JAX, or PyTorch (see the sketch after this list).
- You can take a Keras model and use it as part of a PyTorch-native Module or as part of a JAX-native model function.
- Make your ML code future-proof by avoiding framework lock-in.
- As a PyTorch user: get access to the power and usability of Keras, at last!
- As a JAX user: get access to a fully-featured, battle-tested, well-documented modeling and training library.
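As an illustration of the from-scratch training-loop point above, here is a rough sketch of a native PyTorch loop around a Keras model, assuming KERAS_BACKEND="torch" (the model, data, and hyperparameters are made up):

import os
os.environ["KERAS_BACKEND"] = "torch"  # must be set before importing keras

import torch
import keras

model = keras.Sequential([keras.Input(shape=(8,)), keras.layers.Dense(1)])
loss_fn = torch.nn.MSELoss()
# With the torch backend, a Keras model behaves like a torch.nn.Module,
# so its weights can be handed to a native torch optimizer.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(256, 8)
y = torch.randn(256, 1)

for step in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()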
Read more in the Keras 3 release announcement.
Download files
Source Distribution
Built Distribution
File details
Details for the file keras_nightly-3.4.1.dev2024072103.tar.gz.
File metadata
- Download URL: keras_nightly-3.4.1.dev2024072103.tar.gz
- Upload date:
- Size: 849.0 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.12.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | d03806218eb248676f07587759d6b7b07d80d5bfc41177f672362ffad4d260c2
MD5 | 6343582d569ff0aabaa81050aad7fe06
BLAKE2b-256 | 426f8b041bb44cfd4b7029fe13717070b44191539ec5d959e018fc00203c62d0
File details
Details for the file keras_nightly-3.4.1.dev2024072103-py3-none-any.whl.
File metadata
- Download URL: keras_nightly-3.4.1.dev2024072103-py3-none-any.whl
- Upload date:
- Size: 1.1 MB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.12.4
File hashes
Algorithm | Hash digest
---|---
SHA256 | ecd96f2d38a1a1b2f800b5fb2529c8b16affacac44ffdc09f2627a2063e373ca
MD5 | 66c7a423d53dd43c5b02e7e33bd4502e
BLAKE2b-256 | b67a91c2613db0c9b812d8489a3cbe5445cdf4cb6ce14034cabc132b3f14b53d