Train networks on large data using attention sampling.

Project description

This repository provides a Python library to accelerate the training and inference of neural networks on large data. This code is the reference implementation of the methods described in our ICML 2019 publication "Processing Megapixel Images with Deep Attention-Sampling Models".


You can find examples of how to use our library in the provided scripts, or in the very concise example below.

# Keras imports
from keras.layers import (
    Input, AveragePooling2D, Conv2D, Dense, GlobalAveragePooling2D
)
from keras.models import Model, Sequential

# Attention sampling imports
from ats.core import attention_sampling
from ats.utils.layers import SampleSoftmax
from ats.utils.regularizers import multinomial_entropy

# Create our two inputs. H, W, C and output_size are assumed defined for
# your data. Note that x_low could also be an input if we have access to a
# precomputed downsampled image.
x_high = Input(shape=(H, W, C))
x_low = AveragePooling2D(pool_size=(10, 10))(x_high)

# Create our attention model; it ends in SampleSoftmax so that it returns
# a probability distribution over the positions of the low resolution
# image (the convolutional stack is only an example).
attention = Sequential([
    Conv2D(8, kernel_size=3, activation="tanh", padding="same"),
    Conv2D(8, kernel_size=3, activation="tanh", padding="same"),
    Conv2D(1, kernel_size=3, padding="same"),
    SampleSoftmax(squeeze_channels=True)
])

# Create our feature extractor per patch, we assume that it returns a
# vector per patch (again, the layers are only an example).
feature = Sequential([
    Conv2D(32, kernel_size=3, activation="relu", padding="same"),
    Conv2D(32, kernel_size=3, activation="relu", padding="same"),
    GlobalAveragePooling2D()
])

# Sample patches from the high resolution image at the positions proposed
# by the attention model and average the per-patch features.
features, attention, patches = attention_sampling(
    attention,
    feature,
    patch_size=(32, 32),
    n_patches=10,
    attention_regularizer=multinomial_entropy(0.01)
)([x_low, x_high])

y = Dense(output_size, activation="softmax")(features)

model = Model(inputs=x_high, outputs=y)
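
The resulting model trains like any other Keras model. A minimal training sketch for a classification setup, where x_train and y_train are placeholder names for full resolution images and one-hot labels:

model.compile(
    loss="categorical_crossentropy",
    optimizer="adam",
    metrics=["accuracy"]
)
model.fit(x_train, y_train, batch_size=8, epochs=10)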

Dependencies & Installation

To install the library, just run pip install attention-sampling. If you want to extend our code, clone the repository and install it in development mode (see the commands below).
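
For reference, both routes look roughly like this (the repository URL is an assumption about the project's public GitHub location):

# Install the released package from PyPI
pip install attention-sampling

# Or clone and install in development mode (URL assumed)
git clone https://github.com/idiap/attention-sampling
cd attention-sampling
pip install -e .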

The dependencies of attention-sampling are:

  • TensorFlow
  • C++ tool chain
  • CUDA (optional; see the GPU check below)
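
Since CUDA is optional, a quick sanity check that TensorFlow can actually see a GPU after installation (a generic TensorFlow call, not part of this library):

import tensorflow as tf
print(tf.test.is_gpu_available())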


There exists a dedicated documentation site, but you are also encouraged to read the source code and the scripts to get an idea of how the library should be used and extended.


If you found this work influential or helpful in your research in any way, we would appreciate it if you cited us:

@inproceedings{katharopoulos2019ats,
    title={Processing Megapixel Images with Deep Attention-Sampling Models},
    author={Katharopoulos, A. and Fleuret, F.},
    booktitle={Proceedings of the International Conference on Machine Learning (ICML)},
    year={2019}
}

Download files


Files for attention-sampling, version 0.2:

  • attention-sampling-0.2.tar.gz (24.6 kB, source distribution)
