
Amazon S3 filesystem for PyFilesystem2

Project description

S3FS is a PyFilesystem interface to Amazon S3 cloud storage.

As a PyFilesystem concrete class, S3FS allows you to work with S3 in the same way as any other supported filesystem.

Installing

You can install S3FS with pip as follows:

pip install fs-s3fs

Opening an S3FS

Open an S3FS by explicitly using the constructor:

from fs_s3fs import S3FS
s3fs = S3FS('mybucket')
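
By default, credentials are taken from the usual boto3 sources (environment variables, ~/.aws/credentials, or an instance role). They can also be supplied explicitly to the constructor; the following is a minimal sketch, with placeholder key values and bucket name:

from fs_s3fs import S3FS

# Minimal sketch: pass credentials and a key prefix explicitly instead of
# relying on the default boto3 credential chain. All values are placeholders.
s3fs = S3FS(
    'mybucket',
    dir_path='/my/prefix',               # treat this key prefix as the root
    aws_access_key_id='AKIA...',         # placeholder
    aws_secret_access_key='...',         # placeholder
    endpoint_url=None,                   # set for S3-compatible services
)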

You can also open an S3FS with an FS URL:

from fs import open_fs
s3fs = open_fs('s3://mybucket')
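
Once opened, an S3FS behaves like any other PyFilesystem filesystem. A minimal sketch (the bucket and file names are placeholders):

from fs import open_fs

with open_fs('s3://mybucket') as s3fs:
    print(s3fs.listdir('/'))                      # list keys at the bucket root
    s3fs.makedirs('backups', recreate=True)       # "directories" are key prefixes
    s3fs.writetext('backups/hello.txt', 'hello')  # upload a small text object
    print(s3fs.readtext('backups/hello.txt'))     # read it back
    print(s3fs.exists('backups/hello.txt'))       # True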

Downloading Files

To download files from an S3 bucket, open a file on the S3 filesystem for reading, then write the data to a file on the local filesystem. Here's an example that copies a file example.mov from S3 to your hard drive:

from fs.tools import copy_file_data
with s3fs.open('example.mov', 'rb') as remote_file:
    with open('example.mov', 'wb') as local_file:
        copy_file_data(remote_file, local_file)

However, it is preferable to use the higher-level functionality in the fs.copy module, which accepts either FS objects or FS URLs (in the example below, './' is opened as the current local directory). Here's an example:

from fs.copy import copy_file
copy_file(s3fs, 'example.mov', './', 'example.mov')

Uploading Files

You can upload files in the same way. Simply copy a file from a source filesystem to the S3 filesystem. See Moving and Copying for more information.
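
For example, the download snippet above can be reversed to upload; a minimal sketch (the file names are placeholders, and './' is an FS URL for the current local directory):

from fs.copy import copy_file
copy_file('./', 'example.mov', s3fs, 'example.mov')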

ExtraArgs

S3 objects have additional properties beyond those of a traditional filesystem. These options can be set using the upload_args and download_args properties, which are handed to the upload and download methods, as appropriate, for the lifetime of the filesystem instance.

For example, to set the cache-control header of all objects uploaded to a bucket:

import fs, fs.mirror
from fs_s3fs import S3FS
s3fs = S3FS('example', upload_args={"CacheControl": "max-age=2592000", "ACL": "public-read"})
fs.mirror.mirror('/path/to/mirror', s3fs)

See the Boto3 docs for more information.
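
download_args works the same way for downloads. A minimal sketch, assuming a requester-pays bucket (RequestPayer is one of the ExtraArgs boto3 accepts for downloads; the bucket name is a placeholder):

from fs_s3fs import S3FS
s3fs = S3FS('example', download_args={"RequestPayer": "requester"})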

acl and cache_control are exposed explicitly for convenience, and can be used in URLs. It is important to URL-escape the cache_control value in a URL, as it may contain special characters.

import fs, fs.mirror
with fs.open_fs('s3://example?acl=public-read&cache_control=max-age%3D2592000%2Cpublic') as s3fs:
    fs.mirror.mirror('/path/to/mirror', s3fs)
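
The escaped value used above can be produced with urllib.parse.quote; a minimal sketch (the bucket name is a placeholder):

from urllib.parse import quote

cache_control = quote("max-age=2592000,public", safe="")  # 'max-age%3D2592000%2Cpublic'
fs_url = "s3://example?acl=public-read&cache_control=" + cache_control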

S3 URLs

You can get a public URL to a file on an S3 bucket as follows:

movie_url = s3fs.geturl('example.mov')



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

fs-s3fs-1.1.1.tar.gz (10.4 kB)


Built Distribution

fs_s3fs-1.1.1-py2.py3-none-any.whl (9.7 kB)


File details

Details for the file fs-s3fs-1.1.1.tar.gz.

File metadata

  • Download URL: fs-s3fs-1.1.1.tar.gz
  • Upload date:
  • Size: 10.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Python-urllib/3.7

File hashes

Hashes for fs-s3fs-1.1.1.tar.gz:

  • SHA256: b57f8c7664460ff7b451b4b44ca2ea9623a374d74e1284c2d5e6df499dc7976c
  • MD5: 7c329e91485fe2cdc001edf961da9d76
  • BLAKE2b-256: bef1b641963f8694cdf7c3daed0276fef361c2b51717bf5f3e8a89b63c8ba237


File details

Details for the file fs_s3fs-1.1.1-py2.py3-none-any.whl.

File metadata

  • Download URL: fs_s3fs-1.1.1-py2.py3-none-any.whl
  • Upload date:
  • Size: 9.7 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Python-urllib/3.7

File hashes

Hashes for fs_s3fs-1.1.1-py2.py3-none-any.whl:

  • SHA256: 9ba160eaa93390cc5992a857675666cb2fbb3721b872474dfdc659a715c39280
  • MD5: 218e5e2d4f69be4642738e91afbfc209
  • BLAKE2b-256: 14719b36a6dbd28386e2028e4ab9aac9c30874fc9c74d6b8fa0c8c2806548311

