
Amazon S3 filesystem for PyFilesystem2

Project description

S3FS is a PyFilesystem interface to Amazon S3 cloud storage.

As a PyFilesystem concrete class, S3FS allows you to work with S3 in the same way as any other supported filesystem.

Installing

You can install S3FS with pip as follows:

pip install fs-s3fs

Opening an S3FS

Open an S3FS by explicitly using the constructor:

from fs_s3fs import S3FS
s3fs = S3FS('mybucket')

Or with a FS URL:

from fs import open_fs
s3fs = open_fs('s3://mybucket')
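
Because S3FS is a standard PyFilesystem concrete class, the usual filesystem methods work once it is open. Here is a minimal sketch; the bucket name is a placeholder, and it assumes your AWS credentials are available to boto3 through its normal lookup chain (environment variables, ~/.aws/credentials, or an IAM role):

from fs import open_fs

with open_fs('s3://mybucket') as s3fs:
    print(s3fs.listdir('/'))                 # list objects at the bucket root
    s3fs.makedirs('backups', recreate=True)  # create a key prefix ("directory")
    with s3fs.open('backups/hello.txt', 'w') as f:
        f.write('hello')
    print(s3fs.exists('backups/hello.txt'))  # True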

Downloading Files

To download files from an S3 bucket, open a file on the S3 filesystem for reading, then write the data to a file on the local filesystem. Here’s an example that copies a file example.mov from S3 to your hard drive:

from fs.tools import copy_file_data
with s3fs.open('example.mov', 'rb') as remote_file:
    with open('example.mov', 'wb') as local_file:
        copy_file_data(remote_file, local_file)

However, it is generally preferable to use the higher-level functionality in the fs.copy module. Here’s an example:

from fs.copy import copy_file
copy_file(s3fs, 'example.mov', './', 'example.mov')

Uploading Files

You can upload files in the same way. Simply copy a file from a source filesystem to the S3 filesystem. See Moving and Copying for more information.
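
For example, here is a minimal sketch that uploads a local file with fs.copy.copy_file; it assumes example.mov exists in the current directory:

from fs.copy import copy_file
# Copy from the local filesystem (the './' FS URL opens the current directory)
# to the S3 filesystem.
copy_file('./', 'example.mov', s3fs, 'example.mov')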

ExtraArgs

S3 objects have additional properties beyond those of a traditional filesystem. These options can be set using the upload_args and download_args arguments, which are handed to the upload and download methods, as appropriate, for the lifetime of the filesystem instance.

For example, to set the Cache-Control header (and a public-read ACL) on all objects uploaded to a bucket:

import fs, fs.mirror
from fs_s3fs import S3FS
s3fs = S3FS('example', upload_args={"CacheControl": "max-age=2592000", "ACL": "public-read"})
fs.mirror.mirror('/path/to/mirror', s3fs)

See the Boto3 docs for more information.
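
download_args works the same way. As a sketch, the following passes a standard Boto3 download argument (RequestPayer) on every read performed through the filesystem instance; whether you need it depends on how the bucket is configured:

from fs_s3fs import S3FS
s3fs = S3FS('example', download_args={"RequestPayer": "requester"})
with s3fs.open('example.mov', 'rb') as remote_file:  # RequestPayer is applied to the download
    data = remote_file.read()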

acl and cache_control are exposed explicitly for convenience and can be set in FS URLs. It is important to URL-escape the cache_control value, as it may contain special characters.

import fs, fs.mirror
with fs.open_fs('s3://example?acl=public-read&cache_control=max-age%3D2592000%2Cpublic') as s3fs:
    fs.mirror.mirror('/path/to/mirror', s3fs)

S3 URLs

You can get a public URL to a file on an S3 bucket as follows:

movie_url = s3fs.geturl('example.mov')
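
As a sketch of one way to use the result, assuming the returned URL is an HTTP(S) address that can actually be fetched (for example a presigned URL, or a plain URL to an object uploaded with acl=public-read):

from urllib.request import urlopen

movie_url = s3fs.geturl('example.mov')
with urlopen(movie_url) as response:  # only works if the URL is reachable over HTTP(S)
    data = response.read()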

Download files

Download the file for your platform.

Source Distribution

fs-s3fs-1.0.0.tar.gz (10.5 kB)

Uploaded: Source

Built Distribution

fs_s3fs-1.0.0-py2.py3-none-any.whl (12.0 kB)

Uploaded: Python 2, Python 3

File details

Details for the file fs-s3fs-1.0.0.tar.gz.

File metadata

  • Download URL: fs-s3fs-1.0.0.tar.gz
  • Size: 10.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Python-urllib/2.7

File hashes

Hashes for fs-s3fs-1.0.0.tar.gz
Algorithm Hash digest
SHA256 5351ed1d2339be0823a43c602acea8f8ff2f84fd70d0b4f3a0856aa2ff31fbb3
MD5 7ed912bfbc2befbccd799fb8f585de9a
BLAKE2b-256 12639c464b2e7bc8dd7f93c1b2afc3f1e2d5bc1133a520e8ba9f31f7902d716c


File details

Details for the file fs_s3fs-1.0.0-py2.py3-none-any.whl.

File metadata

  • Download URL: fs_s3fs-1.0.0-py2.py3-none-any.whl
  • Size: 12.0 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: Python-urllib/2.7

File hashes

Hashes for fs_s3fs-1.0.0-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 0de20afdc2a99878d398212be831951eaaa229ffca03c01d535ba2abc77a26fd
MD5 c97365462799d2c2b980fd30b0b0d2df
BLAKE2b-256 52f5f7e822de587866899dff2b0f5228c250819bd44592f5812b0d88b154ece3

