
Amazon S3 filesystem for PyFilesystem2, forked from https://github.com/PyFilesystem/s3fs


Forked from https://github.com/PyFilesystem/s3fs/ to add the ability to:

  • set the S3 endpoint in the FS URL

  • skip directory creation/removal to improve performance

S3FS

S3FS is a PyFilesystem interface to Amazon S3 cloud storage.

As a PyFilesystem concrete class, S3FS allows you to work with S3 in the same way as any other supported filesystem.

Installing

You can install S3FS from pip as follows:

pip install fs-s3fs-forked

Opening an S3FS

Open an S3FS by explicitly using the constructor:

from fs_s3fs import S3FS
s3fs = S3FS('mybucket')

Or with a FS URL:

from fs import open_fs
s3fs = open_fs('s3://mybucket')
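Since this fork's main addition is a configurable endpoint, the FS URL can carry one as a query parameter, which is useful for S3-compatible services such as MinIO. The parameter name endpoint_url below is an assumption, mirroring the boto3 client argument of the same name; verify the exact name against the fork's source. A minimal sketch of building such a URL:

```python
from urllib.parse import urlencode

# Build an FS URL pointing at a non-AWS endpoint (e.g. a local MinIO server).
# The query-parameter name "endpoint_url" is an assumption, mirroring the
# boto3 client argument of the same name; check the fork's source.
params = {"endpoint_url": "http://localhost:9000"}
fs_url = "s3://mybucket?" + urlencode(params)
print(fs_url)  # s3://mybucket?endpoint_url=http%3A%2F%2Flocalhost%3A9000

# s3fs = open_fs(fs_url)  # requires fs-s3fs-forked and a reachable endpoint
```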

Downloading Files

To download files from an S3 bucket, open a file on the S3 filesystem for reading, then write the data to a file on the local filesystem. Here’s an example that copies a file example.mov from S3 to your hard drive:

from fs.tools import copy_file_data
with s3fs.open('example.mov', 'rb') as remote_file:
    with open('example.mov', 'wb') as local_file:
        copy_file_data(remote_file, local_file)

However, it is usually preferable to use the higher-level functionality in the fs.copy module. Here’s an example:

from fs.copy import copy_file
copy_file(s3fs, 'example.mov', './', 'example.mov')

Uploading Files

You can upload files in the same way. Simply copy a file from a source filesystem to the S3 filesystem. See Moving and Copying for more information.

ExtraArgs

S3 objects have additional properties beyond those of a traditional filesystem. These can be set via the upload_args and download_args constructor arguments, which are passed to the boto3 upload and download methods, as appropriate, for the lifetime of the filesystem instance.

For example, to set the cache-control header of all objects uploaded to a bucket:

import fs, fs.mirror
s3fs = S3FS('example', upload_args={"CacheControl": "max-age=2592000", "ACL": "public-read"})
fs.mirror.mirror('/path/to/mirror', s3fs)

See the Boto3 docs for more information.

acl and cache_control are exposed explicitly for convenience, and can be used in URLs. It is important to URL-Escape the cache_control value in a URL, as it may contain special characters.
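As a concrete illustration of the escaping, Python's standard library can produce the encoded cache_control value used in the URL below:

```python
from urllib.parse import quote

# Percent-encode a Cache-Control value so it can ride in an FS URL query string.
cache_control = "max-age=2592000,public"
escaped = quote(cache_control, safe="")
print(escaped)  # max-age%3D2592000%2Cpublic
```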

import fs, fs.mirror
with fs.open_fs('s3://example?acl=public-read&cache_control=max-age%3D2592000%2Cpublic') as s3fs:
    fs.mirror.mirror('/path/to/mirror', s3fs)

S3 URLs

You can get a public URL to a file on a S3 bucket as follows:

movie_url = s3fs.geturl('example.mov')


Releasing

  • Update version number in _version.py

  • Install the build dependencies: pip install wheel twine

  • Install pandoc

  • Run make release
