
Command-line utility for S3 local caching.


s3local


Cache S3 objects on the local host.

Given an S3 URL, s3local derives the corresponding local path automatically, downloads the object there, and returns the path.

Downloaded files remain on the local host as a cache, so a second download of the same object is skipped.

Works on Python 3.6 or higher.

Settings

The following AWS authentication methods are supported (see the sketch after this list):

  • environment variables
  • named profile (use the --aws-profile option)
  • instance profile
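
A minimal sketch of selecting a named profile from Python, assuming s3local delegates credential resolution to boto3's standard chain; the profile name below is a placeholder:

import os

from s3local import Downloader

# Assumption: s3local resolves credentials via boto3, so the standard AWS
# environment variables apply (AWS_PROFILE, AWS_ACCESS_KEY_ID, etc.).
os.environ.setdefault("AWS_PROFILE", "my-profile")  # hypothetical profile name

downloader = Downloader("s3://mybucket/artifacts/")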

Examples

download objects and list local paths

$ s3local download -u s3://mybucket/artifacts/ --debug
2021-05-14 11:27:13,367 DEBUG - Copying: s3://mybucket/artifacts/main.log > /Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main.log
2021-05-14 11:27:13,367 DEBUG - Copying: s3://mybucket/artifacts/main2.log > /Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main2.log
2021-05-14 11:27:13,367 DEBUG - Copying: s3://mybucket/artifacts/main3.log > /Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main3.log

# the next download is skipped
$ s3local download -u s3://mybucket/artifacts/ --debug
2021-05-14 14:08:02,970 DEBUG - skip already exists in local: s3://mybucket/artifacts/main.log
2021-05-14 14:08:02,970 DEBUG - skip already exists in local: s3://mybucket/artifacts/main2.log
2021-05-14 14:08:02,970 DEBUG - skip already exists in local: s3://mybucket/artifacts/main3.log

# force re-download (do not skip existing files)
$ s3local download -u s3://mybucket/artifacts/ --debug --no-skip-exist
2021-05-14 11:27:13,367 DEBUG - Copying: s3://mybucket/artifacts/main.log > /Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main.log
2021-05-14 11:27:13,367 DEBUG - Copying: s3://mybucket/artifacts/main2.log > /Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main2.log
2021-05-14 11:27:13,367 DEBUG - Copying: s3://mybucket/artifacts/main3.log > /Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main3.log

By default, $HOME/.s3local is the root directory.

Local paths are derived from S3 URLs in the following format:

$HOME/.s3local/s3/${bucket}/${key}

You can change the root by setting the S3LOCAL_ROOT environment variable.
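
For illustration, the mapping can be reproduced in a few lines of Python; local_path_for is a hypothetical helper, not part of the s3local API:

import os
from urllib.parse import urlparse

def local_path_for(s3_url: str) -> str:
    # Mirrors the documented layout: ${S3LOCAL_ROOT:-$HOME/.s3local}/s3/${bucket}/${key}
    parsed = urlparse(s3_url)  # bucket is in netloc, key in path
    root = os.environ.get("S3LOCAL_ROOT", os.path.expanduser("~/.s3local"))
    return os.path.join(root, "s3", parsed.netloc, parsed.path.lstrip("/"))

print(local_path_for("s3://mybucket/artifacts/main.log"))
#=> /Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main.log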

$ s3local list-local -u s3://mybucket/artifacts/
/Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main.log
/Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main2.log
/Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main3.log

upload an object

$ s3local upload -s tox.ini -u s3://mybucket/test/
2023-05-31 10:44:08,474 INFO - Copying to s3: tox.ini => s3://mybucket/test/tox.ini

Python API

download

from s3local import Downloader

downloader = Downloader("s3://mybucket/artifacts/")
# Download the objects (if not already cached) and return their local paths.
paths = downloader.list_local_path(download=True)
print(paths)
#=> [
#     "/Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main.log",
#     "/Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main2.log",
#     "/Users/hiroshi.toyama/.s3local/s3/mybucket/artifacts/main3.log",
# ]
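
As with the CLI, files already present in the local cache are not downloaded again, so a second call returns the same paths without re-fetching from S3.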

upload

from s3local import Uploader

uploader = Uploader("s3://mybucket/artifacts/")

uploader.upload("output/hoge.txt")
#=> s3://mybucket/artifacts/hoge.txt

uploader.upload("output")
#=> s3://mybucket/artifacts/output/hoge.txt
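
Note the difference between the two calls above: uploading a single file places it directly under the destination prefix, while uploading a directory recreates the directory itself under the prefix.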

Installation

$ pip install s3local

CI

install test package

$ ./scripts/ci.sh install

test

$ ./scripts/ci.sh run-test

Runs flake8, black, and pytest.

release pypi

$ ./scripts/ci.sh release

Creates a git tag and releases to PyPI.

