Project description

Amazon EFS (amazon-efs)

Amazon EFS (amazon-efs) lets you programmatically manipulate EFS data (create, read, delete, and list files) from any machine.

Prerequisites

  • Python
  • pip
  • boto3
  • AWS account
  • AWS credentials (see the quick sanity check after this list)
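
If you are not sure your AWS credentials are in place, a minimal sanity check with plain boto3 (not part of amazon-efs) is to ask STS which account the default credential chain resolves to:

import boto3

# Resolves credentials via boto3's default chain (environment variables,
# ~/.aws/credentials, an instance profile, ...) and prints the caller identity
sts = boto3.client('sts')
identity = sts.get_caller_identity()
print(identity['Account'], identity['Arn'])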

Install

pip install amazon-efs

Warning

The EFS file system should have at least one mount target in a private subnet.

Limits

Lambda compute environment

The list_files, upload, download, and delete actions are limited to 15 minutes of execution time (AWS Lambda runs under the hood).

Batch compute environment

The list_files, upload, and download actions are not implemented yet.

Basics

Supported compute environments:

  • Lambda (Default)
  • Batch

Lambda compute environment (the default):

efs = Efs('<file_system_id>')

Lambda compute environment (set explicitly):

efs = Efs('<file_system_id>', compute_env_name='lambda')

Batch compute environment:

The "batch_queue" option is required
efs = Efs('<file_system_id>', {
    'batch_queue': '<batch_queue>',
}, compute_env_name='batch')

Lambda compute environment

This compute environment is intended for lightweight operations (lasting no more than 15 minutes).

from amazon_efs import Efs

efs_id = 'fs-0d74736bfc*******'
efs = Efs(efs_id)

# Deploying required underlying resources
efs.init()
# Actions (e.g. list_files, upload, download, delete)
files_list = efs.list_files()
# Don't forget to destroy underlying resources at the end of the session
efs.destroy()

Actions

List files

from amazon_efs import Efs

efs_id = 'fs-0d74736bfc*******'
efs = Efs(efs_id)

efs.init()

# List files at the root of the file system
files_list = efs.list_files()
print(files_list)

# List files under a specific directory
files_list = efs.list_files('dir1')
print(files_list)

# List files under a nested directory
files_list = efs.list_files('dir1/dir2')
print(files_list)

efs.destroy()

Upload

from amazon_efs import Efs

efs_id = 'fs-0d74736bfc*******'
efs = Efs(efs_id)

efs.init()

# Upload to the root of the file system under the original name
efs.upload('file.txt')

# Upload to nested destination paths under a new name
efs.upload('file.txt', 'dir1/new_file.txt')
efs.upload('file.txt', 'dir1/dir2/new_file.txt')
efs.upload('file.txt', 'dir1/dir3/new_file.txt')
efs.upload('file.txt', 'dir2/dir3/new_file.txt')
efs.upload('file.txt', 'dir2/dir4/new_file.txt')

efs.destroy()

Download

from amazon_efs import Efs

efs_id = 'fs-0d74736bfc*******'
efs = Efs(efs_id)

efs.init()

# Copy 'dir1/dir3/new_file.txt' from EFS to the local file 'file1.txt'
efs.download('dir1/dir3/new_file.txt', 'file1.txt')

efs.destroy()

Delete

Delete file

from amazon_efs import Efs
    
efs_id = 'fs-0d74736bfc*******'
efs = Efs(efs_id)
    
efs.init()
    
efs.delete('dir2/dir3/new_file.txt')
    
efs.destroy()

Delete folder

from amazon_efs import Efs
    
efs_id = 'fs-0d74736bfc*******'
efs = Efs(efs_id)
    
efs.init()
    
# Use a trailing '*' to delete a folder
efs.delete('dir1/dir2/*')
efs.delete('dir1/*')
    
efs.destroy()

Batch compute environment

This compute environment is intended for heavy operations (lasting more than 15 minutes).

Actions

Delete

The "batch_queue" option is required

Delete file

from amazon_efs import Efs
    
efs_id = 'fs-0d74736bfc*******'
batch_queue = '<batch_queue>'
efs = Efs(efs_id, {
  'batch_queue': batch_queue,
}, compute_env_name='batch')
    
efs.init()
    
efs.delete('dir2/dir3/new_file.txt')
    
efs.destroy()

Delete folder

from amazon_efs import Efs
    
efs_id = 'fs-0d74736bfc*******'
batch_queue = '<batch_queue>'
efs = Efs(efs_id, {
  'batch_queue': batch_queue,
}, compute_env_name='batch')
    
efs.init()
    
efs.delete('dir1/dir2/*')
efs.delete('dir1/*')
    
efs.destroy()

State

You can destroy the underlying infrastructure even after the Efs object has been removed from memory, as long as you saved the state returned by init().

from amazon_efs import Efs

efs_id = 'fs-0d74736bfc*******'
efs = Efs(efs_id)

state = efs.init()

# Destroy object
del efs

efs = Efs(efs_id, { 'state': state })

files_list = efs.list_files()
print(files_list)

efs.destroy()
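
If the state has to survive a process restart, one option is to persist it to a file between sessions. This is a minimal sketch that assumes the state returned by init() is JSON-serializable; swap in another serializer if it is not:

import json
from amazon_efs import Efs

efs_id = 'fs-0d74736bfc*******'

# Session 1: deploy the underlying resources and save the state to disk
efs = Efs(efs_id)
state = efs.init()
with open('efs_state.json', 'w') as f:
    json.dump(state, f)  # assumes the state object is JSON-serializable

# Session 2 (possibly a different process): reuse the saved state,
# run an action, then destroy the underlying resources
with open('efs_state.json') as f:
    state = json.load(f)
efs = Efs(efs_id, { 'state': state })
print(efs.list_files())
efs.destroy()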

Tags

You can add custom tags to the underlying resources.

from amazon_efs import Efs

efs_id = 'fs-0d74736bfc*******'
efs = Efs(efs_id, {
    'tags': {
        'k1': 'v1',
        'k2': 'v2'
    }
})

efs.init()

files_list = efs.list_files()
print(files_list)

efs.destroy()

Logging

You can pass your own logger to the Efs constructor.

from amazon_efs import Efs
import logging

efs_id = 'fs-0d74736bfc*******'

# Configure a logger and hand it to the library so its messages
# go through your logging setup
logger = logging.getLogger()
logging.basicConfig(level=logging.ERROR, format='%(asctime)s: %(levelname)s: %(message)s')

efs = Efs(efs_id, logger=logger)

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

amazon-efs-0.3.0.tar.gz (11.7 kB, Source)

Built Distribution

amazon_efs-0.3.0-py3-none-any.whl (13.6 kB, Python 3)

File details

Details for the file amazon-efs-0.3.0.tar.gz.

File metadata

  • Download URL: amazon-efs-0.3.0.tar.gz
  • Upload date:
  • Size: 11.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.26.0 requests-toolbelt/0.9.1 urllib3/1.26.6 tqdm/4.63.1 importlib-metadata/4.10.0 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.9.12

File hashes

Hashes for amazon-efs-0.3.0.tar.gz

  • SHA256: d022a21e7f64acda11e94158b2f1f99c61d3d865b3a2255b782bd5eddf017abb
  • MD5: b26e8d554d4ca277ba0d2de2e423a092
  • BLAKE2b-256: 57fbad41fe12bbe45f821549191a37a29ac449d8332be8bd6ef00e2b57255184


File details

Details for the file amazon_efs-0.3.0-py3-none-any.whl.

File metadata

  • Download URL: amazon_efs-0.3.0-py3-none-any.whl
  • Upload date:
  • Size: 13.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.8.0 pkginfo/1.8.2 readme-renderer/34.0 requests/2.26.0 requests-toolbelt/0.9.1 urllib3/1.26.6 tqdm/4.63.1 importlib-metadata/4.10.0 keyring/23.5.0 rfc3986/2.0.0 colorama/0.4.4 CPython/3.9.12

File hashes

Hashes for amazon_efs-0.3.0-py3-none-any.whl

  • SHA256: 848a27fc0946560332ef12570928f8696d66e3cd7807f91f3f8cdf8240d8e395
  • MD5: dfb347768bdf081b1348c742f2b61338
  • BLAKE2b-256: 748ce165bc39fc2eabd344837e50d90e51a4786b25b95debe525dea2add49225

