
Wrap data sources like files with asyncio in Python

Project description

Python Asynchronous File System

There is a broad variety of data sources in current software: SQL databases, files, NoSQL databases, cloud storage, and more, and each one has its own methods and ways of accessing the data.

A good software architecture should not have a strong dependency on these concrete methods, and that is the purpose of this library: to abstract the access to data sources away from the application.

There are already some libraries for this purpose in Python (like filesystem), but they are not fully oriented to asynchronous programming. The library aiofiles has been taken as the reference pattern to define the system.

Usage

There are only two basic classes which wrap any data source (a usage sketch follows the list):

  1. FileLike, which is an object with basic file methods:
    1. A binary read method to access the data bytes of the file.
    2. An asynchronous binary write method for writing bytes to the file.
    3. Context management: read and write shall be run within a context to ensure the file is properly closed and handled inside the file system.
  2. FileLikeSystem, which is the file system holding the files. It has the following methods:
    1. open is the main method, because it is the way to create and access FileLike objects. It is important to be clear about how the file will be used:
      1. If the file will only be read, the mode should be r. This is the default mode.
      2. If the file exists and will be written, the mode should be r+ for a reading + writing workflow, or w if it will only be written.
    2. rm, to remove one or more files.
    3. ls, to list the filenames accessible in the file system.
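
A minimal sketch of the intended usage. It assumes the import path, that fs.open can be used directly as an asynchronous context manager (as in the aiofiles pattern), and that read, write, ls and rm are awaitable; the roundtrip helper and the concrete fs instance are placeholders, not part of the library:

    from aiofs import FileLikeSystem  # import path is an assumption

    async def roundtrip(fs: FileLikeSystem) -> bytes:
        """Write a file, read it back, then list and remove it."""
        # Mode "w": the file will only be written.
        async with fs.open("directory/path/filename.bin", mode="w") as f:
            await f.write(b"hello world")

        # Default mode "r": the file will only be read.
        async with fs.open("directory/path/filename.bin") as f:
            data = await f.read()

        print(await fs.ls())                        # filenames accessible in the file system
        await fs.rm("directory/path/filename.bin")  # remove the file again
        return data

Once a concrete file system has been created, the coroutine can be driven with asyncio.run(roundtrip(fs)).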

The following exceptions may be raised (a small handling sketch follows the list):

  • BlockingIOError, if two clients are writing to the same file at the same time
  • FileNotFound, if the file does not exist in the file system
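
For instance, a missing file can be turned into a default value. Whether FileNotFound is importable from aiofs, or is simply an alias of the built-in FileNotFoundError, is an assumption here, as is the read_or_default helper:

    from aiofs import FileNotFound  # import location is an assumption

    async def read_or_default(fs, name: str, default: bytes = b"") -> bytes:
        """Read a file, falling back to a default payload if it does not exist."""
        try:
            async with fs.open(name) as f:   # default mode "r"
                return await f.read()
        except FileNotFound:
            return default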

Installation

Depending on which data source will be used, it may be necessary to install extras:

  1. If you are using the operating system's file system, no extras are necessary.
  2. If you are using Redis as the data system, you'll need to add the redis extra. The directory structure is stored within the key name, e.g. directory/path/filename.bin is mapped to directory:path:filename.bin.
  3. If you are using Azure blobs, you will need to add the azure extra. The directory structure goes directly into the blob name, and the file system is mapped to a single blob container.

An example of manual installation for an Azure environment could be: pip install aiofs[azure]
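
For reference, the three installation variants side by side (the extra is quoted so the command also works in shells such as zsh):

    pip install aiofs            # operating-system file system, no extras
    pip install "aiofs[redis]"   # Redis backend
    pip install "aiofs[azure]"   # Azure blob backend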

TODOs

  • Add support for random access to files.
  • Extend the file system methods for better file handling.
  • Add new data sources.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

aiofs-0.2.5.tar.gz (6.2 kB)

Uploaded Source

Built Distribution

aiofs-0.2.5-py3-none-any.whl (8.0 kB)

Uploaded Python 3

File details

Details for the file aiofs-0.2.5.tar.gz.

File metadata

  • Download URL: aiofs-0.2.5.tar.gz
  • Upload date:
  • Size: 6.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.13.3 Linux/6.14.6-arch1-1

File hashes

Hashes for aiofs-0.2.5.tar.gz
  • SHA256: 3050dca870fb87d8031d2961c9c4bf4c7500601f24186a7a3aae1d3ede4a258f
  • MD5: aabca00e61795fdf998add1d3d27cc50
  • BLAKE2b-256: b8c42ba1a932056f34007d24256a90994774e9fb6488c2498078878a49e98e52

See more details on using hashes here.

File details

Details for the file aiofs-0.2.5-py3-none-any.whl.

File metadata

  • Download URL: aiofs-0.2.5-py3-none-any.whl
  • Upload date:
  • Size: 8.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/2.1.3 CPython/3.13.3 Linux/6.14.6-arch1-1

File hashes

Hashes for aiofs-0.2.5-py3-none-any.whl
  • SHA256: bb8633dbb1b21b9bef3a56e9df1ce1aa0dc4d80b6990b409e94850f9b7b4c918
  • MD5: a886b58fc7c8693d55aa4d558a2a0c31
  • BLAKE2b-256: 17f22e66c867ed793f3b1f07103495934167c46d4f9d6700c948ac4e0e504dfe

See more details on using hashes here.
