
Python Client for Google Cloud Storage

Project description

This is a shared codebase for gcloud-aio-storage and gcloud-rest-storage.


Installation

$ pip install --upgrade gcloud-{aio,rest}-storage

Usage

To upload a file, you might do something like the following:

import aiofiles
import aiohttp
from gcloud.rest.storage import Storage


async with aiohttp.ClientSession() as session:
    client = Storage(session=session)

    async with aiofiles.open('/path/to/my/file', mode="r") as f:
        contents = await f.read()
        status = await client.upload(
            'my-bucket-name',
            'path/to/gcs/folder',
            contents,
        )
        print(status)

Note that there are multiple ways to accomplish the above, e.g. by making use of the Bucket and Blob convenience classes if that better fits your use case.
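
For instance, a download via those convenience classes might look roughly like the following. This is only a sketch: the bucket and object names are placeholders, and the get_bucket / get_blob / download helpers shown here are assumptions about the convenience API rather than a definitive reference.

import aiohttp
from gcloud.rest.storage import Storage


async with aiohttp.ClientSession() as session:
    client = Storage(session=session)

    # Work with Bucket/Blob objects instead of passing raw names around.
    bucket = client.get_bucket('my-bucket-name')              # placeholder bucket
    blob = await bucket.get_blob('path/to/gcs/folder/file')   # placeholder object
    contents = await blob.download()
    print(contents)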

Of course, the major benefit of using an async library is being able to parallelize operations like this. Since gcloud-rest-storage is fully asyncio-compatible, you can use any of the built-in asyncio methods to perform more complicated operations:

import asyncio

import aiofiles
from gcloud.rest.storage import Storage


my_files = {
    '/local/path/to/file.1': 'path/in/gcs.1',
    '/local/path/to/file.2': 'path/in/gcs.2',
    '/local/path/to/file.3': 'different/gcs/path/filename.3',
}

async with Storage() as client:
    # Prepare all our upload data
    uploads = []
    for local_name, gcs_name in my_files.items():
        async with aiofiles.open(local_name, mode="r") as f:
            contents = await f.read()
            uploads.append((gcs_name, contents))

    # Simultaneously upload all files
    await asyncio.gather(
        *[
            client.upload('my-bucket-name', path, file_) for path, file_ in uploads
        ]
    )

You can also refer to the smoke test for more info and examples.

Note that you can also let gcloud-rest-storage do its own session management, so long as you give us a hint when to close that session:

async with Storage() as client:
    ...  # closes the client.session on leaving the context manager

# OR

client = Storage()
# do stuff
await client.close()  # close the session explicitly

File Encodings

In some cases, aiohttp needs to transform the objects returned from GCS into strings, e.g. for debug logging and other such cases. The built-in await response.text() operation relies on chardet for guessing the character encoding in any case where it cannot be determined based on the file metadata.

Unfortunately, this operation can be extremely slow, especially in cases where you might be working with particularly large files. If you notice odd latency issues when reading your results, you may want to set your character encoding more explicitly within GCS, e.g. by ensuring you set the contentType of the relevant objects to something suffixed with ; charset=utf-8. For example, in the case of contentType='application/x-netcdf' files exhibiting latency, you could instead set contentType='application/x-netcdf; charset=utf-8'. See #172 for more info!
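
If you are producing those objects with this library, one option is to set the content type (including the charset) at upload time. A minimal sketch, assuming upload() accepts a content_type keyword argument; the local file and GCS paths are placeholders:

import aiofiles
import aiohttp
from gcloud.rest.storage import Storage


async with aiohttp.ClientSession() as session:
    client = Storage(session=session)

    async with aiofiles.open('/path/to/my/data.nc', mode="rb") as f:
        contents = await f.read()
        # Including the charset lets response.text() skip chardet's
        # (potentially slow) encoding detection when reading the object back.
        await client.upload(
            'my-bucket-name',
            'path/to/gcs/data.nc',
            contents,
            content_type='application/x-netcdf; charset=utf-8',
        )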

Emulators

For testing purposes, you may want to use gcloud-rest-storage along with a local GCS emulator. Setting the $STORAGE_EMULATOR_HOST environment variable to the address of your emulator should be enough to do the trick.

For example, using fsouza/fake-gcs-server, you can do:

docker run -d -p 4443:4443 -v $PWD/my-sample-data:/data fsouza/fake-gcs-server
export STORAGE_EMULATOR_HOST='0.0.0.0:4443'

Any gcloud-rest-storage requests made with that environment variable set will query fake-gcs-server instead of the official GCS API.

Note that some emulation systems require disabling SSL – if you’re using a custom http session, you may need to disable SSL verification.
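
For instance, with the fake-gcs-server setup above (which serves HTTPS with a self-signed certificate), a custom session with verification turned off might look like the following sketch; the bucket name matches the my-sample-data directory mounted into the container above, and list_objects is just one example call:

import os

import aiohttp
from gcloud.rest.storage import Storage

# Point the client at the local emulator (equivalent to the export above).
os.environ['STORAGE_EMULATOR_HOST'] = '0.0.0.0:4443'

# Skip certificate verification for the emulator's self-signed certificate.
async with aiohttp.ClientSession(
        connector=aiohttp.TCPConnector(ssl=False)) as session:
    client = Storage(session=session)
    print(await client.list_objects('my-sample-data'))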

Contributing

Please see our contributing guide.

Download files

Download the file for your platform.

Source Distribution

gcloud-rest-storage-6.1.0.tar.gz (18.7 kB, Source)

Built Distribution

gcloud_rest_storage-6.1.0-py2.py3-none-any.whl (25.1 kB, Python 2, Python 3)

File details

Details for the file gcloud-rest-storage-6.1.0.tar.gz.

File metadata

  • Download URL: gcloud-rest-storage-6.1.0.tar.gz
  • Upload date:
  • Size: 18.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.14.0 pkginfo/1.7.0 requests/2.25.1 setuptools/57.0.0 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.9.5

File hashes

Hashes for gcloud-rest-storage-6.1.0.tar.gz:

  • SHA256: e8832a46863bc9a27021245c3e2cf9f4da57f220f6f6fcead0eaabdf48c20406
  • MD5: 380c40f106a24f70822cf12e258412b1
  • BLAKE2b-256: 632903aef8e0a3af4abdc8a7e8b3530131e21b4f10af251fc8d6d16827b48342


File details

Details for the file gcloud_rest_storage-6.1.0-py2.py3-none-any.whl.

File metadata

  • Download URL: gcloud_rest_storage-6.1.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 25.1 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/1.14.0 pkginfo/1.7.0 requests/2.25.1 setuptools/57.0.0 requests-toolbelt/0.9.1 tqdm/4.61.0 CPython/3.9.5

File hashes

Hashes for gcloud_rest_storage-6.1.0-py2.py3-none-any.whl:

  • SHA256: 6f33fe43f4d2be60238d1e6922284ff985d62c1c93bfc11740a46a27d775fe5e
  • MD5: bb40b6f060c2e35e38d5c4759b34b757
  • BLAKE2b-256: e076e2bcfe03a9510620af3014bc2fa5b1c64d6eaa7266596d8e46f33e427a84

