
aws_logging_handlers

Project description

A multithreaded Python logging handler package that streams log records to AWS services, with support for:

  • S3

  • Kinesis

Supports gzip compression (in S3)
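To illustrate what gzip compression buys for typical log text, here is a minimal standard-library sketch; the handler applies compression transparently, and the internals shown here are an assumption, not the package's actual code:

```python
import gzip

# A chunk of repetitive log text, similar to what the handler buffers
log_chunk = ("[2024-01-01 00:00:00] app.py:42 INFO - test info message\n" * 1000).encode("utf-8")

# Compress the chunk before upload; log text is highly repetitive,
# so gzip typically shrinks it dramatically
compressed = gzip.compress(log_chunk)

print(len(log_chunk), len(compressed))
```

Because log lines repeat heavily, the compressed object is a small fraction of the original size, which directly lowers S3 storage and transfer costs.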

Getting Started


Prerequisites

Asynchronous multipart uploading relies on the ability to use multiple threads.



Installation using pip

pip install aws-logging-handlers


Stream log records to S3 and Kinesis

import logging
from aws_logging_handlers.S3 import S3Handler
from aws_logging_handlers.Kinesis import KinesisHandler

bucket="test_bucket" # The bucket should already exist

# The log is rotated to a new object either when the current object reaches 5 MB
# or when 120 seconds have passed since the last rotation (or initial logging)
s3_handler = S3Handler("test_log", bucket, workers=3)
kinesis_handler = KinesisHandler('log_test', 'us-east-1', workers=1)
formatter = logging.Formatter('[%(asctime)s] %(filename)s:%(lineno)d %(levelname)s - %(message)s')
s3_handler.setFormatter(formatter)
kinesis_handler.setFormatter(formatter)
logger = logging.getLogger('root')
logger.setLevel(logging.INFO)
logger.addHandler(s3_handler)
logger.addHandler(kinesis_handler)

for i in range(0, 100000):"test info message")
    logger.warning("test warning message")
    logger.error("test error message")
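The rotation rule described in the comment above, a new object at 5 MB or after 120 seconds, amounts to a simple predicate. A sketch with hypothetical names (not the package's actual internals):

```python
import time

MAX_BYTES = 5 * 1024 * 1024   # rotate when the buffered object reaches 5 MB
MAX_SECONDS = 120             # ...or 120 seconds after the last rotation

def should_rotate(buffered_bytes, last_rotation_ts, now=None):
    """Return True when the current log object should be closed and a new one started."""
    now = time.time() if now is None else now
    return buffered_bytes >= MAX_BYTES or (now - last_rotation_ts) >= MAX_SECONDS

print(should_rotate(6 * 1024 * 1024, time.time()))  # size threshold exceeded
print(should_rotate(1024, time.time() - 200))       # age threshold exceeded
print(should_rotate(1024, time.time()))             # neither threshold reached
```

Whichever threshold trips first wins, so a quiet logger still rotates on time and a busy one still rotates on size.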


To be developed

  • Support for asyncio

  • Logging and upload metrics


This project is licensed under the MIT License; see the license file for details.


Download files


Source Distribution

aws-logging-handlers-2.0.5.tar.gz (8.5 kB)

