
aws_logging_handlers — multithreaded logging handlers that stream records to AWS S3 and Kinesis

Project description

A Python multithreaded logging handler package that streams log records to AWS. The following AWS services are supported:

  • S3
  • Kinesis

Supports gzip compression (in S3)
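As a rough illustration of what gzip compression buys for log data (a sketch using only the standard library — the helper name is hypothetical and not part of this package's API), the snippet below compresses a batch of log lines before they would be uploaded:

```python
import gzip

def compress_log_batch(lines):
    """Hypothetical helper: gzip-compress a batch of log lines for upload."""
    payload = "\n".join(lines).encode("utf-8")
    return gzip.compress(payload)

batch = ["[2024-01-01 12:00:00] INFO - test info message"] * 1000
compressed = compress_log_batch(batch)

# Repetitive log text compresses very well, so the compressed payload
# is much smaller than the raw joined lines
print(len(compressed) < len("\n".join(batch).encode("utf-8")))  # True
```

Because log records tend to repeat the same format strings and level names, compressing before upload typically cuts storage and transfer costs substantially.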

Getting Started


Asynchronous multipart uploading relies on the ability to use multiple threads.
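To illustrate the idea behind multithreaded multipart uploading (a sketch, not this library's implementation — `upload_part` is a stand-in for a real S3 multipart call), completed buffer chunks can be handed to a pool of worker threads so uploads overlap with ongoing logging:

```python
from concurrent.futures import ThreadPoolExecutor

def upload_part(part_number, data):
    # Stand-in for an S3 multipart upload call; here we just report the size.
    return (part_number, len(data))

chunks = [b"x" * 1024, b"y" * 2048, b"z" * 512]

# Several workers upload parts concurrently while the logger keeps writing.
with ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(upload_part, i, chunk)
               for i, chunk in enumerate(chunks, start=1)]
    results = [f.result() for f in futures]

print(sorted(results))  # [(1, 1024), (2, 2048), (3, 512)]
```

This is why the handlers accept a `workers` argument: more workers allow more parts to be uploaded in parallel.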



Installation using pip

pip install aws-logging-handlers


Stream log records to S3 and Kinesis

import logging
from aws_logging_handlers import S3Handler, KinesisHandler

bucket = "test_bucket"  # The bucket should already exist

# The log is rotated to a new object either when it reaches 5 MB or when
# 120 seconds have passed since the last rotation (or since logging started)
s3_handler = S3Handler("test_log", bucket, workers=3)
kinesis_handler = KinesisHandler('log_test', 'us-east-1', workers=1)
formatter = logging.Formatter('[%(asctime)s] {%(filename)s:%(lineno)d} %(levelname)s - %(message)s')

logger = logging.getLogger('root')
logger.setLevel(logging.INFO)

# Attach the formatter and handlers so records actually reach S3 and Kinesis
for handler in (s3_handler, kinesis_handler):
    handler.setFormatter(formatter)
    logger.addHandler(handler)

for i in range(100000):"test info message")
    logger.warning("test warning message")
    logger.error("test error message")
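The rotation rule described in the comment above (a new object at 5 MB or after 120 seconds) can be sketched as a simple two-condition check. The names below are hypothetical and do not reflect the library's internals:

```python
import time

MAX_BYTES = 5 * 1024 * 1024  # 5 MB size threshold
MAX_AGE_SECONDS = 120        # 120-second time threshold

def should_rotate(bytes_written, last_rotation_ts, now=None):
    """Hypothetical check: rotate when either threshold is crossed."""
    now = time.time() if now is None else now
    return (bytes_written >= MAX_BYTES
            or (now - last_rotation_ts) >= MAX_AGE_SECONDS)

print(should_rotate(6 * 1024 * 1024, 0, now=10))  # True: over 5 MB
print(should_rotate(1024, 0, now=130))            # True: older than 120 s
print(should_rotate(1024, 0, now=10))             # False: neither threshold hit
```

The time-based condition guarantees that a quiet logger still flushes its records to a new object at least every two minutes.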

To be developed

  • Support for asyncio
  • Logging and upload metrics


This project is licensed under the MIT License - see the LICENSE file for details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Filename & size File type Python version
aws-logging-handlers-2.0.2.tar.gz (9.0 kB) Source None
