Django DataDog Logger
Django DataDog Logger integration package.
- Free software: MIT license
- Documentation: https://django-datadog-logger.readthedocs.io.
Quick start
Set up the request id tracking middleware (first in the list) and the logging middlewares (at the end):
    MIDDLEWARE = [
        "django_datadog_logger.middleware.request_id.RequestIdMiddleware",
        # ...
        "django_datadog_logger.middleware.error_log.ErrorLoggingMiddleware",
        "django_datadog_logger.middleware.request_log.RequestLoggingMiddleware",
    ]
Configure LOGGING in your Django settings file:
    API_LOG_ROOT = env.str("API_LOG_ROOT")

    LOGGING = {
        "version": 1,
        "disable_existing_loggers": False,
        "formatters": {
            "console": {"format": "{levelname} {message}", "style": "{"},
            "json": {"()": "django_datadog_logger.formatters.datadog.DataDogJSONFormatter"},
        },
        "handlers": {
            "console": {"level": "INFO", "class": "logging.StreamHandler", "formatter": "console"},
            "application": {
                "level": API_LOG_APPLICATION_LEVEL,
                "class": "logging.FileHandler",
                "filename": os.path.join(API_LOG_ROOT, "api.application.log"),
                "formatter": "json",
            },
            "state": {
                "level": API_LOG_STATE_LEVEL,
                "class": "logging.FileHandler",
                "filename": os.path.join(API_LOG_ROOT, "api.state.log"),
                "formatter": "json",
            },
            "request": {
                "level": API_LOG_REQUEST_LEVEL,
                "class": "logging.FileHandler",
                "filename": os.path.join(API_LOG_ROOT, "api.request.log"),
                "formatter": "json",
            },
            "session": {
                "level": API_LOG_SESSION_LEVEL,
                "class": "logging.FileHandler",
                "filename": os.path.join(API_LOG_ROOT, "api.session.log"),
                "formatter": "json",
            },
            "error": {
                "level": API_LOG_ERROR_LEVEL,
                "class": "logging.FileHandler",
                "filename": os.path.join(API_LOG_ROOT, "api.error.log"),
                "formatter": "json",
            },
        },
        "loggers": {
            "": {"handlers": ["console", "error"], "level": "DEBUG", "propagate": True},
            "ddtrace": {"handlers": ["error"], "level": "ERROR", "propagate": False},
            "django.db.backends": {"handlers": ["error"], "level": "ERROR", "propagate": False},
            "twilio": {"handlers": ["error"], "level": "ERROR", "propagate": False},
            "my_project": {"handlers": ["application"], "level": "INFO", "propagate": False},
            "my_project.throttling": {"handlers": ["application"], "level": "DEBUG", "propagate": False},
            "my_project.vehicles.viewsets.state": {"handlers": ["state"], "level": "INFO", "propagate": False},
            "my_project.accounts.session": {"handlers": ["session"], "level": "DEBUG", "propagate": False},
            "my_project.session": {"handlers": ["session"], "level": "DEBUG", "propagate": False},
            "django_auth_ldap": {"level": "DEBUG", "handlers": ["session"], "propagate": False},
            "django_datadog_logger.middleware.error_log": {"handlers": ["error"], "level": "INFO", "propagate": False},
            "django_datadog_logger.middleware.request_log": {"handlers": ["request"], "level": "INFO", "propagate": False},
            "django_datadog_logger.rest_framework": {"handlers": ["application"], "level": "INFO", "propagate": False},
        },
    }

    DJANGO_DATADOG_LOGGER_EXTRA_INCLUDE = r"^(django_datadog_logger|my_project)(|\..+)$"
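The DJANGO_DATADOG_LOGGER_EXTRA_INCLUDE setting is a regex of logger names; the assumption here is that, for matching loggers, extra attributes passed to a logging call are copied into the JSON log entry. A minimal sketch under that assumption (the logger name and extra keys are purely illustrative):

    import logging

    # "my_project.orders" matches the EXTRA_INCLUDE regex above, so the extra
    # keys passed here would show up as additional fields in the JSON entry.
    logger = logging.getLogger("my_project.orders")
    logger.info("Order created", extra={"order_id": 42, "customer_id": 7})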
Add the Celery logger configuration and the request_id tracking decorator to your tasks:
    import logging

    from celery import Celery, shared_task
    from celery.result import AsyncResult
    from celery.signals import after_setup_logger, after_setup_task_logger
    from django.conf import settings

    from django_datadog_logger.celery import store_celery_request

    logger = logging.getLogger(__name__)


    @after_setup_logger.connect
    def on_after_setup_logger(logger, *args, **kwargs):
        from django_datadog_logger.formatters.datadog import DataDogJSONFormatter

        if settings.API_LOG_CELERY_JSON:
            formatter = DataDogJSONFormatter()
            for handler in list(logger.handlers):
                handler.setFormatter(formatter)
                handler.setLevel(settings.API_LOG_CELERY_LEVEL)


    @after_setup_task_logger.connect
    def on_after_setup_task_logger(logger, *args, **kwargs):
        from django_datadog_logger.formatters.datadog import DataDogJSONFormatter

        if settings.API_LOG_CELERY_JSON:
            formatter = DataDogJSONFormatter()
            for handler in list(logger.handlers):
                handler.setFormatter(formatter)
                handler.setLevel(settings.API_LOG_CELERY_LEVEL)


    app = Celery("my_project")

    # Using a string here means the worker will not have to
    # pickle the object when using Windows.
    app.config_from_object("django.conf:settings", namespace="CELERY")
    app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


    @shared_task(bind=True)
    @store_celery_request
    def debug_task(self):
        print("Request: {0!r}".format(self.request))
        logger.critical("CRITICAL", extra={"level": "CRITICAL"})
        logger.error("ERROR", extra={"level": "ERROR"})
        logger.warning("WARNING", extra={"level": "WARNING"})
        logger.info("INFO", extra={"level": "INFO"})
        logger.debug("DEBUG", extra={"level": "DEBUG"})
        return 42
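With a broker and a worker running, the task above can be invoked like any other Celery task; log records emitted inside it are rendered as JSON by the handlers reconfigured in the signal receivers. A minimal usage sketch (the timeout value is arbitrary):

    # Enqueue the task and wait for its result; the worker's log output for
    # this task is formatted by DataDogJSONFormatter when API_LOG_CELERY_JSON
    # is enabled in settings.
    result = debug_task.delay()
    print(result.get(timeout=10))  # -> 42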
ddtrace
The ddtrace library has an option to inject tracing context data into log records: https://ddtrace.readthedocs.io/en/stable/advanced_usage.html#logs-injection
This library's formatter looks for those attributes and automatically adds them to the log entry it creates.
    # log.py

    # Patch the logging library to inject dd.* attributes on log records
    import ddtrace

    ddtrace.patch(logging=True)

    # Configure the root logger with the DataDogJSONFormatter
    import logging

    from django_datadog_logger.formatters.datadog import DataDogJSONFormatter

    logger = logging.root
    handler = logging.StreamHandler()
    handler.formatter = DataDogJSONFormatter()
    logger.addHandler(handler)
    logger.setLevel(logging.INFO)

    # Log a test message
    logger.info("test")
    $ DD_SERVICE=django DD_ENV=test DD_VERSION=1234 python log.py
    {"message": "test", "logger.name": "root", "logger.thread_name": "MainThread", "logger.method_name": "<module>", "syslog.timestamp": "2021-08-23T18:26:10.391099+00:00", "syslog.severity": "INFO", "dd.version": "1234", "dd.env": "test", "dd.service": "django", "dd.trace_id": "0", "dd.span_id": "0"}
If you remove the call to ddtrace.patch(logging=True) you end up with:
    $ python log.py
    {"message": "test", "logger.name": "root", "logger.thread_name": "MainThread", "logger.method_name": "<module>", "syslog.timestamp": "2021-08-23T18:27:47.951461+00:00", "syslog.severity": "INFO"}
Credits
This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.
History
0.5.0 (2021-10-15)
- DataDog JSON formatter: support for Celery v5+
0.4.0 (2021-08-27)
- Enhancement: Updated formatting in README.rst #5
- Enhancement: Extract and add dd.* attributes from log record to log entry dict #6
- Fixed: KeyError because a dict appears where a list is expected #7
0.3.5 (2021-06-14)
- Prevent recursion when warnings are logged whilst accessing WSGI request.user
0.3.4 (2021-06-14)
- Fixed import error for future package
0.3.3 (2020-11-04)
- Added support for incoming HTTP X-Request-ID header values
0.3.2 (2020-04-24)
- Respect User.USERNAME_FIELD
0.3.1 (2020-04-24)
- Removed API_LOG_REQUEST_DURATION_WARN_SECONDS
0.3.0 (2020-04-15)
- Improved logging of Celery "task received" messages.
- Removed RequestIdFilter (not needed anymore).
0.2.0 (2020-04-14)
- Added Celery request local.
0.1.0 (2020-02-17)
- First release on PyPI.
Download files
Hashes for django-datadog-logger-0.5.0.tar.gz (source distribution)

Algorithm | Hash digest
---|---
SHA256 | 7b901fae07732ae70daf0a9e1ce72d713d8caa004c2b0612f83157edb1d5d588
MD5 | 8f59d951dbab7fde83259f473691fd37
BLAKE2-256 | 55126ce930da5d698c04f90d08857962e11580d0fe6a635b1c10933a142e367a
Hashes for django_datadog_logger-0.5.0-py2.py3-none-any.whl (built distribution)

Algorithm | Hash digest
---|---
SHA256 | 8ec6e8a769a1da828046c021372bf21e632d08ab41a34735315a62d584079b18
MD5 | 399b22d50587e61434973ea059583019
BLAKE2-256 | 12b62dec6246a301a8b97d717d1a187077ffc362ae96cc858f6b8ca08c49d9be