Django DataDog Logger
Django DataDog Logger integration package.
Free software: MIT license
Documentation: https://django-datadog-logger.readthedocs.io.
Quick start
Set up the request ID tracking middleware (at the front of the list) and the logging middlewares (at the end):
MIDDLEWARE = [
    "django_datadog_logger.middleware.request_id.RequestIdMiddleware",
    # ...
    "django_datadog_logger.middleware.error_log.ErrorLoggingMiddleware",
    "django_datadog_logger.middleware.request_log.RequestLoggingMiddleware",
]
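Once these middlewares are installed, ordinary logging calls made while handling a request are picked up by the JSON formatter configured below. A minimal sketch for illustration (the view and logger names here are hypothetical):

# views.py (illustrative only)
import logging

from django.http import JsonResponse

logger = logging.getLogger(__name__)


def health(request):
    # With RequestIdMiddleware at the front of MIDDLEWARE, log entries
    # produced while handling this request can be correlated via the
    # request id tracked by django_datadog_logger.
    logger.info("health check requested")
    return JsonResponse({"status": "ok"})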
Configure LOGGING in your Django settings file:
import os

# env.str() assumes django-environ (or a compatible helper) is available;
# the level settings below are referenced by the handlers and their
# defaults here are illustrative.
import environ

env = environ.Env()

API_LOG_ROOT = env.str("API_LOG_ROOT")
API_LOG_APPLICATION_LEVEL = env.str("API_LOG_APPLICATION_LEVEL", default="INFO")
API_LOG_STATE_LEVEL = env.str("API_LOG_STATE_LEVEL", default="INFO")
API_LOG_REQUEST_LEVEL = env.str("API_LOG_REQUEST_LEVEL", default="INFO")
API_LOG_SESSION_LEVEL = env.str("API_LOG_SESSION_LEVEL", default="INFO")
API_LOG_ERROR_LEVEL = env.str("API_LOG_ERROR_LEVEL", default="ERROR")
LOGGING = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "console": {"format": "{levelname} {message}", "style": "{"},
        "json": {"()": "django_datadog_logger.formatters.datadog.DataDogJSONFormatter"},
    },
    "handlers": {
        "console": {"level": "INFO", "class": "logging.StreamHandler", "formatter": "console"},
        "application": {
            "level": API_LOG_APPLICATION_LEVEL,
            "class": "logging.FileHandler",
            "filename": os.path.join(API_LOG_ROOT, "api.application.log"),
            "formatter": "json",
        },
        "state": {
            "level": API_LOG_STATE_LEVEL,
            "class": "logging.FileHandler",
            "filename": os.path.join(API_LOG_ROOT, "api.state.log"),
            "formatter": "json",
        },
        "request": {
            "level": API_LOG_REQUEST_LEVEL,
            "class": "logging.FileHandler",
            "filename": os.path.join(API_LOG_ROOT, "api.request.log"),
            "formatter": "json",
        },
        "session": {
            "level": API_LOG_SESSION_LEVEL,
            "class": "logging.FileHandler",
            "filename": os.path.join(API_LOG_ROOT, "api.session.log"),
            "formatter": "json",
        },
        "error": {
            "level": API_LOG_ERROR_LEVEL,
            "class": "logging.FileHandler",
            "filename": os.path.join(API_LOG_ROOT, "api.error.log"),
            "formatter": "json",
        },
    },
    "loggers": {
        "": {"handlers": ["console", "error"], "level": "DEBUG", "propagate": True},
        "ddtrace": {"handlers": ["error"], "level": "ERROR", "propagate": False},
        "django.db.backends": {"handlers": ["error"], "level": "ERROR", "propagate": False},
        "twilio": {"handlers": ["error"], "level": "ERROR", "propagate": False},
        "my_project": {"handlers": ["application"], "level": "INFO", "propagate": False},
        "my_project.throttling": {"handlers": ["application"], "level": "DEBUG", "propagate": False},
        "my_project.vehicles.viewsets.state": {"handlers": ["state"], "level": "INFO", "propagate": False},
        "my_project.accounts.session": {"handlers": ["session"], "level": "DEBUG", "propagate": False},
        "my_project.session": {"handlers": ["session"], "level": "DEBUG", "propagate": False},
        "django_auth_ldap": {"level": "DEBUG", "handlers": ["session"], "propagate": False},
        "django_datadog_logger.middleware.error_log": {"handlers": ["error"], "level": "INFO", "propagate": False},
        "django_datadog_logger.middleware.request_log": {"handlers": ["request"], "level": "INFO", "propagate": False},
        "django_datadog_logger.rest_framework": {"handlers": ["application"], "level": "INFO", "propagate": False},
    },
}
DJANGO_DATADOG_LOGGER_EXTRA_INCLUDE = r"^(django_datadog_logger|my_project)(|\..+)$"
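DJANGO_DATADOG_LOGGER_EXTRA_INCLUDE is a regular expression matched against logger names; it controls which loggers may contribute extra attributes to the JSON log entries. A minimal sketch, assuming the pattern above (the logger name and extra fields are hypothetical):

import logging

logger = logging.getLogger("my_project.payments")

# "my_project.payments" matches the DJANGO_DATADOG_LOGGER_EXTRA_INCLUDE
# pattern above, so the keys passed via extra= should appear as extra
# attributes in the JSON entry written by DataDogJSONFormatter.
logger.info("payment captured", extra={"payment_id": 1234, "amount": "10.00"})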
Add the Celery logger configuration and the store_celery_request decorator (for request ID tracking) to your tasks:
import logging

from celery import Celery, shared_task
from celery.result import AsyncResult
from celery.signals import after_setup_logger, after_setup_task_logger
from django.conf import settings

from django_datadog_logger.celery import store_celery_request

logger = logging.getLogger(__name__)


@after_setup_logger.connect
def on_after_setup_logger(logger, *args, **kwargs):
    from django_datadog_logger.formatters.datadog import DataDogJSONFormatter

    if settings.API_LOG_CELERY_JSON:
        formatter = DataDogJSONFormatter()
        for handler in list(logger.handlers):
            handler.setFormatter(formatter)
            handler.setLevel(settings.API_LOG_CELERY_LEVEL)


@after_setup_task_logger.connect
def on_after_setup_task_logger(logger, *args, **kwargs):
    from django_datadog_logger.formatters.datadog import DataDogJSONFormatter

    if settings.API_LOG_CELERY_JSON:
        formatter = DataDogJSONFormatter()
        for handler in list(logger.handlers):
            handler.setFormatter(formatter)
            handler.setLevel(settings.API_LOG_CELERY_LEVEL)


app = Celery("my_project")

# Using a string here means the worker will not have to
# pickle the object when using Windows.
app.config_from_object("django.conf:settings", namespace="CELERY")
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)


@shared_task(bind=True)
@store_celery_request
def debug_task(self):
    print("Request: {0!r}".format(self.request))
    logger.critical("CRITICAL", extra={"level": "CRITICAL"})
    logger.error("ERROR", extra={"level": "ERROR"})
    logger.warning("WARNING", extra={"level": "WARNING"})
    logger.info("INFO", extra={"level": "INFO"})
    logger.debug("DEBUG", extra={"level": "DEBUG"})
    return 42
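A minimal usage sketch for the task above (broker and result backend configuration are assumed to be in place):

# Queue the task; with store_celery_request applied, log entries emitted
# inside the task run through the same DataDog JSON formatting set up in
# the after_setup_*_logger handlers above.
result = debug_task.delay()
print(result.get(timeout=10))  # prints 42 once a worker has run the task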
ddtrace
The ddtrace library has an option to inject tracing context data into log records: https://ddtrace.readthedocs.io/en/stable/advanced_usage.html#logs-injection
This library includes a helper that looks for those attributes on the log record and adds them automatically to the log entry it creates.
# log.py
# Patch logging library to inject dd.* attributes on log records
import ddtrace
ddtrace.patch(logging=True)
# Configure logger with DataDogJSONFormatter
import logging
from django_datadog_logger.formatters.datadog import DataDogJSONFormatter
logger = logging.root
handler = logging.StreamHandler()
handler.formatter = DataDogJSONFormatter()
logger.addHandler(handler)
logger.setLevel(logging.INFO)
# Log a test message
logger.info("test")
$ DD_SERVICE=django DD_ENV=test DD_VERSION=1234 python log.py
{"message": "test", "logger.name": "root", "logger.thread_name": "MainThread", "logger.method_name": "<module>", "syslog.timestamp": "2021-08-23T18:26:10.391099+00:00", "syslog.severity": "INFO", "dd.version": "1234", "dd.env": "test", "dd.service": "django", "dd.trace_id": "0", "dd.span_id": "0"}
If you remove the call to ddtrace.patch(logging=True), you end up with:
$ python log.py
{"message": "test", "logger.name": "root", "logger.thread_name": "MainThread", "logger.method_name": "<module>", "syslog.timestamp": "2021-08-23T18:27:47.951461+00:00", "syslog.severity": "INFO"}
Credits
This package was created with Cookiecutter and the audreyr/cookiecutter-pypackage project template.
History
0.5.4 (2023-01-16)
Added support for HTTP Accept header as http.accept
0.5.3 (2022-12-19)
Added support for JWT cid claim
0.5.2 (2022-11-23)
Fixed: don’t let the logger throw a DisallowedHost error
0.5.1 (2022-08-09)
Fixed: ActionLoginMixin class methods perform_create and perform_update now call super(); removed the atomic transaction
0.5.0 (2021-10-20)
Added support for Celery v5+
0.4.0 (2021-08-27)
Enhancement: Updated formatting in README.rst #5
Enhancement: Extract and add dd.* attributes from log record to log entry dict #6
Fixed: KeyError because a dict appears where a list is expected #7
0.3.5 (2021-06-14)
Prevent recursion when warnings are logged whilst accessing WSGI request.user
0.3.4 (2021-06-14)
Fixed import error for future package
0.3.3 (2020-11-04)
Added support for incoming HTTP X-Request-ID header values
0.3.2 (2020-04-24)
Respect User.USERNAME_FIELD
0.3.1 (2020-04-24)
Removed API_LOG_REQUEST_DURATION_WARN_SECONDS
0.3.0 (2020-04-15)
Improved Celery task received messages logging.
Removed RequestIdFilter (not needed anymore).
0.2.0 (2020-04-14)
Added Celery request local.
0.1.0 (2020-02-17)
First release on PyPI.