
LogLite

A lightweight, high-performance logging service that stores log data in SQLite with HTTP APIs for log insertion and querying.

Features

  • Lightweight & Efficient: Built with performance in mind using fully async libraries.
  • Web Interfaces: Insert and query logs via straightforward REST endpoints.
  • SQLite Backend: Store log messages in SQLite, enabling fast and complex queries.
  • Database Migrations: Built-in migration utilities to manage database schema changes.

Installation

pip install loglite

Configuration

LogLite requires a YAML configuration file. Here's a sample configuration:

# Server configuration
host: 0.0.0.0  # Web API server bind host
port: 7788  # Web API server bind port
debug: true  # More verbose logging when enabled
log_table_name: Log  # Name of the main log entry table in SQLite
db_dir: ./db  # Directory for SQLite database
allow_origin: "*"  # CORS configuration (default: *)

# Database migrations
migrations:
  - version: 1  # Incremental migration version
    rollout:  # Raw SQLite statements
      - |
        CREATE TABLE Log (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            timestamp DATETIME NOT NULL,
            message TEXT NOT NULL,
            level TEXT NOT NULL CHECK (level IN ('DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL')),
            service TEXT NOT NULL,
            filename TEXT,
            path TEXT,
            line INTEGER,
            function TEXT,
            pid INTEGER,
            process_name TEXT,
            extra JSON
        );
      - CREATE INDEX IF NOT EXISTS idx_timestamp ON Log(timestamp);
      - CREATE INDEX IF NOT EXISTS idx_level ON Log(level);
      - CREATE INDEX IF NOT EXISTS idx_service ON Log(service);
    rollback:  # SQL statements to apply when rolling back
      - DROP INDEX IF EXISTS idx_service;
      - DROP INDEX IF EXISTS idx_level;
      - DROP INDEX IF EXISTS idx_timestamp;
      - DROP TABLE IF EXISTS Log;
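
Later schema changes are added as new entries in the migrations list. As an illustration, a hypothetical version-2 migration that adds a hostname column might look like this (the column and index names are made up for the example; note that ALTER TABLE ... DROP COLUMN requires SQLite 3.35 or newer):

```yaml
migrations:
  # ... version 1 as above ...
  - version: 2
    rollout:
      - ALTER TABLE Log ADD COLUMN hostname TEXT;
      - CREATE INDEX IF NOT EXISTS idx_hostname ON Log(hostname);
    rollback:
      - DROP INDEX IF EXISTS idx_hostname;
      - ALTER TABLE Log DROP COLUMN hostname;
```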

Required Configuration Items

  • migrations: At least one migration must be defined, with version, rollout, and rollback entries
    • version: A unique integer for each migration
    • rollout: SQL statements to apply when migrating forward
    • rollback: SQL statements to apply when rolling back

Other items are optional with sensible defaults.

Usage

Running Migrations

Before starting the server, you need to apply migrations to set up the database schema:

loglite migrate rollout -c /path/to/config.yaml

Starting the Server

To start the LogLite server:

loglite server run -c /path/to/config.yaml

Rolling Back Migrations

If you need to roll back a specific migration version (e.g., version id = 3):

loglite migrate rollback -c /path/to/config.yaml -v 3

Add the -f flag to force rollback without confirmation.

API Endpoints

POST /logs

curl -X POST http://localhost:7788/logs \
  -H "Content-Type: application/json" \
  -d '{
    "timestamp": "2023-04-01T12:34:56",
    "message": "This is a test log message",
    "level": "INFO",
    "service": "my-service",
    "extra": {"request_id": "12345"}
  }'
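
The same request can be made from Python. Here is a minimal client sketch using only the standard library; the helper names build_log_entry and post_log are illustrative, not part of LogLite:

```python
# Minimal client sketch for POST /logs, following the curl example above.
import json
import urllib.request


def build_log_entry(message, level, service, timestamp, extra=None):
    """Assemble a log entry dict matching the POST /logs payload."""
    entry = {
        "timestamp": timestamp,
        "message": message,
        "level": level,
        "service": service,
    }
    if extra is not None:
        entry["extra"] = extra
    return entry


def post_log(entry, base_url="http://localhost:7788"):
    """Send one log entry to a running LogLite server and return its JSON reply."""
    req = urllib.request.Request(
        f"{base_url}/logs",
        data=json.dumps(entry).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```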

GET /logs

Query logs with filters. Each filter query parameter combines a field, an operation (=, ~=, !=, >=, <=, >, <), and a value. The following special query parameters take no operator; just provide the exact value:

  • fields: Comma-separated list of fields to return, defaults to "*" (select all fields).
  • limit: Maximum number of logs to return.
  • offset: Offset in the result set.

Example request:

curl "http://localhost:7788/logs?fields=message,timestamp&limit=2&offset=0&level=>INFO&service==backend&timestamp=>=2023-04-01T00:00:00"

Example response:

{
    "status": "success",
    "result": {
        "total": 3,
        "offset": 0,
        "limit": 2,
        "results": [
            {
                "timestamp": "2025-03-06T10:44:04.207515",
                "message": "hello world!"
            },
            {
                "timestamp": "2025-03-08T11:44:04.207515",
                "message": "hello world!"
            }
        ]
    }
}
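
Building the filter query string programmatically can be sketched as below; the operator is simply prefixed to the value, as in the curl example (URL encoding turns `level=>INFO` into `level=%3EINFO`, which decodes to the same thing server-side). build_query is an illustrative helper, not part of LogLite:

```python
# Sketch: build a GET /logs query string from a list of (field, op, value) filters.
from urllib.parse import urlencode


def build_query(filters, fields="*", limit=None, offset=None):
    """filters: iterable of (field, op, value); op is one of =, ~=, !=, >=, <=, >, <."""
    params = [("fields", fields)]
    if limit is not None:
        params.append(("limit", str(limit)))
    if offset is not None:
        params.append(("offset", str(offset)))
    for field, op, value in filters:
        # Operator and value are concatenated into one parameter value.
        params.append((field, f"{op}{value}"))
    return urlencode(params)
```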

TODO:

  • Add basic documentation
  • Customize SQLite configuration
  • Mark some columns as "enums": silently create an "Enums" table that the main log table points to, and gradually grow it to capture all distinct values of the column. This will greatly reduce storage space 👍.
  • Partition SQLite databases by date or month
  • Allow redirecting logs to local file
  • Add tests

License

MIT


