
LogLite

A lightweight, high-performance logging service with SQLite and async RESTful API.

  • ⚡️ Lightweight & Efficient: Built with performance in mind using fully async libraries (aiohttp, aiofiles) and orjson to speed up JSON serialization.
  • 🔧 Fully customizable table schema: LogLite makes no assumptions about the log table structure; define your own schema to fit your needs.
  • 💾 SQLite Backend: Store log messages in SQLite, enabling efficient and complex queries.
  • 🔄 Database Migrations: Built-in migration utilities to manage database schema changes.
  • 🌐 Web API: RESTful endpoints for log ingestion and querying. Supports server-sent events (SSE) for real-time log streaming.
  • ✨✨✨ More cool features in my wishlist:
    • Bulk insert: Buffer log entries in memory and bulk insert them into the database after a short interval or once a size limit is reached.
    • Column-based compression: Mark some columns as "enums" and silently create an "Enums" table that the main log table points to. Gradually grow the enums table to capture all distinct values of that column.
    • Time-based partitioning: One SQLite database per date or month.
    • Just a logging handler: Allow LogLite to be used as a basic logging handler without the Web API part.
    • Log redirection: When used as a service, allow redirecting logs to a local file or another external sink.
    • More ingestion interfaces: Support log ingestion through ZeroMQ, TCP sockets, and Unix sockets.
    • CLI utilities: More CLI utilities to directly query the database and export the query results to a file.
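
The bulk-insert idea from the wishlist could be sketched roughly as follows. LogBuffer is purely illustrative and not part of LogLite: it buffers rows in memory and flushes them in a single executemany call once a row limit or age limit is reached.

```python
import sqlite3
import time


class LogBuffer:
    """Illustrative sketch: buffer log rows, flush in bulk on size or age."""

    def __init__(self, conn, max_rows=100, max_age_s=0.5):
        self.conn = conn
        self.max_rows = max_rows
        self.max_age_s = max_age_s
        self.rows = []
        self.first_at = None

    def add(self, timestamp, level, message):
        # Remember when the buffer started filling, for the age-based flush.
        if not self.rows:
            self.first_at = time.monotonic()
        self.rows.append((timestamp, level, message))
        if len(self.rows) >= self.max_rows or time.monotonic() - self.first_at >= self.max_age_s:
            self.flush()

    def flush(self):
        # One executemany per batch instead of one INSERT per log entry.
        if self.rows:
            self.conn.executemany(
                "INSERT INTO Log (timestamp, level, message) VALUES (?, ?, ?)",
                self.rows,
            )
            self.conn.commit()
            self.rows.clear()


# Demo with a simplified Log table (not the full schema from the migrations below).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Log (timestamp TEXT, level TEXT, message TEXT)")
buf = LogBuffer(conn, max_rows=2)
buf.add("2023-04-01T00:00:00", "INFO", "a")
buf.add("2023-04-01T00:00:01", "INFO", "b")  # second add triggers the flush
```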

Installation

pip install loglite

Configuration

LogLite requires a YAML configuration file. Here's a sample configuration:

# Web API server bind host
host: 0.0.0.0
# Web API server bind port
port: 7788
# More verbose logging when enabled
debug: true
# Name of the main log entry table in SQLite, **required**
log_table_name: Log
# Name of the column storing the log timestamp, used for removing logs older than N days (default: timestamp)
log_timestamp_field: timestamp
# Directory for SQLite database
sqlite_dir: ./db
# CORS configuration (default: *)
allow_origin: "*"
# Maximum number of logs to push in a single SSE event payload (default: 1000)
sse_limit: 1000
# Debounce time in milliseconds for SSE, logs may be pushed later if they arrive too quickly (default: 500)
sse_debounce_ms: 500
# Remove logs older than this number of days (default: 3650 days)
vacuum_max_days: 7
# Remove the oldest logs when the db size exceeds this value (default: 1TB)
vacuum_max_size: 500MB
# When the above is triggered, remove the oldest logs until the db size is below this value (default: 800GB)
vacuum_target_size: 400MB
# You can set any SQLite parameters here (no defaults)
sqlite_params:
  journal_mode: WAL
  synchronous: NORMAL
  cache_size: -32000  # 32MB
  foreign_keys: OFF
  temp_store: MEMORY
  mmap_size: 52428800  # 50MB

# Database migrations, **required**
migrations:
  - version: 1  # Incremental migration version
    rollout:  # Raw SQLite statements
      - |
        CREATE TABLE Log (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            timestamp DATETIME NOT NULL,
            message TEXT NOT NULL,
            level TEXT NOT NULL CHECK (level IN ('DEBUG', 'INFO', 'WARNING', 'ERROR', 'CRITICAL')),
            service TEXT NOT NULL,
            filename TEXT,
            path TEXT,
            line INTEGER,
            function TEXT,
            pid INTEGER,
            process_name TEXT,
            extra JSON
        );
      - CREATE INDEX IF NOT EXISTS idx_timestamp ON Log(timestamp);
      - CREATE INDEX IF NOT EXISTS idx_level ON Log(level);
      - CREATE INDEX IF NOT EXISTS idx_service ON Log(service);
    rollback:  # SQL statements to apply when rolling back
      - DROP INDEX IF EXISTS idx_service;
      - DROP INDEX IF EXISTS idx_level;
      - DROP INDEX IF EXISTS idx_timestamp;
      - DROP TABLE IF EXISTS Log;

Required Configuration Items

  • log_table_name: Name of the main log entry table in SQLite.
  • migrations: At least one migration must be defined, with version, rollout, and rollback statements.
    • version: A unique integer for each migration
    • rollout: SQL statements to apply when migrating forward
    • rollback: SQL statements to apply when rolling back

Other items are optional with sensible defaults.
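
LogLite presumably applies the sqlite_params entries as PRAGMA statements when it opens the database. A rough sketch of such a mapping using Python's built-in sqlite3 module (apply_sqlite_params is a hypothetical helper, not LogLite's actual code):

```python
import sqlite3


def apply_sqlite_params(conn, params):
    """Apply each key/value pair from the config as a PRAGMA statement."""
    for key, value in params.items():
        # PRAGMA names and values cannot be bound as query parameters,
        # so they are interpolated directly (config is trusted input here).
        conn.execute(f"PRAGMA {key} = {value}")


conn = sqlite3.connect(":memory:")
apply_sqlite_params(conn, {"cache_size": -32000, "temp_store": "MEMORY"})
```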

Usage

Running Migrations

Before starting the server, you need to apply migrations to set up the database schema:

loglite migrate rollout -c /path/to/config.yaml

Starting the Server

To start the LogLite server:

loglite server run -c /path/to/config.yaml

Rolling Back Migrations

If you need to roll back a specific migration version (e.g., version id = 3):

loglite migrate rollback -c /path/to/config.yaml -v 3

Add the -f flag to force rollback without confirmation.

API Endpoints

POST /logs

Insert a new log entry. The payload format must be consistent with your log table schema.

curl -X POST http://localhost:7788/logs \
  -H "Content-Type: application/json" \
  -d '{
    "timestamp": "2023-04-01T12:34:56",
    "message": "This is a test log message",
    "level": "INFO",
    "service": "my-service",
    "extra": {"request_id": "12345"}
  }'
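
The same request can be issued from Python with only the standard library. The URL assumes the default host and port from the sample configuration above; build_log_request is a hypothetical helper, not part of LogLite.

```python
import json
import urllib.request


def build_log_request(url, payload):
    """Build a POST request whose JSON body matches the log table schema."""
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_log_request(
    "http://localhost:7788/logs",
    {
        "timestamp": "2023-04-01T12:34:56",
        "message": "This is a test log message",
        "level": "INFO",
        "service": "my-service",
        "extra": {"request_id": "12345"},
    },
)

# With a LogLite server running:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```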

GET /logs

Query logs with filters. Each query parameter specifies a field and its filters. A filter combines an operator (e.g. =, ~=, !=, >=, <=, >, <) with a value; multiple filters for the same field are comma-separated.

Example request:

curl "http://localhost:7788/logs?fields=message,timestamp&limit=2&offset=0&timestamp=>=2023-04-01T00:00:00,<=2023-04-01T05:00:00&level==WARNING"

Example response:

{
    "status": "success",
    "result": {
        "total": 3,
        "offset": 0,
        "limit": 2,
        "results": [
            {
                "timestamp": "2025-04-01T02:44:04.207515",
                "message": "hello world!"
            },
            {
                "timestamp": "2025-04-01T01:44:04.207515",
                "message": "hello world!"
            }
        ]
    }
}

The following are special query parameters; provide their values directly, without an operator prefix:

  • fields: Comma-separated list of fields to return, defaults to "*" (select all fields).
  • limit: Maximum number of logs to return.
  • offset: Offset in the result set.
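
Assembling the comma-separated filter syntax by hand can be error-prone. A small helper along these lines (hypothetical, not shipped with LogLite) builds the query string, joining multiple filters per field with commas:

```python
from urllib.parse import quote


def build_query(base_url, filters, **special):
    """Build a GET /logs URL.

    filters maps a field name to a list of "operator+value" strings, e.g.
    {"timestamp": [">=2023-04-01T00:00:00", "<=2023-04-01T05:00:00"]}.
    special holds the operator-less parameters: fields, limit, offset.
    """
    parts = [f"{key}={value}" for key, value in special.items()]
    for field, conditions in filters.items():
        # Keep the operator characters unescaped, matching the curl example.
        parts.append(f"{field}={quote(','.join(conditions), safe=',:=<>!~')}")
    return base_url + "?" + "&".join(parts)


url = build_query(
    "http://localhost:7788/logs",
    {
        "timestamp": [">=2023-04-01T00:00:00", "<=2023-04-01T05:00:00"],
        "level": ["=WARNING"],
    },
    fields="message,timestamp",
    limit=10,
    offset=0,
)
```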

GET /logs/sse

Subscribe to real-time log updates via Server-Sent Events (SSE). The only supported query parameter is fields; no filtering is applied.

curl -H "Accept: text/event-stream" http://localhost:7788/logs/sse?fields=message,timestamp

Example events:

data: [{"timestamp": "2025-04-01T02:44:04.207515", "message": "first msg"}]
data: [{"timestamp": "2025-04-01T02:44:10.207515", "message": "third msg"}, {"timestamp": "2025-04-01T01:44:05.207515", "message": "second msg"}]
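
On the client side, each "data:" line carries a JSON array of log entries. A minimal parser for one line of the stream (the connection handling itself is left out; parse_sse_line is illustrative, not part of LogLite):

```python
import json


def parse_sse_line(line):
    """Parse one SSE 'data:' line into a list of log entries; ignore other lines."""
    if not line.startswith("data:"):
        return []  # comments, event names, and keep-alives are skipped
    return json.loads(line[len("data:"):].strip())


events = parse_sse_line(
    'data: [{"timestamp": "2025-04-01T02:44:04.207515", "message": "first msg"}]'
)
```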

TODO:

  • Add basic documentation.
  • Customize SQLite configuration.
  • Implement more features in the wishlist.
  • Add tests.

License

MIT
