Lightweight HTTP client for ClickHouse

Project description

ClickHouse HTTP Driver (clickhouse-driver-http)

Lightweight Python HTTP client for ClickHouse with minimal dependencies.

Installation

pip install clickhouse-driver-http

Quick Start

from clickhouse_driver_http import ClickHouseHTTP

# Initialize client (supports both int/string ports)
client = ClickHouseHTTP(
    host="localhost",          # Required
    port=8123,                # Default: 8123 (can be string "8123")
    username="default",       # Default: "default"
    password="",              # Default: ""
    database="default",       # Default: "default"
    timeout=300,              # Default: 300s
    max_retries=3,            # Default: 3
    compression=False         # Default: False
)

# Execute query (returns raw tuples)
result = client.execute("SELECT now() AS current_time, version()")

# Get DataFrame (pandas required)
df = client.query_to_df("""
    SELECT table, engine 
    FROM system.tables 
    WHERE database = currentDatabase()
    LIMIT 5
""")

Key Features

  • HTTP Interface - No native protocol dependency
  • External Tables - Pass data directly in queries
  • Smart Batching - Automatic chunking for large results
  • Error Resilient - Built-in retry mechanism
  • Pandas Support - Direct DataFrame conversion
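
The retry behavior listed above can be sketched as a simple backoff loop. Note this is a hypothetical illustration of the idea behind `max_retries`, not the library's actual internals; the helper name `with_retries`, the backoff schedule, and the exception type are all assumptions.

```python
import time

def with_retries(fn, max_retries=3, base_delay=0.1):
    """Call fn(), retrying up to max_retries times on connection
    errors with exponential backoff. Illustrative sketch only."""
    for attempt in range(max_retries + 1):
        try:
            return fn()
        except ConnectionError:
            if attempt == max_retries:
                raise
            time.sleep(base_delay * 2 ** attempt)

# Example: a flaky call that succeeds on the third attempt
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"

print(with_retries(flaky))  # "ok" after two retried failures
```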

Advanced Examples

External Tables

data = {
    'users': [
        {'user_id': 1, 'name': 'Alice'},
        {'user_id': 2, 'name': 'Bob'}
    ]
}

external = [{
    'name': 'ext_users',
    'structure': [('user_id', 'UInt32'), ('name', 'String')],
    'data': data['users']
}]

df = client.query_to_df("""
    SELECT u.name, count() as logins
    FROM ext_users u
    JOIN system.query_log q ON q.user = u.name
    GROUP BY u.name
""", external_tables=external)

Batch Processing

# Processes in 100k row batches with auto-retry
large_df = client.query_to_df(
    "SELECT * FROM billion_row_table",
    batch_size=100000,  # Initial batch size
    memory_safe=True    # Enables memory limits
)
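
One way batching like this can work over HTTP is by paginating the query with LIMIT/OFFSET. The helper below is a sketch under that assumption, not the library's actual implementation (which may instead stream chunked responses); note that OFFSET paging only yields consistent batches when the query has a stable ORDER BY.

```python
def batch_queries(base_query, batch_size, num_batches):
    """Yield paginated variants of base_query using LIMIT/OFFSET.
    Illustrative only; assumes stable ordering for correct paging."""
    for i in range(num_batches):
        yield f"{base_query} LIMIT {batch_size} OFFSET {i * batch_size}"

for q in batch_queries("SELECT * FROM big_table ORDER BY id", 100000, 2):
    print(q)
# SELECT * FROM big_table ORDER BY id LIMIT 100000 OFFSET 0
# SELECT * FROM big_table ORDER BY id LIMIT 100000 OFFSET 100000
```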

Configuration Reference

Parameter     Type      Default     Description
host          str       -           Server hostname or IP
port          int/str   8123        HTTP interface port
username      str       "default"   Authentication username
password      str       ""          Authentication password
database      str       "default"   Default database context
timeout       int       300         Query timeout in seconds
max_retries   int       3           Connection retry attempts
compression   bool      False       Enable gzip/deflate compression
verify_ssl    bool      True        Verify SSL certificates
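
The host, port, and SSL settings above map onto a plain HTTP(S) endpoint URL. A small sketch of that mapping, including the documented int-or-string port handling; the `http_endpoint` helper is hypothetical, since the client builds its URL internally.

```python
def http_endpoint(host, port=8123, use_https=False):
    """Build a ClickHouse HTTP interface URL. Accepts int or str
    ports, mirroring the client's documented behavior."""
    scheme = "https" if use_https else "http"
    return f"{scheme}://{host}:{int(port)}/"

print(http_endpoint("localhost"))                          # http://localhost:8123/
print(http_endpoint("ch.example.com", "8443", use_https=True))  # https://ch.example.com:8443/
```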

License

MIT License - See LICENSE for full text.

Note: Requires Python 3.7+ and the requests package; pandas is required for DataFrame support.


