
datus-mysql

MySQL database adapter for Datus.

Installation

pip install datus-mysql

This will automatically install the required dependencies:

  • datus-agent
  • datus-sqlalchemy
  • pymysql

Usage

The adapter is automatically registered with Datus when installed. Configure your database connection in your Datus configuration:

database:
  type: mysql
  host: localhost
  port: 3306
  username: root
  password: your_password
  database: your_database
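Because datus-mysql pulls in datus-sqlalchemy and pymysql, these fields presumably map onto a standard SQLAlchemy-style `mysql+pymysql` connection URL behind the scenes. A sketch of that mapping (illustrative only, not the adapter's internal code):

```python
# Illustrative only: the YAML fields above, assembled into the SQLAlchemy-style
# URL that the pymysql dialect understands.
from urllib.parse import quote_plus

host, port = "localhost", 3306
username, password = "root", "your_password"
database = "your_database"

# quote_plus guards against special characters in the password.
url = f"mysql+pymysql://{username}:{quote_plus(password)}@{host}:{port}/{database}"
print(url)  # mysql+pymysql://root:your_password@localhost:3306/your_database
```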

Or use the connector programmatically:

from datus_mysql import MySQLConnector, MySQLConfig

# Using config object
config = MySQLConfig(
    host="localhost",
    port=3306,
    username="root",
    password="your_password",
    database="mydb"
)
connector = MySQLConnector(config)

# Or using dict
connector = MySQLConnector({
    "host": "localhost",
    "port": 3306,
    "username": "root",
    "password": "your_password",
    "database": "mydb"
})

# Test connection
connector.test_connection()

# Execute query
result = connector.execute({"sql_query": "SELECT * FROM users LIMIT 10"})
print(result.sql_return)

# Get table list
tables = connector.get_tables(database_name="mydb")
print(f"Tables: {tables}")

# Get table schema
schema = connector.get_schema(database_name="mydb", table_name="users")
for column in schema:
    print(f"{column['name']}: {column['type']}")

Features

  • Full CRUD operations (SELECT, INSERT, UPDATE, DELETE)
  • DDL execution (CREATE, ALTER, DROP)
  • Metadata retrieval (tables, views, schemas)
  • Sample data extraction
  • Multiple result formats (pandas, arrow, csv, list)
  • Connection pooling and management
  • Comprehensive error handling
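To illustrate the "list" and "csv" result formats mentioned above (this is not the adapter's internal conversion code, just what those formats amount to for a small result set):

```python
# Illustration: the "list" format is plain Python rows; the "csv" format is the
# same rows serialized as comma-separated text.
import csv
import io

columns = ["id", "name"]
rows = [(1, "alice"), (2, "bob")]   # "list" format

# Serialize to "csv" format with the standard library.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(columns)
writer.writerows(rows)
csv_text = buf.getvalue()
print(csv_text)
```

The "pandas" and "arrow" formats would wrap the same rows in a `pandas.DataFrame` or an Arrow table instead.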

Testing

Quick Start

# Unit tests (no database required)
cd datus-mysql
uv run pytest tests/unit/ -v

# All tests with coverage
uv run pytest tests/ -v --cov=datus_mysql --cov-report=term-missing

Integration Tests (Requires MySQL)

# Start MySQL container
cd datus-mysql
docker compose up -d

# Run integration tests
uv run pytest tests/integration/ -m integration -v

# Run TPC-H tests only
uv run pytest tests/integration/test_tpch.py -m integration -v

# Run all acceptance tests (unit + integration)
uv run pytest tests/ -m acceptance -v

# Stop MySQL
docker compose down
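The project's docker-compose.yml is not reproduced on this page; a minimal equivalent for a MySQL 8.0 test container might look like the following. The `test_user`/`test_password` values mirror the init script's documented defaults below; the database name and root password here are assumptions.

```yaml
# Hypothetical minimal equivalent of the project's MySQL 8.0 test container.
services:
  mysql:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: root_password   # assumption
      MYSQL_DATABASE: test_db              # assumption
      MYSQL_USER: test_user
      MYSQL_PASSWORD: test_password
    ports:
      - "3306:3306"
    healthcheck:
      test: ["CMD", "mysqladmin", "ping", "-h", "localhost"]
      interval: 5s
      retries: 10
```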

TPC-H Test Data

The integration tests include TPC-H benchmark data for comprehensive testing:

Table          Rows   Description
tpch_region    5      Standard TPC-H regions
tpch_nation    25     Standard TPC-H nations
tpch_customer  10     Simplified customer data
tpch_orders    15     Simplified order data
tpch_supplier  5      Simplified supplier data

The tpch_setup fixture (session-scoped) automatically creates tables, inserts data, and cleans up after tests complete.

Initialize TPC-H Data Manually

You can also initialize TPC-H data manually using the provided script:

cd datus-mysql

# Using defaults (from docker-compose.yml)
uv run python scripts/init_tpch_data.py

# With custom connection
uv run python scripts/init_tpch_data.py --host localhost --port 3306 --username test_user --password test_password

# Drop existing tables first (clean re-init)
uv run python scripts/init_tpch_data.py --drop

Test Statistics

  • Unit Tests: 50 tests (config, connector, identifiers)
  • Integration Tests: 31 tests (20 functional + 11 TPC-H)
  • Acceptance Tests: 23 marked tests (subset of unit + integration)
  • Total: 81 tests
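The identifier unit tests presumably exercise MySQL's quoting rules. As a property of MySQL itself (not necessarily this package's API), identifiers are wrapped in backticks and an embedded backtick is doubled:

```python
# MySQL identifier quoting: wrap in backticks, double any literal backtick.
# (quote_identifier is a hypothetical name, not datus-mysql's actual API.)
def quote_identifier(name: str) -> str:
    return "`" + name.replace("`", "``") + "`"

print(quote_identifier("users"))      # `users`
print(quote_identifier("odd`name"))   # `odd``name`
```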

Test Markers

Marker       Description
integration  Requires a running MySQL instance
acceptance   Core functionality tests (subset of unit + integration)
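Custom markers like these are normally registered with pytest so that `-m integration` works without "unknown marker" warnings. A plausible (assumed, not copied from the repository) pyproject.toml fragment:

```toml
# Assumed pyproject.toml fragment registering the custom markers.
[tool.pytest.ini_options]
markers = [
    "integration: requires a running MySQL instance",
    "acceptance: core functionality tests",
]
```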

Code Structure

datus-mysql/
├── datus_mysql/
│   ├── __init__.py          # Package exports
│   ├── config.py            # MySQLConfig model
│   └── connector.py         # MySQLConnector implementation
├── tests/
│   ├── unit/
│   │   └── ...              # Unit tests (no database required)
│   └── integration/
│       ├── conftest.py      # Fixtures (config, connector, tpch_setup)
│       ├── test_integration.py  # Core integration tests
│       └── test_tpch.py     # TPC-H benchmark tests
├── scripts/
│   └── init_tpch_data.py    # Manual TPC-H data initialization
├── docker-compose.yml       # MySQL 8.0 test container
├── pyproject.toml
└── README.md

Development

Setup

# Install dependencies
uv sync

# Install in editable mode
uv pip install -e .

Running Tests

# Fast unit tests
uv run pytest tests/unit/ -v

# With coverage
uv run pytest tests/ --cov=datus_mysql --cov-report=html
open htmlcov/index.html

Code Quality

# Format code
black datus_mysql tests
isort datus_mysql tests

# Lint
ruff check datus_mysql tests
flake8 datus_mysql tests

Requirements

  • Python >= 3.12
  • MySQL >= 5.7 or MariaDB >= 10.2
  • datus-agent >= 0.3.0
  • datus-sqlalchemy >= 0.1.0
  • pymysql >= 1.0.0

License

Apache License 2.0
