
datus-mysql

MySQL database adapter for Datus.

Installation

pip install datus-mysql

This will automatically install the required dependencies:

  • datus-agent
  • datus-sqlalchemy
  • pymysql

Usage

The adapter is automatically registered with Datus when installed. Configure your database connection in your Datus configuration:

database:
  type: mysql
  host: localhost
  port: 3306
  username: root
  password: your_password
  database: your_database
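For deployments where the credentials live in the environment rather than the config file, the same mapping can be assembled programmatically. This is an illustrative helper, not part of datus-mysql; the `DATUS_MYSQL_*` variable names are made up for the example, and the fallbacks mirror the defaults shown above.

```python
import os

# Illustrative helper (not part of datus-mysql): assemble the connection
# mapping from environment variables, falling back to the documented defaults.
# The DATUS_MYSQL_* variable names are hypothetical.
def mysql_config_from_env(prefix="DATUS_MYSQL_"):
    return {
        "host": os.environ.get(prefix + "HOST", "localhost"),
        "port": int(os.environ.get(prefix + "PORT", "3306")),
        "username": os.environ.get(prefix + "USERNAME", "root"),
        "password": os.environ.get(prefix + "PASSWORD", ""),
        "database": os.environ.get(prefix + "DATABASE", ""),
    }

config = mysql_config_from_env()
print(config["host"], config["port"])
```

The resulting dict has the same keys as the YAML block above, so it can be passed straight to the connector.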

Or use the connector programmatically:

from datus_mysql import MySQLConnector, MySQLConfig

# Using config object
config = MySQLConfig(
    host="localhost",
    port=3306,
    username="root",
    password="your_password",
    database="mydb"
)
connector = MySQLConnector(config)

# Or using dict
connector = MySQLConnector({
    "host": "localhost",
    "port": 3306,
    "username": "root",
    "password": "your_password",
    "database": "mydb"
})

# Test connection
connector.test_connection()

# Execute query
result = connector.execute({"sql_query": "SELECT * FROM users LIMIT 10"})
print(result.sql_return)

# Get table list
tables = connector.get_tables(database_name="mydb")
print(f"Tables: {tables}")

# Get table schema
schema = connector.get_schema(database_name="mydb", table_name="users")
for column in schema:
    print(f"{column['name']}: {column['type']}")
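Since the adapter pulls in datus-sqlalchemy and pymysql, the settings above correspond to a SQLAlchemy connection URL of the `mysql+pymysql://...` shape. The sketch below is illustrative of that mapping, not the adapter's actual internals; note the password must be URL-escaped.

```python
from urllib.parse import quote_plus

# Illustrative sketch (not the adapter's actual code): with pymysql as the
# driver, the config fields map onto a SQLAlchemy URL of this shape.
def mysql_url(host, port, username, password, database):
    # quote_plus guards against special characters (e.g. '@') in the password
    return f"mysql+pymysql://{username}:{quote_plus(password)}@{host}:{port}/{database}"

url = mysql_url("localhost", 3306, "root", "p@ss word", "mydb")
print(url)  # mysql+pymysql://root:p%40ss+word@localhost:3306/mydb
```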

Features

  • Full CRUD operations (SELECT, INSERT, UPDATE, DELETE)
  • DDL execution (CREATE, ALTER, DROP)
  • Metadata retrieval (tables, views, schemas)
  • Sample data extraction
  • Multiple result formats (pandas, arrow, csv, list)
  • Connection pooling and management
  • Comprehensive error handling
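To illustrate what "multiple result formats" means in practice, the standard-library sketch below converts a list-of-dicts result set into a CSV string. This is illustrative only; it is not the adapter's conversion code.

```python
import csv
import io

# Illustrative only: turn a list-of-dicts result set (one of the formats an
# adapter like this can return) into a CSV string with the standard library.
def rows_to_csv(rows):
    if not rows:
        return ""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

rows = [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]
print(rows_to_csv(rows))
```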

Testing

Quick Start

# Unit tests (no database required)
cd datus-mysql
uv run pytest tests/unit/ -v

# All tests with coverage
uv run pytest tests/ -v --cov=datus_mysql --cov-report=term-missing

Integration Tests (Requires MySQL)

# Start MySQL container
cd datus-mysql
docker compose up -d

# Run integration tests
uv run pytest tests/integration/ -m integration -v

# Run TPC-H tests only
uv run pytest tests/integration/test_tpch.py -m integration -v

# Run all acceptance tests (unit + integration)
uv run pytest tests/ -m acceptance -v

# Stop MySQL
docker compose down
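For reference, a docker-compose.yml for such a test container might look like the sketch below. This is an assumption, not the repository's actual file; the MySQL 8.0 image comes from the code-structure notes, and the test_user/test_password credentials mirror the init_tpch_data.py example further down.

```yaml
# Illustrative sketch of a MySQL 8.0 test container; the repository's
# actual docker-compose.yml may differ.
services:
  mysql:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: root_password
      MYSQL_DATABASE: test_db
      MYSQL_USER: test_user
      MYSQL_PASSWORD: test_password
    ports:
      - "3306:3306"
```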

TPC-H Test Data

The integration tests include TPC-H benchmark data for comprehensive testing:

Table          Rows  Description
tpch_region       5  Standard TPC-H regions
tpch_nation      25  Standard TPC-H nations
tpch_customer    10  Simplified customer data
tpch_orders      15  Simplified order data
tpch_supplier     5  Simplified supplier data

The tpch_setup fixture (session-scoped) automatically creates tables, inserts data, and cleans up after tests complete.

Initialize TPC-H Data Manually

You can also initialize TPC-H data manually using the provided script:

cd datus-mysql

# Using defaults (from docker-compose.yml)
uv run python scripts/init_tpch_data.py

# With custom connection
uv run python scripts/init_tpch_data.py --host localhost --port 3306 --username test_user --password test_password

# Drop existing tables first (clean re-init)
uv run python scripts/init_tpch_data.py --drop
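The commands above imply a CLI surface roughly like the argparse sketch below. This is an assumption based on the flags shown, not the script's actual source; the defaults are taken from the examples above.

```python
import argparse

# Illustrative sketch of the CLI that scripts/init_tpch_data.py appears to
# expose, inferred from the flags above; the real script may differ.
def build_parser():
    parser = argparse.ArgumentParser(description="Initialize TPC-H test data")
    parser.add_argument("--host", default="localhost")
    parser.add_argument("--port", type=int, default=3306)
    parser.add_argument("--username", default="test_user")
    parser.add_argument("--password", default="test_password")
    parser.add_argument("--drop", action="store_true",
                        help="Drop existing TPC-H tables before re-creating them")
    return parser

args = build_parser().parse_args(["--host", "db.example.com", "--drop"])
print(args.host, args.port, args.drop)  # db.example.com 3306 True
```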

Test Statistics

  • Unit Tests: 50 tests (config, connector, identifiers)
  • Integration Tests: 31 tests (20 functional + 11 TPC-H)
  • Acceptance Tests: 23 marked tests (subset of unit + integration)
  • Total: 81 tests

Test Markers

Marker       Description
integration  Requires a running MySQL instance
acceptance   Core functionality tests (subset of unit + integration)
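Custom markers like these are typically registered in the project configuration so pytest does not warn about unknown marks. The fragment below is an assumed pyproject.toml sketch, not the project's actual configuration.

```toml
# Illustrative: registering the markers keeps pytest from warning about
# unknown marks. The project's actual configuration may differ.
[tool.pytest.ini_options]
markers = [
    "integration: requires a running MySQL instance",
    "acceptance: core functionality tests",
]
```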

Code Structure

datus-mysql/
├── datus_mysql/
│   ├── __init__.py          # Package exports
│   ├── config.py            # MySQLConfig model
│   └── connector.py         # MySQLConnector implementation
├── tests/
│   ├── unit/
│   │   └── ...              # Unit tests (no database required)
│   └── integration/
│       ├── conftest.py      # Fixtures (config, connector, tpch_setup)
│       ├── test_integration.py  # Core integration tests
│       └── test_tpch.py     # TPC-H benchmark tests
├── scripts/
│   └── init_tpch_data.py    # Manual TPC-H data initialization
├── docker-compose.yml       # MySQL 8.0 test container
├── pyproject.toml
└── README.md

Development

Setup

# Install dependencies
uv sync

# Install in editable mode
uv pip install -e .

Running Tests

# Fast unit tests
uv run pytest tests/unit/ -v

# With coverage
uv run pytest tests/ --cov=datus_mysql --cov-report=html
open htmlcov/index.html

Code Quality

# Format code
black datus_mysql tests
isort datus_mysql tests

# Lint
ruff check datus_mysql tests
flake8 datus_mysql tests

Requirements

  • Python >= 3.12
  • MySQL >= 5.7 or MariaDB >= 10.2
  • datus-agent >= 0.3.0
  • datus-sqlalchemy >= 0.1.0
  • pymysql >= 1.0.0

License

Apache License 2.0
