
datus-mysql

MySQL database adapter for Datus.

Installation

pip install datus-mysql

This will automatically install the required dependencies:

  • datus-agent
  • datus-sqlalchemy
  • pymysql

Usage

The adapter is automatically registered with Datus when installed. Configure your database connection in your Datus configuration:

database:
  type: mysql
  host: localhost
  port: 3306
  username: root
  password: your_password
  database: your_database

Or use it programmatically:

from datus_mysql import MySQLConnector, MySQLConfig

# Using config object
config = MySQLConfig(
    host="localhost",
    port=3306,
    username="root",
    password="your_password",
    database="mydb"
)
connector = MySQLConnector(config)

# Or using dict
connector = MySQLConnector({
    "host": "localhost",
    "port": 3306,
    "username": "root",
    "password": "your_password",
    "database": "mydb"
})

# Test connection
connector.test_connection()

# Execute query
result = connector.execute({"sql_query": "SELECT * FROM users LIMIT 10"})
print(result.sql_return)

# Get table list
tables = connector.get_tables(database_name="mydb")
print(f"Tables: {tables}")

# Get table schema
schema = connector.get_schema(database_name="mydb", table_name="users")
for column in schema:
    print(f"{column['name']}: {column['type']}")

Features

  • Full CRUD operations (SELECT, INSERT, UPDATE, DELETE)
  • DDL execution (CREATE, ALTER, DROP)
  • Metadata retrieval (tables, views, schemas)
  • Sample data extraction
  • Multiple result formats (pandas, arrow, csv, list)
  • Connection pooling and management
  • Comprehensive error handling
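To make the "multiple result formats" concrete, here is a small standalone sketch (not the adapter's internal code; the sample columns and rows are hypothetical) showing how one row set maps onto the list and csv shapes using only the standard library:

```python
import csv
import io

# Raw rows as a connector might return them (hypothetical sample data)
columns = ["id", "name"]
rows = [(1, "alice"), (2, "bob")]

# "list" format: one dict per row, keyed by column name
as_list = [dict(zip(columns, row)) for row in rows]

# "csv" format: header line followed by one line per row
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(columns)
writer.writerows(rows)
as_csv = buf.getvalue()

print(as_list[0])              # {'id': 1, 'name': 'alice'}
print(as_csv.splitlines()[0])  # id,name
```

The pandas and arrow formats follow the same idea, materializing the rows into a `DataFrame` or Arrow table instead.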

Testing

Quick Start

# Unit tests (no database required)
cd datus-mysql
uv run pytest tests/unit/ -v

# All tests with coverage
uv run pytest tests/ -v --cov=datus_mysql --cov-report=term-missing

Integration Tests (Requires MySQL)

# Start MySQL container
cd datus-mysql
docker compose up -d

# Run integration tests
uv run pytest tests/integration/ -m integration -v

# Run TPC-H tests only
uv run pytest tests/integration/test_tpch.py -m integration -v

# Run all acceptance tests (unit + integration)
uv run pytest tests/ -m acceptance -v

# Stop MySQL
docker compose down
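For reference, a docker-compose.yml of the kind these commands expect might look like the sketch below. This is a hypothetical minimal example; only the MySQL 8.0 image is confirmed by the repository layout, and the credentials and database name are assumptions (the repository's actual docker-compose.yml is authoritative):

```yaml
# Hypothetical sketch; see the repository's docker-compose.yml for real values.
services:
  mysql:
    image: mysql:8.0
    environment:
      MYSQL_ROOT_PASSWORD: test_password   # assumed credential
      MYSQL_DATABASE: test_db              # assumed database name
    ports:
      - "3306:3306"
```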

TPC-H Test Data

The integration tests include TPC-H benchmark data for comprehensive testing:

| Table         | Rows | Description              |
|---------------|------|--------------------------|
| tpch_region   | 5    | Standard TPC-H regions   |
| tpch_nation   | 25   | Standard TPC-H nations   |
| tpch_customer | 10   | Simplified customer data |
| tpch_orders   | 15   | Simplified order data    |
| tpch_supplier | 5    | Simplified supplier data |

The tpch_setup fixture (session-scoped) automatically creates tables, inserts data, and cleans up after tests complete.

Initialize TPC-H Data Manually

You can also initialize TPC-H data manually using the provided script:

cd datus-mysql

# Using defaults (from docker-compose.yml)
uv run python scripts/init_tpch_data.py

# With custom connection
uv run python scripts/init_tpch_data.py --host localhost --port 3306 --username test_user --password test_password

# Drop existing tables first (clean re-init)
uv run python scripts/init_tpch_data.py --drop

Test Statistics

  • Unit Tests: 50 tests (config, connector, identifiers)
  • Integration Tests: 31 tests (20 functional + 11 TPC-H)
  • Acceptance Tests: 23 marked tests (subset of unit + integration)
  • Total: 81 tests

Test Markers

| Marker      | Description                                             |
|-------------|---------------------------------------------------------|
| integration | Requires a running MySQL instance                       |
| acceptance  | Core functionality tests (subset of unit + integration) |
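Custom markers like these are typically registered with pytest so that `-m integration` and `-m acceptance` select the right tests without warnings. A hypothetical registration in pyproject.toml (the project's actual pytest configuration may differ) would look like:

```toml
# Hypothetical pytest marker registration; the project's real config is authoritative.
[tool.pytest.ini_options]
markers = [
    "integration: requires a running MySQL instance",
    "acceptance: core functionality tests",
]
```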

Code Structure

datus-mysql/
├── datus_mysql/
│   ├── __init__.py          # Package exports
│   ├── config.py            # MySQLConfig model
│   └── connector.py         # MySQLConnector implementation
├── tests/
│   ├── unit/
│   │   └── ...              # Unit tests (no database required)
│   └── integration/
│       ├── conftest.py      # Fixtures (config, connector, tpch_setup)
│       ├── test_integration.py  # Core integration tests
│       └── test_tpch.py     # TPC-H benchmark tests
├── scripts/
│   └── init_tpch_data.py    # Manual TPC-H data initialization
├── docker-compose.yml       # MySQL 8.0 test container
├── pyproject.toml
└── README.md

Development

Setup

# Install dependencies
uv sync

# Install in editable mode
uv pip install -e .

Running Tests

# Fast unit tests
uv run pytest tests/unit/ -v

# With coverage
uv run pytest tests/ --cov=datus_mysql --cov-report=html
open htmlcov/index.html

Code Quality

# Format code
black datus_mysql tests
isort datus_mysql tests

# Lint
ruff check datus_mysql tests
flake8 datus_mysql tests

Requirements

  • Python >= 3.12
  • MySQL >= 5.7 or MariaDB >= 10.2
  • datus-agent >= 0.3.0
  • datus-sqlalchemy >= 0.1.0
  • pymysql >= 1.0.0

License

Apache License 2.0
