
datus-mysql

MySQL database adapter for Datus.

Installation

pip install datus-mysql

This will automatically install the required dependencies:

  • datus-agent
  • datus-sqlalchemy
  • pymysql

Usage

The adapter is automatically registered with Datus when installed. Configure your database connection in your Datus configuration:

database:
  type: mysql
  host: localhost
  port: 3306
  username: root
  password: your_password
  database: your_database

Or use the adapter programmatically:

from datus_mysql import MySQLConnector, MySQLConfig

# Using config object
config = MySQLConfig(
    host="localhost",
    port=3306,
    username="root",
    password="your_password",
    database="mydb"
)
connector = MySQLConnector(config)

# Or using dict
connector = MySQLConnector({
    "host": "localhost",
    "port": 3306,
    "username": "root",
    "password": "your_password",
    "database": "mydb"
})

# Test connection
connector.test_connection()

# Execute query
result = connector.execute({"sql_query": "SELECT * FROM users LIMIT 10"})
print(result.sql_return)

# Get table list
tables = connector.get_tables(database_name="mydb")
print(f"Tables: {tables}")

# Get table schema
schema = connector.get_schema(database_name="mydb", table_name="users")
for column in schema:
    print(f"{column['name']}: {column['type']}")
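Since the adapter builds on datus-sqlalchemy and pymysql, the configuration fields above map naturally onto a SQLAlchemy-style connection URL. A minimal sketch of that mapping (the helper name `build_mysql_url` is hypothetical, not part of the package):

```python
# Illustrative sketch (not the adapter's internal code): the config fields
# correspond to a standard SQLAlchemy connection URL using the
# mysql+pymysql driver scheme. quote_plus guards against passwords
# containing reserved characters like '@' or ':'.
from urllib.parse import quote_plus

def build_mysql_url(host, port, username, password, database):
    return (
        f"mysql+pymysql://{username}:{quote_plus(password)}"
        f"@{host}:{port}/{database}"
    )

url = build_mysql_url("localhost", 3306, "root", "p@ss:word", "mydb")
# -> mysql+pymysql://root:p%40ss%3Aword@localhost:3306/mydb
```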

Features

  • Full CRUD operations (SELECT, INSERT, UPDATE, DELETE)
  • DDL execution (CREATE, ALTER, DROP)
  • Metadata retrieval (tables, views, schemas)
  • Sample data extraction
  • Multiple result formats (pandas, arrow, csv, list)
  • Connection pooling and management
  • Comprehensive error handling
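The "multiple result formats" feature amounts to reshaping one row set into different containers. A stdlib-only sketch of the idea (not the library's actual conversion code), covering the csv and list cases:

```python
import csv
import io

# One query result, represented as a list of dicts (column -> value).
rows = [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}]

# "list" format: a plain list of row tuples
as_list = [tuple(r.values()) for r in rows]

# "csv" format: the same rows serialized to CSV text
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
writer.writeheader()
writer.writerows(rows)
as_csv = buf.getvalue()
```

The pandas and arrow formats follow the same pattern from the same row set (roughly `pd.DataFrame(rows)` and `pa.Table.from_pylist(rows)`).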

Testing

Quick Start

# Unit tests (no database required)
cd datus-mysql
uv run pytest tests/unit/ -v

# All tests with coverage
uv run pytest tests/ -v --cov=datus_mysql --cov-report=term-missing

Integration Tests (Requires MySQL)

# Start MySQL container
cd datus-mysql
docker compose up -d

# Run integration tests
uv run pytest tests/integration/ -m integration -v

# Run TPC-H tests only
uv run pytest tests/integration/test_tpch.py -m integration -v

# Run all acceptance tests (unit + integration)
uv run pytest tests/ -m acceptance -v

# Stop MySQL
docker compose down

TPC-H Test Data

The integration tests include TPC-H benchmark data for comprehensive testing:

Table           Rows   Description
tpch_region     5      Standard TPC-H regions
tpch_nation     25     Standard TPC-H nations
tpch_customer   10     Simplified customer data
tpch_orders     15     Simplified order data
tpch_supplier   5      Simplified supplier data

The tpch_setup fixture (session-scoped) automatically creates tables, inserts data, and cleans up after tests complete.
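Under the hood, a session-scoped yield fixture is just a generator: everything before `yield` runs once as setup, everything after runs as teardown. A standalone sketch of that shape (the pytest decorator is omitted and the database calls are stubbed so this runs on its own):

```python
def tpch_setup():
    tables = ["tpch_region", "tpch_nation", "tpch_customer",
              "tpch_orders", "tpch_supplier"]
    # setup: create tables and insert rows (stubbed here)
    created = list(tables)
    yield created          # tests run while the generator is suspended
    # teardown: drop the tables after the last test finishes
    created.clear()

gen = tpch_setup()
tables = next(gen)         # setup phase
assert "tpch_region" in tables
next(gen, None)            # exhausting the generator runs teardown
```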

Initialize TPC-H Data Manually

You can also initialize TPC-H data manually using the provided script:

cd datus-mysql

# Using defaults (from docker-compose.yml)
uv run python scripts/init_tpch_data.py

# With custom connection
uv run python scripts/init_tpch_data.py --host localhost --port 3306 --username test_user --password test_password

# Drop existing tables first (clean re-init)
uv run python scripts/init_tpch_data.py --drop
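The script's flags follow the usual argparse pattern. A hedged sketch of how such a CLI is typically wired (the defaults here are assumptions taken from the docker-compose example above, not verified against the script itself):

```python
import argparse

parser = argparse.ArgumentParser(description="Initialize TPC-H test data")
parser.add_argument("--host", default="localhost")
parser.add_argument("--port", type=int, default=3306)
parser.add_argument("--username", default="test_user")
parser.add_argument("--password", default="test_password")
parser.add_argument("--drop", action="store_true",
                    help="drop existing TPC-H tables before re-creating them")

# Parsing the flags from the example invocation above:
args = parser.parse_args(
    ["--host", "localhost", "--port", "3306",
     "--username", "test_user", "--password", "test_password"]
)
```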

Test Statistics

  • Unit Tests: 50 tests (config, connector, identifiers)
  • Integration Tests: 31 tests (20 functional + 11 TPC-H)
  • Acceptance Tests: 23 marked tests (subset of unit + integration)
  • Total: 81 tests

Test Markers

Marker        Description
integration   Requires a running MySQL instance
acceptance    Core functionality tests (subset of unit + integration)
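Custom markers like these are normally declared so pytest does not warn about unknown marks. A plausible pyproject.toml fragment (assumed for illustration, not copied from the repo):

```toml
[tool.pytest.ini_options]
markers = [
    "integration: requires a running MySQL instance",
    "acceptance: core functionality tests (subset of unit + integration)",
]
```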

Code Structure

datus-mysql/
├── datus_mysql/
│   ├── __init__.py          # Package exports
│   ├── config.py            # MySQLConfig model
│   └── connector.py         # MySQLConnector implementation
├── tests/
│   ├── unit/
│   │   └── ...              # Unit tests (no database required)
│   └── integration/
│       ├── conftest.py      # Fixtures (config, connector, tpch_setup)
│       ├── test_integration.py  # Core integration tests
│       └── test_tpch.py     # TPC-H benchmark tests
├── scripts/
│   └── init_tpch_data.py    # Manual TPC-H data initialization
├── docker-compose.yml       # MySQL 8.0 test container
├── pyproject.toml
└── README.md

Development

Setup

# Install dependencies
uv sync

# Install in editable mode
uv pip install -e .

Running Tests

# Fast unit tests
uv run pytest tests/unit/ -v

# With coverage
uv run pytest tests/ --cov=datus_mysql --cov-report=html
open htmlcov/index.html

Code Quality

# Format code
black datus_mysql tests
isort datus_mysql tests

# Lint
ruff check datus_mysql tests
flake8 datus_mysql tests

Requirements

  • Python >= 3.12
  • MySQL >= 5.7 or MariaDB >= 10.2
  • datus-agent >= 0.3.0
  • datus-sqlalchemy >= 0.1.0
  • pymysql >= 1.0.0

License

Apache License 2.0

Download files

Source Distribution

datus_mysql-0.1.7rc2.tar.gz (18.7 kB)

Built Distribution

datus_mysql-0.1.7rc2-py3-none-any.whl (8.4 kB)

File details

Details for the file datus_mysql-0.1.7rc2.tar.gz.

File metadata

  • Download URL: datus_mysql-0.1.7rc2.tar.gz
  • Size: 18.7 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Algorithm    Hash digest
SHA256       8e487259726014adc067703c81c5d707d44e71a27d24ee3882fdaffe48f4568b
MD5          34399b75c5c50d5b0cb73f1246e3e271
BLAKE2b-256  4a303aa8a1e63bc242f507912e07252b14b98f062faa0aa3b822c59f5aee194c

File details

Details for the file datus_mysql-0.1.7rc2-py3-none-any.whl.

File metadata

  • Download URL: datus_mysql-0.1.7rc2-py3-none-any.whl
  • Size: 8.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.12

File hashes

Algorithm    Hash digest
SHA256       7146c1d42da23327aaed71a0e89bb923e5262a29da533a4a8ecc64fb41186017
MD5          51340d4eb2c5816a6ce3242e7cdea9c0
BLAKE2b-256  34ffe0888a9fb1dee40f4d79dd8887e2770359537ec7316a49f67b8337338d7b
