AZQL

AZQL is a tool for converting data files (CSV, TSV, JSON) to SQL DDL (Data Definition Language) scripts. It analyzes the input data, infers column data types, and generates the corresponding CREATE TABLE scripts.

Features

  • Analyzes data from CSV, TSV, and JSON files
  • Automatically detects column data types
  • Generates SQL DDL scripts for table creation
  • Supports multiple SQL dialects (including TSQL)
  • Validates data integrity before conversion
  • Customizable SQL formatting options
  • CLI and Python module interfaces

Installation

pip install azql

CLI Usage

AZQL can be used as a command-line tool to convert data files to SQL DDL scripts:

# Convert a single CSV file to SQL DDL
azql convert data.csv --dialect tsql --schema dbo

# Process all supported files in a directory, writing scripts to ./sql_scripts
azql convert ./data_directory ./sql_scripts

# Customize the conversion with options
azql convert users.json \
    --dialect tsql \
    --schema app \
    --sample-size 1000 \
    --drop-table

CLI Options

  • --dialect: SQL dialect to use (default: "tsql")
  • --schema: Database schema name (default: "dbo")
  • --sample-size: Number of rows to sample for type detection (default: 100)
  • --drop-table: Include DROP TABLE statement in DDL
  • --style: Identifier naming style for generated names: "input", "camel", "pascal", or "snake" (default: "input")
  • --skip-validation: Skip data validation (default: False)

Python Module Usage

AZQL can also be imported and used as a Python module:

from pathlib import Path
import azql

# Convert a single CSV file to a SQL DDL script
ddl = azql.convert("data.csv", dialect="tsql", schema="dbo")
print(ddl)

# Process all supported files in a directory
azql.convert(Path("data_directory"), export=True)

# Customize the conversion with parameters
ddl = azql.convert(
    "users.json",
    validate=True,
    sample_size=1000,
    dialect="tsql",
    schema="app",
    drop_table=True,
    style="input",
    export=False
)
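
For batch workflows, the returned DDL can be written straight to disk. A minimal sketch, assuming only the behavior shown above (azql.convert accepts a path and returns the DDL as a string for a single file):

# Generate one .sql script per CSV in a folder
from pathlib import Path
import azql

out_dir = Path("sql_scripts")
out_dir.mkdir(exist_ok=True)

for csv_path in Path("data_directory").glob("*.csv"):
    # Returns the DDL as a string, as in the examples above
    ddl = azql.convert(csv_path, dialect="tsql", schema="dbo")
    (out_dir / f"{csv_path.stem}.sql").write_text(ddl)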

Example Output

IF OBJECT_ID('dbo.users', 'U') IS NOT NULL
    DROP TABLE [dbo].[users];

CREATE TABLE [dbo].[users] (
      [id] INT NOT NULL
    , [name] NVARCHAR(100) NOT NULL
    , [email] NVARCHAR(255) NOT NULL
    , [created_at] DATETIME
    , [is_active] BIT NOT NULL
);
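
For reference, a hypothetical users.csv along these lines would drive the type detection above; the empty created_at value in the second row is what makes that column nullable:

from pathlib import Path
import azql

# Hypothetical sample data; column names match the generated table
Path("users.csv").write_text(
    "id,name,email,created_at,is_active\n"
    "1,Alice,alice@example.com,2024-01-15 09:30:00,1\n"
    "2,Bob,bob@example.com,,0\n"
)
print(azql.convert("users.csv", dialect="tsql", schema="dbo", drop_table=True))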

License

GNU General Public License v3.0

Developer Notice

This package was developed with the assistance of GitHub Copilot for documentation and testing purposes. However, every line of code has been manually reviewed and verified by the developers to ensure quality and security.

