
Evolvishub SQLite Adapter

Evolvis AI - Empowering Innovation Through AI

A robust SQLite adapter with both synchronous and asynchronous interfaces, featuring connection pooling, transaction support, and Flyway-style migrations.

Features

  • 🔄 Both synchronous and asynchronous interfaces
  • 🔌 Connection pooling for better performance
  • 🔒 Transaction support with context managers
  • 📦 Flyway-style database migrations
  • 📊 Pandas DataFrame integration
  • 🛡️ Comprehensive error handling
  • 📝 Detailed logging
  • 🧪 Full test coverage

Installation

pip install evolvishub-sqlite-adapter

Quick Start

Synchronous Usage

from evolvishub_sqlite_adapter import SyncSQLiteAdapter, DatabaseConfig

# Configure the database
config = DatabaseConfig(
    database="my_database.db",
    pool_size=5,
    journal_mode="WAL",
    synchronous="NORMAL",
    foreign_keys=True
)

# Create adapter instance
db = SyncSQLiteAdapter(config)
db.connect()

# Execute queries
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO users (name) VALUES (?)", ("John Doe",))

# Fetch results
results = db.fetch_all("SELECT * FROM users")

# Use transactions
with db.transaction():
    db.execute("INSERT INTO users (name) VALUES (?)", ("Jane Doe",))
    db.execute("UPDATE users SET name = ? WHERE id = ?", ("John Smith", 1))

# Close connections
db.close()
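The connection pooling mentioned in the feature list can be illustrated with a minimal stdlib-only sketch. This is not the adapter's actual implementation — just the general idea of handing out pre-opened connections from a fixed-size pool, which is what lets a pool_size of 5 serve many short-lived queries without reopening the database file each time:

```python
import sqlite3
import queue
from contextlib import contextmanager


class MiniPool:
    """Minimal fixed-size SQLite connection pool (illustrative only)."""

    def __init__(self, database, pool_size=5):
        self._pool = queue.Queue(maxsize=pool_size)
        # Open all connections up front; callers borrow and return them.
        for _ in range(pool_size):
            conn = sqlite3.connect(database, check_same_thread=False)
            self._pool.put(conn)

    @contextmanager
    def connection(self):
        # Block until a connection is free, then hand it out;
        # always return it to the pool when the caller is done.
        conn = self._pool.get()
        try:
            yield conn
        finally:
            self._pool.put(conn)

    def close(self):
        while not self._pool.empty():
            self._pool.get().close()


pool = MiniPool(":memory:", pool_size=2)
with pool.connection() as conn:
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.execute("INSERT INTO t VALUES (1)")
    count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
pool.close()
```

A real pool also needs health checks and per-connection PRAGMA setup; the adapter handles those details for you.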

Asynchronous Usage

import asyncio
from evolvishub_sqlite_adapter import AsyncSQLiteAdapter, DatabaseConfig

async def main():
    # Configure the database
    config = DatabaseConfig(
        database="my_database.db",
        pool_size=5,
        journal_mode="WAL"
    )

    # Create adapter instance
    db = AsyncSQLiteAdapter(config)
    await db.connect()

    # Execute queries
    await db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    await db.execute("INSERT INTO users (name) VALUES (?)", ("John Doe",))

    # Fetch results
    results = await db.fetch_all("SELECT * FROM users")

    # Use transactions
    async with db.transaction():
        await db.execute("INSERT INTO users (name) VALUES (?)", ("Jane Doe",))
        await db.execute("UPDATE users SET name = ? WHERE id = ?", ("John Smith", 1))

    # Close connections
    await db.close()

asyncio.run(main())

Database Migrations

The package includes a Flyway-style migration system. Create your migration files in a directory with the naming pattern V{version}__{description}.sql.

Migration File Structure

migrations/
├── V1__create_users_table.sql
├── V2__add_email_column.sql
└── V3__create_posts_table.sql

Example migration file (V1__create_users_table.sql):

CREATE TABLE users (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
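To make the V{version}__{description}.sql convention concrete, here is a stdlib-only sketch of how Flyway-style ordering generally works: parse the version out of each filename, sort numerically, apply pending migrations in order, and record each applied version in a history table. The schema_history table name and apply_migrations helper are illustrative assumptions, not the package's actual FlywayMigration internals:

```python
import re
import sqlite3
from pathlib import Path

MIGRATION_RE = re.compile(r"V(\d+)__(.+)\.sql")


def apply_migrations(conn, migrations_dir):
    """Apply V{version}__{description}.sql files in numeric order,
    recording each applied version so re-runs skip it."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_history ("
        "version INTEGER PRIMARY KEY, description TEXT)"
    )
    applied = {row[0] for row in conn.execute("SELECT version FROM schema_history")}

    # Collect (version, description, path) tuples; sorting the tuples
    # orders migrations numerically by version.
    pending = []
    for path in Path(migrations_dir).glob("V*__*.sql"):
        m = MIGRATION_RE.fullmatch(path.name)
        if m:
            pending.append((int(m.group(1)), m.group(2), path))

    for version, description, path in sorted(pending):
        if version in applied:
            continue  # already applied on a previous run
        conn.executescript(path.read_text())
        conn.execute(
            "INSERT INTO schema_history (version, description) VALUES (?, ?)",
            (version, description),
        )
    conn.commit()
```

Sorting on the parsed integer (not the raw filename) matters: lexicographic order would run V10 before V2.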

Using Migrations

from evolvishub_sqlite_adapter import SyncSQLiteAdapter, DatabaseConfig
from evolvishub_sqlite_adapter.migrations import FlywayMigration

# Configure and connect to database
config = DatabaseConfig(database="my_database.db")
db = SyncSQLiteAdapter(config)
db.connect()

# Initialize migrations
migrations = FlywayMigration("migrations")

# Run migrations
with db.transaction():
    migrations.migrate(db._get_connection())

# Get migration info
migration_info = migrations.info(db._get_connection())
for info in migration_info:
    print(f"Version {info['version']}: {info['description']}")

db.close()

Configuration

The DatabaseConfig class supports various SQLite configuration options:

config = DatabaseConfig(
    database="my_database.db",   # Database file path
    pool_size=5,                 # Connection pool size
    journal_mode="WAL",          # Journal mode (WAL, DELETE, TRUNCATE, etc.)
    synchronous="NORMAL",        # Synchronous mode (OFF, NORMAL, FULL)
    foreign_keys=True,           # Enforce foreign key constraints
    check_same_thread=False,     # Allow connections from different threads
    cache_size=2000,             # SQLite cache size in pages
    temp_store="MEMORY",         # Temporary storage mode
    page_size=4096,              # Page size in bytes
    log_level="INFO",            # Logging level
    log_file="sqlite.log"        # Optional log file path
)

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

Evolvis AI - info@evolvis.ai

Project Link: https://github.com/evolvis/evolvishub-sqlite-adapter
