
Evolvishub SQLite Adapter

Evolvis AI - Empowering Innovation Through AI

A robust SQLite adapter with both synchronous and asynchronous interfaces, featuring connection pooling, transaction support, and Flyway-style migrations.

Features

  • 🔄 Both synchronous and asynchronous interfaces
  • 🔌 Connection pooling for better performance
  • 🔒 Transaction support with context managers
  • 📦 Flyway-style database migrations
  • 📊 Pandas DataFrame integration
  • 🛡️ Comprehensive error handling
  • 📝 Detailed logging
  • 🧪 Full test coverage

Installation

pip install evolvishub-sqlite-adapter

Quick Start

Synchronous Usage

from evolvishub_sqlite_adapter import SyncSQLiteAdapter, DatabaseConfig

# Configure the database
config = DatabaseConfig(
    database="my_database.db",
    pool_size=5,
    journal_mode="WAL",
    synchronous="NORMAL",
    foreign_keys=True
)

# Create adapter instance
db = SyncSQLiteAdapter(config)
db.connect()

# Execute queries
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO users (name) VALUES (?)", ("John Doe",))

# Fetch results
results = db.fetch_all("SELECT * FROM users")

# Use transactions
with db.transaction():
    db.execute("INSERT INTO users (name) VALUES (?)", ("Jane Doe",))
    db.execute("UPDATE users SET name = ? WHERE id = ?", ("John Smith", 1))

# Close connections
db.close()
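The transaction context manager commits if the block completes and rolls back if it raises. A minimal sketch of that behavior using only the standard sqlite3 module (an illustration of the semantics, not the adapter's internals):

```python
import sqlite3
from contextlib import contextmanager

@contextmanager
def transaction(conn):
    """Commit on success, roll back if the block raises."""
    try:
        yield conn
        conn.commit()
    except Exception:
        conn.rollback()
        raise

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

# Successful block: the insert is committed
with transaction(conn):
    conn.execute("INSERT INTO users (name) VALUES (?)", ("Jane Doe",))

# Failing block: the insert is rolled back, the table is unchanged
try:
    with transaction(conn):
        conn.execute("INSERT INTO users (name) VALUES (?)", ("Bad Row",))
        raise RuntimeError("simulated failure")
except RuntimeError:
    pass

print(conn.execute("SELECT COUNT(*) FROM users").fetchone()[0])  # 1
```

Either all statements in the block take effect or none do, which is what makes multi-statement updates like the insert-plus-update pair above safe.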

Asynchronous Usage

import asyncio
from evolvishub_sqlite_adapter import AsyncSQLiteAdapter, DatabaseConfig

async def main():
    # Configure the database
    config = DatabaseConfig(
        database="my_database.db",
        pool_size=5,
        journal_mode="WAL"
    )

    # Create adapter instance
    db = AsyncSQLiteAdapter(config)
    await db.connect()

    # Execute queries
    await db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    await db.execute("INSERT INTO users (name) VALUES (?)", ("John Doe",))

    # Fetch results
    results = await db.fetch_all("SELECT * FROM users")

    # Use transactions
    async with db.transaction():
        await db.execute("INSERT INTO users (name) VALUES (?)", ("Jane Doe",))
        await db.execute("UPDATE users SET name = ? WHERE id = ?", ("John Smith", 1))

    # Close connections
    await db.close()

asyncio.run(main())
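The pool_size option caps how many connections the adapter keeps open. A minimal sketch of the pooling idea with asyncio.Queue and the standard sqlite3 module (illustrative only; the adapter's actual pool implementation may differ):

```python
import asyncio
import sqlite3

class SQLitePool:
    """Fixed-size pool: connections are borrowed and returned through a queue."""
    def __init__(self, database, pool_size=5):
        self._queue = asyncio.Queue()
        for _ in range(pool_size):
            self._queue.put_nowait(sqlite3.connect(database, check_same_thread=False))

    async def acquire(self):
        # Waits if all connections are currently checked out
        return await self._queue.get()

    def release(self, conn):
        self._queue.put_nowait(conn)

    def close(self):
        while not self._queue.empty():
            self._queue.get_nowait().close()

async def demo():
    # pool_size=1 because each ":memory:" connection is a separate database
    pool = SQLitePool(":memory:", pool_size=1)
    conn = await pool.acquire()
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.execute("INSERT INTO t VALUES (42)")
    row = conn.execute("SELECT x FROM t").fetchone()
    pool.release(conn)
    pool.close()
    return row[0]

result = asyncio.run(demo())
print(result)  # 42
```

Reusing pooled connections avoids paying the open/close cost on every query, which is where the performance benefit comes from.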

Database Migrations

The package includes a Flyway-style migration system. Create your migration files in a directory, naming each one with the pattern V{version}__{description}.sql, where {version} is an integer that determines execution order.

Migration File Structure

migrations/
├── V1__create_users_table.sql
├── V2__add_email_column.sql
└── V3__create_posts_table.sql
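The naming convention makes ordering deterministic: files run by numeric version, not lexicographic filename order (so V10 runs after V2). A sketch of how such names can be parsed and ordered (an illustration, not the package's parser):

```python
import re

MIGRATION_RE = re.compile(r"^V(\d+)__(.+)\.sql$")

def parse_migration(filename):
    """Return (version, description) for a matching filename, else None."""
    m = MIGRATION_RE.match(filename)
    if not m:
        return None
    return int(m.group(1)), m.group(2).replace("_", " ")

files = ["V2__add_email_column.sql", "V10__create_posts_table.sql", "V1__create_users_table.sql"]
ordered = sorted(filter(None, map(parse_migration, files)))
for version, description in ordered:
    print(version, description)
# 1 create users table
# 2 add email column
# 10 create posts table
```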

Example migration file (V1__create_users_table.sql):

CREATE TABLE users (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

Using Migrations

from evolvishub_sqlite_adapter import SyncSQLiteAdapter, DatabaseConfig
from evolvishub_sqlite_adapter.migrations import FlywayMigration

# Configure and connect to database
config = DatabaseConfig(database="my_database.db")
db = SyncSQLiteAdapter(config)
db.connect()

# Initialize migrations
migrations = FlywayMigration("migrations")

# Run migrations
with db.transaction():
    migrations.migrate(db._get_connection())

# Get migration info
migration_info = migrations.info(db._get_connection())
for info in migration_info:
    print(f"Version {info['version']}: {info['description']}")

db.close()
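Under the hood, Flyway-style migrators record applied versions in a history table and run only the scripts that are still pending, which is why re-running migrate is safe. A self-contained sketch of that mechanism using the standard sqlite3 module (illustrative; FlywayMigration handles this for you, and its table layout may differ):

```python
import sqlite3

def migrate(conn, migrations):
    """Apply (version, description, sql) entries not yet recorded in the history table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_history (version INTEGER PRIMARY KEY, description TEXT)"
    )
    applied = {row[0] for row in conn.execute("SELECT version FROM schema_history")}
    for version, description, sql in sorted(migrations):
        if version in applied:
            continue  # already applied on a previous run
        conn.executescript(sql)
        conn.execute("INSERT INTO schema_history VALUES (?, ?)", (version, description))
    conn.commit()

conn = sqlite3.connect(":memory:")
migrations = [
    (1, "create users table", "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);"),
    (2, "add email column", "ALTER TABLE users ADD COLUMN email TEXT;"),
]
migrate(conn, migrations)
migrate(conn, migrations)  # second run is a no-op: both versions are already recorded
print([row[0] for row in conn.execute("SELECT version FROM schema_history")])  # [1, 2]
```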

Configuration

The DatabaseConfig class supports various SQLite configuration options:

config = DatabaseConfig(
    database="my_database.db",   # Database file path
    pool_size=5,                 # Connection pool size
    journal_mode="WAL",          # Journal mode (WAL, DELETE, TRUNCATE, etc.)
    synchronous="NORMAL",        # Synchronous mode (OFF, NORMAL, FULL)
    foreign_keys=True,           # Enable foreign key constraints
    check_same_thread=False,     # Allow connections from different threads
    cache_size=2000,             # SQLite cache size in pages
    temp_store="MEMORY",         # Temporary storage mode (DEFAULT, FILE, MEMORY)
    page_size=4096,              # Page size in bytes
    log_level="INFO",            # Logging level
    log_file="sqlite.log"        # Optional log file path
)
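Most of these options correspond to SQLite PRAGMA statements applied when a connection is opened. A sketch of that mapping with the standard sqlite3 module (the adapter presumably does something similar per pooled connection; exact behavior may differ):

```python
import sqlite3

def apply_pragmas(conn):
    # Each config field maps to a PRAGMA issued on the fresh connection
    conn.execute("PRAGMA journal_mode = WAL")    # journal_mode="WAL"
    conn.execute("PRAGMA synchronous = NORMAL")  # synchronous="NORMAL"
    conn.execute("PRAGMA foreign_keys = ON")     # foreign_keys=True
    conn.execute("PRAGMA cache_size = 2000")     # cache_size=2000
    conn.execute("PRAGMA temp_store = MEMORY")   # temp_store="MEMORY"
    return conn

conn = apply_pragmas(sqlite3.connect(":memory:"))
print(conn.execute("PRAGMA foreign_keys").fetchone()[0])  # 1
```

Note that PRAGMAs like foreign_keys are per-connection in SQLite, which is why a pooling adapter must reapply them to every connection it opens rather than once per database.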

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

Evolvis AI - info@evolvis.ai

Project Link: https://github.com/evolvis/evolvishub-sqlite-adapter
