
A robust SQLite adapter with both synchronous and asynchronous interfaces

Project description

Evolvishub SQLite Adapter


Evolvis AI - Empowering Innovation Through AI

A robust SQLite adapter with both synchronous and asynchronous interfaces, featuring connection pooling, transaction support, and Flyway-style migrations.

Features

  • 🔄 Both synchronous and asynchronous interfaces
  • 🔌 Connection pooling for better performance
  • 🔒 Transaction support with context managers
  • 📦 Flyway-style database migrations
  • 📊 Pandas DataFrame integration
  • 🛡️ Comprehensive error handling
  • 📝 Detailed logging
  • 🧪 Full test coverage

Installation

pip install evolvishub-sqlite-adapter

Quick Start

Synchronous Usage

from evolvishub_sqlite_adapter import SyncSQLiteAdapter, DatabaseConfig

# Configure the database
config = DatabaseConfig(
    database="my_database.db",
    pool_size=5,
    journal_mode="WAL",
    synchronous="NORMAL",
    foreign_keys=True
)

# Create adapter instance
db = SyncSQLiteAdapter(config)
db.connect()

# Execute queries
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO users (name) VALUES (?)", ("John Doe",))

# Fetch results
results = db.fetch_all("SELECT * FROM users")

# Use transactions
with db.transaction():
    db.execute("INSERT INTO users (name) VALUES (?)", ("Jane Doe",))
    db.execute("UPDATE users SET name = ? WHERE id = ?", ("John Smith", 1))

# Close connections
db.close()
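Conceptually, a transaction context manager like the one above commits on success and rolls back on any exception. A minimal stdlib-only sketch of that pattern (this is an illustration with plain `sqlite3`, not the adapter's actual internals):

```python
import sqlite3
from contextlib import contextmanager

@contextmanager
def transaction(conn):
    """Commit on success, roll back on any exception."""
    try:
        conn.execute("BEGIN")
        yield conn
        conn.commit()
    except Exception:
        conn.rollback()
        raise

# isolation_level=None puts sqlite3 in autocommit mode,
# so transactions are controlled only by our explicit BEGIN/COMMIT
conn = sqlite3.connect(":memory:", isolation_level=None)
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")

with transaction(conn):
    conn.execute("INSERT INTO users (name) VALUES (?)", ("Jane Doe",))

# A failing transaction leaves no partial writes behind
try:
    with transaction(conn):
        conn.execute("INSERT INTO users (name) VALUES (?)", ("Bob",))
        raise RuntimeError("something went wrong")
except RuntimeError:
    pass

rows = conn.execute("SELECT name FROM users").fetchall()  # only Jane Doe survives
```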

Asynchronous Usage

import asyncio
from evolvishub_sqlite_adapter import AsyncSQLiteAdapter, DatabaseConfig

async def main():
    # Configure the database
    config = DatabaseConfig(
        database="my_database.db",
        pool_size=5,
        journal_mode="WAL"
    )

    # Create adapter instance
    db = AsyncSQLiteAdapter(config)
    await db.connect()

    # Execute queries
    await db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    await db.execute("INSERT INTO users (name) VALUES (?)", ("John Doe",))

    # Fetch results
    results = await db.fetch_all("SELECT * FROM users")

    # Use transactions
    async with db.transaction():
        await db.execute("INSERT INTO users (name) VALUES (?)", ("Jane Doe",))
        await db.execute("UPDATE users SET name = ? WHERE id = ?", ("John Smith", 1))

    # Close connections
    await db.close()

asyncio.run(main())
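An async SQLite interface is typically built by moving blocking `sqlite3` calls off the event loop onto worker threads. A rough stdlib-only sketch of that idea (a toy illustration of the technique, not this package's implementation):

```python
import asyncio
import sqlite3

class MiniAsyncDB:
    """Toy async wrapper: runs blocking sqlite3 calls in a worker thread."""

    def __init__(self, path):
        # check_same_thread=False because calls come from thread-pool threads
        self._conn = sqlite3.connect(path, check_same_thread=False)

    async def execute(self, sql, params=()):
        await asyncio.to_thread(self._conn.execute, sql, params)
        await asyncio.to_thread(self._conn.commit)

    async def fetch_all(self, sql, params=()):
        def _run():
            return self._conn.execute(sql, params).fetchall()
        return await asyncio.to_thread(_run)

async def main():
    db = MiniAsyncDB(":memory:")
    await db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
    await db.execute("INSERT INTO users (name) VALUES (?)", ("John Doe",))
    return await db.fetch_all("SELECT name FROM users")

rows = asyncio.run(main())
```

A real adapter would add a connection pool and per-connection locking on top of this; the sketch only shows the thread-offloading core.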

Database Migrations

The package includes a Flyway-style migration system. Create your migration files in a directory with the naming pattern V{version}__{description}.sql.

Migration File Structure

migrations/
├── V1__create_users_table.sql
├── V2__add_email_column.sql
└── V3__create_posts_table.sql
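Discovering migrations in a Flyway-style layout is essentially filename parsing plus numeric ordering. A sketch of how names like those above can be parsed (the package's actual parser may differ):

```python
import re

# Matches V{version}__{description}.sql
MIGRATION_RE = re.compile(r"^V(\d+)__(.+)\.sql$")

def parse_migration(filename):
    """Return (version, description) for a Flyway-style filename."""
    m = MIGRATION_RE.match(filename)
    if m is None:
        raise ValueError(f"not a migration file: {filename}")
    return int(m.group(1)), m.group(2).replace("_", " ")

names = [
    "V2__add_email_column.sql",
    "V1__create_users_table.sql",
    "V3__create_posts_table.sql",
]
# Versions sort numerically, so V10 would correctly run after V2
ordered = sorted(parse_migration(n) for n in names)
```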

Example migration file (V1__create_users_table.sql):

CREATE TABLE users (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

Using Migrations

from evolvishub_sqlite_adapter import SyncSQLiteAdapter, DatabaseConfig
from evolvishub_sqlite_adapter.migrations import FlywayMigration

# Configure and connect to database
config = DatabaseConfig(database="my_database.db")
db = SyncSQLiteAdapter(config)
db.connect()

# Initialize migrations
migrations = FlywayMigration("migrations")

# Run migrations
with db.transaction():
    migrations.migrate(db._get_connection())

# Get migration info
migration_info = migrations.info(db._get_connection())
for info in migration_info:
    print(f"Version {info['version']}: {info['description']}")

db.close()
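Flyway-style tools make `migrate()` idempotent by recording applied versions in a history table and skipping anything already recorded. A minimal sketch of that bookkeeping with plain `sqlite3` (the table name `schema_version` is an assumption for illustration, not necessarily what this package uses):

```python
import sqlite3

def apply_pending(conn, migrations):
    """migrations: list of (version, sql) pairs, sorted by version."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_version (version INTEGER PRIMARY KEY)"
    )
    applied = {v for (v,) in conn.execute("SELECT version FROM schema_version")}
    for version, sql in migrations:
        if version in applied:
            continue  # already run on this database; skip
        conn.executescript(sql)
        conn.execute(
            "INSERT INTO schema_version (version) VALUES (?)", (version,)
        )
    conn.commit()

conn = sqlite3.connect(":memory:")
migs = [
    (1, "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);"),
    (2, "ALTER TABLE users ADD COLUMN email TEXT;"),
]
apply_pending(conn, migs)
apply_pending(conn, migs)  # second run finds everything applied and does nothing
cols = [row[1] for row in conn.execute("PRAGMA table_info(users)")]
```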

Configuration

The DatabaseConfig class supports various SQLite configuration options:

config = DatabaseConfig(
    database="my_database.db",   # Database file path
    pool_size=5,                 # Connection pool size
    journal_mode="WAL",          # Journal mode (WAL, DELETE, TRUNCATE, etc.)
    synchronous="NORMAL",        # Synchronous mode
    foreign_keys=True,           # Enable foreign key constraints
    check_same_thread=False,     # Allow connections from different threads
    cache_size=2000,             # SQLite cache size in pages
    temp_store="MEMORY",         # Temporary storage mode
    page_size=4096,              # Page size in bytes
    log_level="INFO",            # Logging level
    log_file="sqlite.log"        # Optional log file path
)
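Most of these options correspond to standard SQLite PRAGMAs. How such settings are typically applied to a raw connection (a sketch of the underlying mechanism, not the adapter's code):

```python
import os
import sqlite3
import tempfile

# WAL requires a file-backed database, so use a temp file rather than :memory:
path = os.path.join(tempfile.mkdtemp(), "example.db")
conn = sqlite3.connect(path, check_same_thread=False)

# journal_mode returns the mode actually in effect after the change
journal = conn.execute("PRAGMA journal_mode=WAL").fetchone()[0]
conn.execute("PRAGMA synchronous=NORMAL")
conn.execute("PRAGMA foreign_keys=ON")
conn.execute("PRAGMA cache_size=2000")
conn.execute("PRAGMA temp_store=MEMORY")
# Note: page_size only takes effect before the first write (or after VACUUM)

fk = conn.execute("PRAGMA foreign_keys").fetchone()[0]
```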

Contributing

  1. Fork the repository
  2. Create your feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add some amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

Evolvis AI - info@evolvis.ai

Project Link: https://github.com/evolvis/evolvishub-sqlite-adapter



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

evolvishub_sqlite_adapter-0.1.1.tar.gz (17.7 kB)

Uploaded Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

evolvishub_sqlite_adapter-0.1.1-py3-none-any.whl (15.2 kB)

Uploaded Python 3

File details

Details for the file evolvishub_sqlite_adapter-0.1.1.tar.gz.

File metadata

File hashes

Hashes for evolvishub_sqlite_adapter-0.1.1.tar.gz
Algorithm Hash digest
SHA256 82063c1bea3c790cb96c6ebebd276f17ff5c2e921b3e7ae757abf7955816f6b1
MD5 53bd491d73a62dddd7e3678cf4a58c8c
BLAKE2b-256 8d46784fed695db3f0b8bdfafcecc3381b4ac380339afdafbb06dd67d4325e00


File details

Details for the file evolvishub_sqlite_adapter-0.1.1-py3-none-any.whl.

File metadata

File hashes

Hashes for evolvishub_sqlite_adapter-0.1.1-py3-none-any.whl
Algorithm Hash digest
SHA256 8a578eab03d382a947ddf68761ceb08369061652cb8fd3dd9a9ba58c67c048ca
MD5 d015dee4171f80eb50bf205b3eda2eb6
BLAKE2b-256 8cb53591a60203c28c84d37e714c6a604b52094f40582a29c41b67caef3fa00a

