
Database Factory


database-factory

License: MIT

Database factory manages and creates database connections and executes queries through those connections. The idea is a single source to connect to various databases and perform database operations.

Users need not worry about crafting the connection string or identifying the right methods for database operations. Database factory supports DML / DDL execution and can create or replace tables from a Pandas DataFrame.

Database factory is a wrapper over sqlalchemy for crafting the connection and supports the databases below:

* Sqlite3
* PostgreSQL
* BigQuery
* Snowflake
* MariaDB
* MySQL

Database factory can be extended to any database supported by sqlalchemy.
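Conceptually, sqlalchemy identifies each backend by a connection URL, and the factory's job is to assemble that URL from the parameters you pass. A minimal sketch of the idea follows; the function name and logic are illustrative, not the library's internals (the real library also normalizes dialect names and handles drivers and cloud secrets):

```python
# Illustrative sketch: build a sqlalchemy-style connection URL
# from database-factory style parameters.
def craft_url(engine_type, database, username=None, password=None,
              host=None, port=None, sqlite_db_path=None):
    if engine_type == "sqlite":
        # File-backed Sqlite3 database under the given path.
        return f"sqlite:///{sqlite_db_path}/{database}.db"
    # Network databases share a common user:password@host:port/db shape.
    return f"{engine_type}://{username}:{password}@{host}:{port}/{database}"

print(craft_url("postgres", "test_db", "user", "pass", "localhost", 5432))
# postgres://user:pass@localhost:5432/test_db
```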

Getting Started

pip install database-factory

Note: The default installation of database factory supports Sqlite3 only. For other database or cloud support, install it with combinations of extra libraries:

Sqlite3 with AWS cloud support

pip install "database-factory[aws]"

Snowflake with AWS cloud support

pip install "database-factory[snowflake,aws]"

The following options are supported:

  • Secret manager cloud support
    • aws
    • gcp
  • Databases
    • snowflake
    • postgres
    • mysql
  • all: installs the libraries for all supported clouds and all supported databases.

Using database-factory


from database_factory.manager import DatabaseManager
import tempfile

# Create a Sqlite3 database in the system temp directory.
temp_dir = tempfile.gettempdir()
db = DatabaseManager(engine_type="sqlite", database="test_db", sqlite_db_path=temp_dir)
db.create_session()

# DDL / DML statements.
db.execute_sql(sql="create table test (id int PRIMARY KEY)")
db.execute_sql(sql="insert into test values (1)")
db.execute_sql(sql="insert into test values (2)")

rows = db.execute_sql(sql="select * from test")
if rows:
  print(rows)

# Fetch the result as a Pandas DataFrame.
df = db.get_df(sql="select * from test")
print(df)

# Write the DataFrame to a new table, replacing it if it already exists.
db.execute_df(panda_df=df, table_name="copy_test", exist_action="replace")
# db.execute_df(panda_df=df, table_name="copy_test", exist_action="replace", chunk_size=100)
db.execute_sql(sql="insert into copy_test values (3)")
rows_copy = db.execute_sql(sql="select * from copy_test")
if rows_copy:
  print(rows_copy)

Appendix

Supported database type:


*   sqlite `default`
*   postgres
*   mysql
*   mariadb
*   snowflake
*   bigquery

Connection parameters for sqlite:


* engine_type: sqlite
* database: <name of database>
* sqlite_db_path: <path where database will be created>

Connection parameters for postgres:


* engine_type: postgres
* database: <name of database>
* username: <postgres user>
* password: <user password>
* host: <host of postgres service>
* port: <port of postgres service>
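These parameters map directly onto the DatabaseManager constructor. A sketch with placeholder values (mysql and mariadb take the same shape with their own engine_type):

```python
# Placeholder Postgres connection parameters, matching the list above.
postgres_params = {
    "engine_type": "postgres",
    "database": "analytics",
    "username": "app_user",
    "password": "change-me",
    "host": "db.example.internal",
    "port": "5432",
}

# Then, assuming database-factory is installed with postgres support:
# db = DatabaseManager(**postgres_params)
# db.create_session()
print(sorted(postgres_params))
```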

Connection parameters for mysql:


* engine_type: mysql
* database: <name of database>
* username: <mysql user>
* password: <user password>
* host: <host of mysql service>
* port: <port of mysql service>

Connection parameters for mariadb:


* engine_type: mariadb
* database: <name of database>
* username: <mariadb user>
* password: <user password>
* host: <host of mariadb service>
* port: <port of mariadb service>

Connection parameters for snowflake:


* engine_type: snowflake
* database: <name of database>
* username: <snowflake user>
* password: <user password>
* schema: <schema name>
* snowflake_role: <snowflake role>
* snowflake_warehouse: <snowflake warehouse>
* snowflake_account: <snowflake account>
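Likewise for Snowflake, where the extra warehouse, role, and account parameters come into play. All values below are placeholders:

```python
# Placeholder Snowflake connection parameters, matching the list above.
snowflake_params = {
    "engine_type": "snowflake",
    "database": "analytics",
    "username": "app_user",
    "password": "change-me",
    "schema": "public",
    "snowflake_role": "analyst",
    "snowflake_warehouse": "compute_wh",
    "snowflake_account": "my_account",
}

# Then, assuming database-factory is installed with snowflake support:
# db = DatabaseManager(**snowflake_params)
# db.create_session()
print(sorted(snowflake_params))
```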

Connection parameters for bigquery:


* engine_type: bigquery
* database: <name of database>

Getting connection properties from AWS / GCP Secret Manager Service:


Note:

  • GCP:
    • On a cloud server:
      • Allow the server to access all cloud API scopes.
      • Attach the following roles:
        • Project Viewer
        • Secret Manager Secret Accessor
    • On premises:
      • Attach the following roles to the user service account and download the service account file for authentication:
        • Project Viewer
        • Secret Manager Secret Accessor
      • Set the environment variable "GOOGLE_APPLICATION_CREDENTIALS" to point to the service account file.
  • AWS:
    • On a cloud server:
      • Set an execution profile with the "secretsmanager:GetSecretValue" policy.
    • On premises:
      • AWS credentials should be configured.
      • The user should have the "secretsmanager:GetSecretValue" policy.
* engine_type: bigquery
* database: <name of database>
* secret_id: <Secret name of AWS / GCP Secret Manager Service>
* secrete_manager_cloud: <aws or gcp as per cloud>
* aws_region: <aws region, default: us-east-1>
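Put together, a secret-manager-backed connection might be configured like this. Values are placeholders; the secret named by secret_id is expected to hold the remaining connection properties:

```python
# Placeholder parameters for resolving connection properties
# from AWS Secrets Manager, matching the list above.
secret_params = {
    "engine_type": "bigquery",
    "database": "analytics",
    "secret_id": "prod/db-credentials",
    "secrete_manager_cloud": "aws",   # "aws" or "gcp"
    "aws_region": "us-east-1",        # the documented default region
}

# Then, assuming database-factory is installed with aws support:
# db = DatabaseManager(**secret_params)
# db.create_session()
print(sorted(secret_params))
```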

Development Setup

Using virtualenv

python3 -m venv venv
source venv/bin/activate
pip install .

Contributing

  1. Fork the repo - https://github.com/shrivastava-v-ankit/database-factory.git
  2. Create your feature branch - git checkout -b feature/name
  3. Install Python packages
    • sqlalchemy==1.4.47
    • pandas==1.5.3
    • GitPython
    • coverage==7.2.3
    • exceptiongroup==1.1.1
    • iniconfig==2.0.0
    • pluggy==1.0.0
    • pytest==7.3.0
    • pytest-cov==4.0.0
    • tomli==2.0.1
  4. Run Python test (pytest)
    • pytest -v --cov --cov-report html --cov-report xml --junitxml=test-results/database_factory_test/results.xml
  5. Add Python tests (pytest) and a coverage report for any new or changed feature.
  6. Commit your changes - git commit -am "Added name"
  7. Push to the branch - git push origin feature/name
  8. Create a new pull request
