A pluggable connector that allows users (admins) to execute SQL, view, and export the results.

========================
Auctm Database Connector
========================

A Python package that provides database connections to the Bifrost, Substrate, Decision Science, and SquadIQ databases.

Modules
#########

There are four modules - **bifrost**, **substrate**, **decision_science**, and **squadiq**.

All the modules provide the ``engine`` for their respective database.

bifrost
********

This module provides the database connection for the Bifrost database. It provides **read-only** access to the database.

The schema for tables in this database can be seen on the `Bifrost Explorer <https://bifrost.squadplatform.com/explorer/play>`__.

substrate
**********

This module provides the database connection for the Substrate database. It provides **read-only** access to the database.

decision_science
*****************

This module provides the database connection for the Decision Science database. It provides **read** and **write**
access to the user's schema in the database.

squadiq
*****************

This module provides the database connection for the SquadIQ database. It provides **read-only** access to the database.

Usage
------

.. code-block:: python

    from database_connector import bifrost, substrate, decision_science, squadiq, sagemaker

    bifrost_engine = bifrost.engine
    substrate_engine = substrate.engine
    ds_engine = decision_science.engine
    iq_engine = squadiq.engine
    ds_sagemaker_engine = sagemaker.engine

    # The database connection is closed automatically when it is used
    # inside a "with" block, i.e. as a context manager.
    with bifrost_engine.connect() as conn:
        data = conn.execute("select * from fub_customer")

        # each row behaves like a mapping of column name to value
        for row in data:
            print(row)

    with substrate_engine.connect() as conn:
        data = conn.execute("select * from substrate_customer")

        for row in data:
            print(row)

    with iq_engine.connect() as conn:
        data = conn.execute("select * from voice_workflow_campaign")

        for row in data:
            print(row)

    with ds_sagemaker_engine.connect() as conn:
        data = conn.execute("select * from voice_workflow_campaign")

        for row in data:
            print(row)

    # The decision_science engine also has write access to the user's schema
    with ds_engine.connect() as conn:
        data = conn.execute("select * from vedvasu.customers")

        for row in data:
            print(row)

        conn.execute("insert into vedvasu.customers (id, name) values (1, 'Robert-Slack')")
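Because each module exposes a standard SQLAlchemy engine, query results can also be loaded straight into a ``pandas`` DataFrame. A minimal sketch of the pattern, using a stand-in in-memory SQLite engine (with this package you would pass e.g. ``decision_science.engine`` instead; ``pandas`` is assumed to be installed alongside it):

.. code-block:: python

    import pandas as pd
    from sqlalchemy import create_engine, text

    # Stand-in for one of this package's engines, e.g. decision_science.engine
    engine = create_engine("sqlite://")

    # Create and populate a sample table so the query below has data
    with engine.begin() as conn:
        conn.execute(text("create table customers (id integer, name text)"))
        conn.execute(text("insert into customers (id, name) values (1, 'Robert-Slack')"))

    # pandas.read_sql accepts any SQLAlchemy engine or connection
    df = pd.read_sql("select * from customers", engine)
    print(df)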


How to install this module inside SageMaker Jobs started from JupyterHub
--------------------------------------------------------------------------

Add the following commands to the Dockerfile to install this module in the Docker image. Preferably place them near the end, just before changing the working directory, since every step after these commands will be rebuilt and the build cache won't be used. Check the release notes `here <https://discourse.squadstack.com/t/sagemaker-update-launch-training-jobs-inside-vpc/56>`__.

.. code-block:: dockerfile

    COPY auctm-database-connector-*.tar.gz /mnt/
    RUN pip3 install /mnt/auctm-database-connector-*.tar.gz
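To make the layer-caching point concrete, a hypothetical tail of such a Dockerfile might look like the following (the earlier steps and the working directory are illustrative assumptions, not taken from the real image):

.. code-block:: dockerfile

    # ... earlier, slow-changing steps: base image, system packages,
    # requirements installation ...

    # Install the connector near the end so the layers above stay cached
    # when only this package changes
    COPY auctm-database-connector-*.tar.gz /mnt/
    RUN pip3 install /mnt/auctm-database-connector-*.tar.gz

    # Change the working directory only after the install steps
    WORKDIR /opt/program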

UI for the Decision Science Database
-------------------------------------

The UI is available at `Auctm PGAdmin <https://pgadmin.auctm.com/pgadmin4>`__. It provides a MySQL Workbench-like interface for our PostgreSQL database.
Users can log in here to view and make changes to their tables. The credentials can be obtained from the admins (currently AGB and abkunal).

How to Onboard a new user on Decision Science Database
********************************************************

- Login as the Admin Account at `Auctm PGAdmin <https://pgadmin.auctm.com/pgadmin4>`__.
- Create a **User** account for a new user and share the credentials with the user.
- Run the following SQL queries, replacing ``<role-name>``, ``<strong-password>``, and ``<schema-name>``.
  Note down the ``<role-name>`` and ``<strong-password>``.

.. code-block:: sql

    -- Create a new role and GRANT the all_read_only role to this new role
    CREATE ROLE <role-name> WITH
        LOGIN
        NOSUPERUSER
        NOCREATEDB
        NOCREATEROLE
        INHERIT
        NOREPLICATION
        CONNECTION LIMIT -1
        PASSWORD '<strong-password>';

    GRANT all_read_only TO <role-name>;

    -- Grant this role to the Admin user
    GRANT <role-name> TO decision_science;

    -- Create a schema for the role and make this role owner of the schema
    CREATE SCHEMA <schema-name> AUTHORIZATION <role-name>;

    -- Grant the all_read_only role READ access to this new schema
    GRANT USAGE ON SCHEMA <schema-name> TO all_read_only;
    GRANT SELECT ON ALL TABLES IN SCHEMA <schema-name> TO all_read_only;

- On JupyterHub, navigate to the user's home directory and create
  the credentials file.

.. code-block:: bash

    sudo su
    cd /home/<user>
    nano decision_science_creds.csv

    # e.g. cd /home/jupyter-kunal.yadav@squadr-319c0


- Put the role name and password in the CSV file as shown below.

.. code-block:: text

    username,password
    <role-name>,<strong-password>


- Change the ownership permissions for this file.

.. code-block:: bash

    chown <user> decision_science_creds.csv
    chgrp <user> decision_science_creds.csv

    # e.g. chown jupyter-kunal.yadav@squadr-319c0 decision_science_creds.csv
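After completing the steps above, the new role and schema can be sanity-checked from PGAdmin. A hedged sketch using standard PostgreSQL catalog views (same placeholders as before):

.. code-block:: sql

    -- Confirm the role exists and can log in
    SELECT rolname, rolcanlogin
    FROM pg_roles
    WHERE rolname = '<role-name>';

    -- Confirm the new schema is owned by the new role
    SELECT schema_name, schema_owner
    FROM information_schema.schemata
    WHERE schema_name = '<schema-name>';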


Documentation
--------------

We use SQLAlchemy because it provides additional features over raw ``psycopg2``.
You can check out its documentation here - `SQLAlchemy docs <https://docs.sqlalchemy.org/en/13/core/connections.html>`__.
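One SQLAlchemy feature worth calling out is bound parameters via ``text()``, which avoids interpolating values into SQL strings by hand. A minimal sketch against a stand-in in-memory SQLite engine (with this package you would use one of the engines above):

.. code-block:: python

    from sqlalchemy import create_engine, text

    engine = create_engine("sqlite://")  # stand-in for e.g. bifrost.engine

    # engine.begin() commits the transaction when the block exits
    with engine.begin() as conn:
        conn.execute(text("create table customers (id integer, name text)"))
        conn.execute(
            text("insert into customers (id, name) values (:id, :name)"),
            {"id": 1, "name": "Robert-Slack"},
        )

    # :id is bound safely instead of being formatted into the string
    with engine.connect() as conn:
        rows = conn.execute(
            text("select name from customers where id = :id"), {"id": 1}
        ).fetchall()

    print(rows)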

How to Modify this Package
----------------------------

1. This package works with Python 3.
2. After making the necessary changes, run the following commands to build the project and install the package:

.. code-block:: bash

    python setup.py sdist
    sudo -E python -m pip install dist/auctm-database-connector-0.1.tar.gz


3. You can uninstall the package by running the following command - ``sudo -E pip uninstall auctm-database-connector``
