
lapa_database

about

database layer for my personal server.

installation

pip install lapa_database[all]

usage (WIP)

change the password in config.ini.

set CREATE_SCHEMA = True to create the database from scratch.

set LOG_FILE_NAME and configure the logger.

link to lapa_database_structure

config

lapa_database\data\config.ini
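
Based on the settings mentioned above, config.ini might look roughly like this; the section names and layout are assumptions, and every value is a placeholder:

```ini
; illustrative sketch only - section names and key placement are assumptions
[environment]
; set to True to create the database from scratch
CREATE_SCHEMA = True
; database password (replace the placeholder)
PASSWORD = change_me

[square_logger]
; log file name picked up when the logger is initialised
LOG_FILE_NAME = lapa_database.log
```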

env

  • python>=3.12.0

changelog

v0.0.11

  • add a square_logger section to the environment and initialise the logger from those variables.
  • keep logger and lapa_database_structure pinned with >= instead of ~=.
  • update minimum version for lapa_commons.

v0.0.10

  • removed the import of the main file from config.
  • removed table-name validation from the CRUD-call pydantic models.
  • adjusted all CRUD calls to account for the pydantic model change from enum to str.

v0.0.9

  • removed config.example.ini.
  • bug fix: create_database -> data insertion now takes schema into account.

v0.0.8

  • add module name in config.
  • add first test case.
  • add dependencies for testing.
  • add GitHub workflow for testing.
  • remove config from gitignore.

v0.0.7

  • use lapa_commons to read config.

v0.0.6

  • change default value of ignore_filters_and_get_all to False.

v0.0.5

  • make CRUD logic return no rows by default when filters are empty.
  • add new parameters to make it easy to select all rows for edit, delete and get.
  • move logger to configuration.py to fix bug of multiple logs being created.
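
The filter semantics from this release can be sketched as follows; `select_rows` and its `rows`/`filters` arguments are illustrative stand-ins, only the parameter name `ignore_filters_and_get_all` comes from the changelog:

```python
def select_rows(rows, filters, ignore_filters_and_get_all=False):
    """Sketch of the v0.0.5 filter semantics (function name assumed)."""
    if not filters:
        # Empty filters return no rows unless the caller explicitly opts in.
        return list(rows) if ignore_filters_and_get_all else []
    # Otherwise keep only rows matching every filter key/value pair.
    return [r for r in rows if all(r.get(k) == v for k, v in filters.items())]

rows = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
print(select_rows(rows, {}))                                   # []
print(select_rows(rows, {}, ignore_filters_and_get_all=True))  # all rows
print(select_rows(rows, {"name": "b"}))                        # only id 2
```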

v0.0.4

  • rename to lapa database.
  • fix bug in create_database that occurred in default data insertion.
  • add logs to gitignore.
  • change psycopg2 to psycopg2-binary in setup.py.

v0.0.3

  • created a utils folder containing CommonOperations.py, which stores the common functions used across modules.
  • implemented web_socket for retrieving data from a table whenever a row is added, deleted or updated.
    • it takes database_name, table_name and schema_name as input through query params.
    • input for websocket
      • /ws/<database_name>/<table_name>/<schema_name>
        • E.g. /ws/game/player/public
    • initially returns all the rows; whenever an update is made, it returns the updated data.
    • trigger creation runs once the websocket connection is made; it first checks whether the trigger function already exists and creates it only if it does not.
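
The websocket input format above can be sketched with a small helper; `build_ws_path` is a hypothetical name, only the path layout and the example values come from the changelog:

```python
def build_ws_path(database_name: str, table_name: str, schema_name: str) -> str:
    """Build the /ws/<database_name>/<table_name>/<schema_name> path."""
    return f"/ws/{database_name}/{table_name}/{schema_name}"

# the example from the changelog: game database, player table, public schema
print(build_ws_path("game", "player", "public"))  # /ws/game/player/public
```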

v0.0.2

  • moved the databases folder and the table-related enums into a separate module for better version control.
  • add proper error message display on errors in configuration.py.
  • known bugs:
    • creating an engine on every FastAPI route call leaves idle sessions behind.
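
A common fix for this bug is to cache the engine so repeated route calls reuse one instance; the sketch below uses a placeholder object instead of a real SQLAlchemy engine, and `get_engine` is a hypothetical name:

```python
from functools import lru_cache


@lru_cache(maxsize=None)
def get_engine(database_url: str):
    # Stand-in for an engine factory such as sqlalchemy.create_engine;
    # caching means each URL builds exactly one engine, so route calls
    # no longer leave a trail of idle sessions behind.
    return object()


# Repeated calls with the same URL return the same engine instance.
assert get_engine("postgresql://localhost/game") is get_engine("postgresql://localhost/game")
```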

v0.0.1

  • initial implementation.
  • known bugs:
    • creating an engine on every FastAPI route call leaves idle sessions behind.

Feedback is appreciated. Thank you!
