
Repository Scanner Backend (RESC-Backend)

Table of contents

  1. About the component
  2. Getting started
  3. Testing
  4. Create a migration for database changes
  5. Documentation

About the component

The RESC-backend component includes database models, RESC Web service, Alembic scripts for database migration, RabbitMQ users, and queue creation.

Getting started

These instructions will help you to get a copy of the project up and running on your local machine for development and testing purposes.

Prerequisites

Run RESC Web service locally

Run RESC Web service locally from source

Ensure the resc database is up and running locally.
If you have already deployed RESC through Helm in Kubernetes, you can connect the RESC web service to that database.
Open a Git Bash terminal in the /components/resc-backend folder and run the commands below.

Create virtual environment:

pip install virtualenv
virtualenv venv
source venv/Scripts/activate

Install resc_backend package:

pip install pyodbc==4.0.32
pip install -e .

Set environment variables:

source db.env
export MSSQL_SCHEMA=master
export MSSQL_DB_PORT=30880
export MSSQL_PASSWORD="<enter password for local database>"
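Before starting the service, a small stdlib check (a hypothetical helper, not part of resc_backend) can confirm that the variables exported above are actually present in the environment:

```python
import os

def check_db_env(environ):
    """Return the names of required DB variables missing from `environ`.
    The variable names come from db.env and the exports above."""
    required = ["MSSQL_SCHEMA", "MSSQL_DB_PORT", "MSSQL_PASSWORD"]
    return [name for name in required if not environ.get(name)]

# A fully populated environment yields no missing names:
print(check_db_env({"MSSQL_SCHEMA": "master",
                    "MSSQL_DB_PORT": "30880",
                    "MSSQL_PASSWORD": "secret"}))  # []

# Against the real shell environment:
missing = check_db_env(os.environ)
if missing:
    print("Missing:", ", ".join(missing))
```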

Run Web service:

uvicorn resc_backend.resc_web_service.api:app --workers 1

Open http://127.0.0.1:8000 in a browser to access the API.

Run RESC Web service locally through make

Note: This procedure has only been tested on Linux and macOS. It may not work on machines with an Apple M1 chip due to lack of support in the MSSQL docker image.

Prerequisites:

  • Install Make on your system.
  • Update MSSQL_PASSWORD (password you want to set for local database) in db.env file.
  1. Create a Python virtual environment and install the resc_backend package:
make env
  2. Run the database locally:
make db

This target runs a local MSSQL instance in a container called resc-db. It creates and populates the resc database schema using Alembic and the SQL script located in test_data/database_dummy_data.sql.

Note: This target will also try to remove the DB container if it already exists.

If you want to remove this container, run: make cleandb

  3. Run the web service:
make rws

Open http://127.0.0.1:1234 in a browser to access the API.

  4. Clean up:
make clean

Run locally using docker

Run the RESC-Backend docker image locally with the following commands. Ensure the resc database is up and running locally; if you have already deployed RESC through Helm in Kubernetes, you can connect the RESC web service to that database.
  • Pull the docker image from the registry:
docker pull rescabnamro/resc-backend:1.0.0
  • Alternatively, build the docker image locally. Open a Git Bash terminal in the /components/resc-backend folder and run:
docker build -t rescabnamro/resc-backend:1.0.0 .
  • Use the following command to run the RESC backend. Update the MSSQL_PASSWORD value in the docker run command:
source db.env
docker run -p 8000:8000 \
  -e DB_CONNECTION_STRING -e MSSQL_ODBC_DRIVER -e MSSQL_USERNAME -e AUTHENTICATION_REQUIRED \
  -e MSSQL_DB_HOST="host.docker.internal" \
  -e MSSQL_PASSWORD="<enter password for local database>" \
  -e MSSQL_SCHEMA="master" \
  -e MSSQL_DB_PORT=30880 \
  --name resc-backend rescabnamro/resc-backend:1.0.0 \
  uvicorn resc_backend.resc_web_service.api:app --workers 1 --host 0.0.0.0 --port 8000

Open http://127.0.0.1:8000 in a browser to access the API.

Testing

(Back to top)

The commands below run the various unit and linting tests locally. To run them you need to install tox; this works on Linux, and on Windows with Git Bash.

Run the following commands to make sure the unit tests pass and the code meets this repository's quality standards:

pip install tox      # install tox locally

tox -v -e sort       # Run this command to validate the import sorting
tox -v -e lint       # Run this command to lint the code according to this repository's standard
tox -v -e pytest     # Run this command to run the unit tests
tox -v               # Run this command to run all of the above tests

Create a migration for database changes

(Back to top)

Use Alembic to create a new migration script

This command will create a new revision script in the ./alembic/versions directory:
alembic revision -m "<revision summary>"

The filename is prefixed with the revision identifier used by Alembic to keep track of the revision history. Make sure that the down_revision variable contains the identifier of the previous revision. For instance:

#d330d086edfe_first_revision.py
revision = 'd330d086edfe'
down_revision = None
...

#e653f899efgh_second_revision.py
revision = 'e653f899efgh'
down_revision = 'd330d086edfe'
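The down_revision links must form a single unbroken chain back to the base revision. A stand-alone script (a hypothetical helper, not part of this repository) can sanity-check that chain:

```python
def check_revision_chain(revisions):
    """revisions: dict mapping revision id -> down_revision (None = base).
    Returns True if every revision is reachable from the base in one chain."""
    by_down = {down: rev for rev, down in revisions.items()}
    chain = []
    current = by_down.get(None)  # the first revision has down_revision = None
    while current is not None:
        chain.append(current)
        current = by_down.get(current)
    return len(chain) == len(revisions)

# The two revisions from the example above form a valid chain:
history = {
    'd330d086edfe': None,            # first revision
    'e653f899efgh': 'd330d086edfe',  # second revision
}
print(check_revision_chain(history))  # True
```

A broken link (a down_revision pointing at an identifier that does not exist) makes the function return False, which is exactly the situation Alembic would reject at migration time.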

The generated script contains two functions:

  • The upgrade function that contains the revision changes.
  • The downgrade function that reverts these changes.
 

Use the --autogenerate parameter

Alembic provides an --autogenerate parameter to help with revision script creation. It works out the necessary changes by comparing the current database schema against the models declared in Python. To create such a revision, make sure you have a connection to a running database with an up-to-date schema version.
alembic revision --autogenerate -m "<revision summary>"

Note: Autogenerate cannot detect all the required changes. The created revision script must be carefully checked and tested.

 

Running migration and rollback

To upgrade or downgrade the database schema, use the following:
# Upgrade to specified revision identifier
alembic upgrade <revision_identifier>

# Upgrade to the latest revision
alembic upgrade head

# Upgrade to the next revision
alembic upgrade +1

# Run next revision from a specific revision
alembic upgrade <revision_identifier>+1

# Downgrade to base (no revision applied)
alembic downgrade base

# Downgrade to the previous revision
alembic downgrade -1

Note: A list of needed changes and a table containing alembic revision history are created during the first revision.

You can also check current revision information:

alembic current

And the revision history:

alembic history --verbose
 

Documentation

(Back to top)
