PYCOF (PYthon COmmon Functions)

A package for commonly used functions

1. Installation

You can get pycof from PyPI with:

pip install pycof

The library is supported on Windows, Linux, and macOS.

2. Usage

2.1 Config file for credentials

2.1.1 Save your credentials locally

The function remote_execute_sql looks by default for credentials located in /etc/config.json. On Windows, save the config file as C:/Windows/config.json.

The file follows the structure below:

{
	"DB_USER": "",
	"DB_PASSWORD": "",
	"DB_HOST": "",
	"DB_PORT": "",
	"DB_DATABASE": "",
	"__COMMENT_1__": "__ IAM specific, if useIAM=True __",
	"CLUSTER_NAME": "",
	"AWS_ACCESS_KEY_ID": "",
	"AWS_SECRET_ACCESS_KEY": "",
	"REGION": ""
}

On a Unix-based server, run:

sudo nano /etc/config.json

and paste the above JSON after filling in the empty strings.

Reminder: to save the file in nano, press CTRL + O and confirm with Enter, then press CTRL + X to exit.

On Windows, create the same file at C:/Windows/config.json with any text editor.

2.1.2 Pass your credentials in your code

Though this is highly discouraged, you can pass your credentials directly to remote_execute_sql with the credentials argument by providing a dictionary that uses the same keys as described in the previous section.
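
As an illustration, below is a minimal sketch of this discouraged approach (the placeholder values are assumptions, not real credentials):

from pycof import remote_execute_sql

## Hypothetical in-code credentials; the keys mirror /etc/config.json
credentials = {
	"DB_USER": "my_user",
	"DB_PASSWORD": "my_password",
	"DB_HOST": "my-host.example.com",
	"DB_PORT": "5432",
	"DB_DATABASE": "my_database"
}

## The dictionary is used instead of reading /etc/config.json
remote_execute_sql("SELECT * FROM SCHEMA.TABLE LIMIT 10", credentials=credentials)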

2.2 Load pycof

To load pycof in your script, you can use:

# Load pycof
import pycof as pc
# Or, load specific functions (or all of them) from pycof
from pycof import *

To execute an SQL query, follow the steps below:

from pycof import remote_execute_sql

## Set up the SQL query
sql = "SELECT * FROM SCHEMA.TABLE LIMIT 10"

## The function returns a pandas dataframe
df = remote_execute_sql(sql)

2.3 Available functions

The current version of the library provides the following functions (a few are illustrated right after the list):

  • verbose_display: extended version of print that can display strings, lists, and data frames, and wraps for loops with a tqdm progress bar.
  • remote_execute_sql: aggregated function for SQL queries to SELECT, INSERT, or DELETE.
  • add_zero: simple function to convert an int to a str, adding a leading 0 if the value is less than 10.
  • OneHotEncoding: performs One Hot Encoding on a dataframe for the provided column names. Keeps the original categorical variables if drop is set to False.
  • create_dataset: formats a pandas dataframe into the input shape Keras expects for an LSTM.
  • group: converts an int to a str with a thousands separator.
  • replace_zero: transforms 0 values to - for display purposes.
  • week_sunday: returns the week number of the last Sunday before a given date.
  • display_name: displays the current user name, as either first, last, or full name.
  • write: writes a str to a specific file (usually .txt) in one line of code.
  • str2bool: converts a string to a boolean.
  • wmape: computes the Weighted Mean Absolute Percentage Error between two columns.
  • mse: computes the Mean Squared Error between two columns. Returns the RMSE (Root MSE) if root is set to True.
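
As an illustration, a few of the simpler helpers could be used as follows (a minimal sketch; the exact signatures and outputs are assumed from the descriptions above, not verified):

import pycof as pc

## add_zero pads ints below 10 with a leading zero (assumed output: '09')
pc.add_zero(9)

## group adds a thousands separator (assumed output: '1,234,567')
pc.group(1234567)

## str2bool converts a string to a boolean (assumed output: True)
pc.str2bool('true')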

3. FAQ

3.1. How to use multiple credentials for remote_execute_sql?

The credentials argument accepts either a file name or a full path to the credentials file. You can keep multiple credential files, such as /etc/config.json, /etc/MyNewHost.json, and /home/OtherHost.json, and select one when calling remote_execute_sql (see the sketch after this list).

  • To use the /etc/config.json credentials, rely on the defaults by not passing the argument.
  • To use /etc/MyNewHost.json, pass either MyNewHost.json or the whole path.
  • To use /home/OtherHost.json, you need to pass the whole path.
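
A minimal sketch of these three cases, using the example file names above:

from pycof import remote_execute_sql

sql = "SELECT * FROM SCHEMA.TABLE LIMIT 10"

## Default: looks for /etc/config.json
remote_execute_sql(sql)

## File name only: resolved from the default credentials folder
remote_execute_sql(sql, credentials='MyNewHost.json')

## Full path: required when the file lives outside the default folder
remote_execute_sql(sql, credentials='/home/OtherHost.json')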

3.2. Can I query a Redshift cluster with IAM user credentials?

The function remote_execute_sql can use an IAM user's credentials. Ensure that your credentials file /etc/config.json includes the IAM access and secret keys together with the Redshift cluster information. The only change when calling the function is to set useIAM=True.
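
For example (a minimal sketch, assuming the IAM fields of /etc/config.json are filled in):

from pycof import remote_execute_sql

sql = "SELECT * FROM SCHEMA.TABLE LIMIT 10"

## AWS exchanges the IAM keys for temporary cluster credentials
remote_execute_sql(sql, useIAM=True)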

The function will then use the AWS access and secret keys to request a temporary user name and password from AWS to connect to the cluster. This is a much safer way to connect to a Redshift cluster than using the cluster's credentials directly.
