
A sample test package

Project description

afl-ai-utils

To cut a release, build the distributions and upload them to PyPI:

    rm -rf build dist
    python3 setup.py sdist bdist_wheel
    twine upload --repository pypi dist/*

Installation

    pip install afl-ai-utils

Usage

Slack Alerting

    from afl_ai_utils.slack_alerts import send_slack_alert
    send_slack_alert(info_alert_slack_webhook_url=None, red_alert_slack_webhook_url=None, slack_red_alert_userids=None, payload=None, is_red_alert=False)

    """Send a Slack message to a channel via a webhook.

Args:
    info_alert_slack_webhook_url(str): Infor slack channel url
    red_alert_slack_webhook_url(str): red alert channel url
    slack_red_alert_userids (list): userid's to mention in slack for red alert notification
    payload (dict): Dictionary containing Slack message, i.e. {"text": "This is a test"}
    is_red_alert (bool): Full Slack webhook URL for your chosen channel.

Returns:
    HTTP response code, i.e. <Response [503]>
"""

Write a pandas DataFrame to a BigQuery table

    def write_insights_to_bq_table(self, dataframe=None, schema=None, table_id=None, mode=None):

    >>> from afl_ai_utils.bigquery_utils import BigQuery
    >>> bq = BigQuery("keys.json")
    >>> bq.write_insights_to_bq_table(dataframe=None, schema=None, table_id=None, mode=None)
    
    
    """Insert a dataframe to bigquery

    Args:
        dataframe(pandas dataframe): for dataframe to be dumped to bigquery
        schema(BigQuery.Schema ): ex:
            schema = [
                        bigquery.SchemaField("date_range_start", bigquery.enums.SqlTypeNames.DATE),
                        bigquery.SchemaField("date_range_end", bigquery.enums.SqlTypeNames.DATE)
                    ]

        table_id (list): table_id in which dataframe need to be inserted e.g project_id.dataset.table_name = table_id
        mode(str): To append or replace the table - e.g mode = "append"  or mode="replace"
    Returns:
        returns as success message with number of inserted rows and table name
    """

Execute an arbitrary query against BigQuery

    def execute_query(self, query):

    >>> from afl_ai_utils.bigquery_utils import BigQuery
    >>> bq = BigQuery("keys.json")
    >>> df = bq.execute_query(query = "SELECT * FROM TABLE")
    
    
    """
    Args:
        query (query of any type SELECT/INSERT/DELETE ) 
    Returns:
        returns dataframe of execute query result
    """



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

afl-ai-utils-0.0.9.tar.gz (3.6 kB)

Uploaded Source

Built Distribution

afl_ai_utils-0.0.9-py3-none-any.whl (5.0 kB)

Uploaded Python 3

File details

Details for the file afl-ai-utils-0.0.9.tar.gz.

File metadata

  • Download URL: afl-ai-utils-0.0.9.tar.gz
  • Upload date:
  • Size: 3.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.6

File hashes

Hashes for afl-ai-utils-0.0.9.tar.gz
  • SHA256: 777848a92db546f72160bb8261261025c6bb0ba8d553e42353ed0473879cbb56
  • MD5: 1461afb1d7da8e9b35c5009215e751f5
  • BLAKE2b-256: fb17fc9402fe7fac72462a352aaac541f0d9f83c3fe63d2d907c431f0e3ace5e
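
To check a downloaded copy against the SHA256 digest above, a minimal sketch (assumes the file sits in the current directory):

    import hashlib

    EXPECTED_SHA256 = "777848a92db546f72160bb8261261025c6bb0ba8d553e42353ed0473879cbb56"

    with open("afl-ai-utils-0.0.9.tar.gz", "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()

    # A mismatch means the download is corrupted or not the published artifact.
    assert digest == EXPECTED_SHA256, f"unexpected SHA256: {digest}"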


File details

Details for the file afl_ai_utils-0.0.9-py3-none-any.whl.

File metadata

File hashes

Hashes for afl_ai_utils-0.0.9-py3-none-any.whl
  • SHA256: 15e758831e981c0152eac146c464c0b4fce876d1f10b084ec9905a5214108f49
  • MD5: 1ec0e531c892191da8f3cf7adb2a7051
  • BLAKE2b-256: 8e7fe0d8db4a81105e66aadd880fe683ca9b5c0c03d70e345c20ced217057035

