
SIBR Module

A collection of helper modules for interacting with Google Cloud Platform services, designed to simplify common workflows. This package provides easy-to-use classes for BigQuery, Google Cloud Storage, Secret Manager, and Cloud Logging.

Features

  • BigQuery: Easily upload DataFrames to BigQuery tables with automatic schema detection and support for append, replace, and merge operations.
  • CStorage: Upload and download files to and from Google Cloud Storage buckets.
  • SecretsManager: Securely access secrets from Google Secret Manager.
  • Logger: A flexible logger that supports both local file logging and integration with Google Cloud Logging.

Installation

You can install the package from PyPI:

pip install sibr-module

Quickstart

Ensure you are authenticated with Google Cloud. You can do this by running:

gcloud auth application-default login

If you are running locally with a service-account key instead, point the GOOGLE_APPLICATION_CREDENTIALS environment variable at your key file: GOOGLE_APPLICATION_CREDENTIALS=path-to-your-credentials.

import pandas as pd 
from sibr_module import BigQuery, CStorage, Logger
from dotenv import load_dotenv
load_dotenv()

1. Set up a logger

This will log to a local file and, if enabled, to Google Cloud Logging.

logger = Logger(log_name="my_app_logger", enable_cloud_logging=True) 
logger.info("Application starting up.")

2. Prepare your data

data = {'name': ['Alice', 'Bob'], 'score': [85, 92]} 
my_dataframe = pd.DataFrame(data) 

3. Use the BigQuery helper

try: 
    # Initialize the client with your Google Cloud Project ID
    bq_client = BigQuery(project_id="your-gcp-project-id", logger=logger)

    # Upload the DataFrame to a BigQuery table
    bq_client.to_bq(
        df=my_dataframe,
        dataset_name="my_dataset", 
        table_name="my_table", 
        if_exists="append"  # Options: 'append', 'replace', or 'merge' 
    ) 
 
    logger.info("Successfully uploaded data to BigQuery.") 
 
except Exception as e: 
    logger.error(f"An error occurred: {e}") 

4. Use the CStorage helper

try:
    # Create a dummy file to upload
    with open("my_local_file.txt", "w") as f:
        f.write("This is a test file.")

    # Initialize the client
    storage_client = CStorage(project_id="your-gcp-project-id", bucket_name="your-bucket-name", logger=logger)

    # Upload the file
    storage_client.upload(
        local_file_path="my_local_file.txt",
        destination_blob_name="my_remote_folder/my_remote_file.txt"
    )
    logger.info("Successfully uploaded file to GCS.")

    # Download the file
    storage_client.download(
        source_blob_name="my_remote_folder/my_remote_file.txt",
        destination_file_path="downloaded_file.txt"
    )
    logger.info("Successfully downloaded file from GCS.")

except Exception as e:
    logger.error(f"An error occurred with Cloud Storage: {e}")

Usage Details

BigQuery

The BigQuery class handles interactions with Google BigQuery.

  • to_bq(df, dataset_name, table_name, if_exists='append', merge_on=None): Uploads a pandas DataFrame.
    • if_exists='append': Adds data to an existing table.
    • if_exists='replace': Deletes the existing table and creates a new one.
    • if_exists='merge': Updates existing rows and inserts new ones. Requires merge_on to be set with a list of key columns.
  • read_bq(query): Executes a query and returns the result as a pandas DataFrame.
  • dtype_map: Optional argument for supplying a custom mapping of pandas dtypes to BigQuery column types, overriding the automatic schema detection.
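The merge mode and read_bq can be combined into an upsert-then-verify round trip. The sketch below is illustrative: the dataset, table, and key-column names ("user_id") are assumptions, and the BigQuery call is wrapped in a try/except so the snippet degrades gracefully without credentials.

```python
import pandas as pd

# Rows to upsert; "user_id" acts as the merge key (an assumed column name)
updates = pd.DataFrame({
    "user_id": [1, 2],
    "score": [88, 95],
})

try:
    from sibr_module import BigQuery

    bq = BigQuery(project_id="your-gcp-project-id")

    # 'merge' updates rows whose user_id already exists and inserts the rest;
    # merge_on is required for this mode
    bq.to_bq(
        df=updates,
        dataset_name="my_dataset",
        table_name="scores",
        if_exists="merge",
        merge_on=["user_id"],
    )

    # Read the table back as a DataFrame
    result = bq.read_bq("SELECT user_id, score FROM my_dataset.scores")
except Exception as exc:
    print(f"Skipping BigQuery round trip: {exc}")
```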

CStorage

The CStorage class simplifies file operations with Google Cloud Storage.

  • upload(local_file_path, destination_blob_name): Uploads a local file to the bucket.
  • download(source_blob_name, destination_file_path): Downloads a file from the bucket.

SecretsManager

Access your secrets easily.

  • get_secret(secret_id): Retrieves the latest version of a secret by its ID.
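A minimal sketch of retrieving a secret. The constructor argument and the secret ID ("db-password") are assumptions for illustration; check the class signature in your installed version. The call is guarded so the snippet does not crash without credentials.

```python
secret_id = "db-password"  # illustrative secret ID

try:
    from sibr_module import SecretsManager

    # project_id argument is an assumption about the constructor
    secrets = SecretsManager(project_id="your-gcp-project-id")

    # Fetches the latest version of the secret
    db_password = secrets.get_secret(secret_id)
except Exception as exc:
    print(f"Skipping Secret Manager call: {exc}")
```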

Logger

A flexible logger for both local and cloud-based logging.

  • Logger(log_name, root_path=None, write_type='a', enable_cloud_logging=False, cloud_log_name=None): Initializes the logger. Cloud logging is off by default; pass enable_cloud_logging=True to also send records to Google Cloud Logging.
  • log_level: Property to get or set the logging level (e.g., 'INFO', 'DEBUG').
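A short sketch of local-only logging with a runtime level change via the log_level property. The log name and messages are illustrative, and the debug method is assumed to mirror the standard logging API; the body is guarded so it degrades gracefully if the package is absent.

```python
chosen_level = "DEBUG"  # level to switch to at runtime

try:
    from sibr_module import Logger

    # Local file logging only; cloud logging stays disabled by default
    logger = Logger(log_name="example_logger")

    # Raise verbosity at runtime through the log_level property
    logger.log_level = chosen_level
    logger.debug("Now visible at DEBUG level.")
except Exception as exc:
    print(f"Skipping Logger demo: {exc}")
```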

Contributing

Contributions are welcome! Please open an issue or submit a pull request if you have any improvements or bug fixes.

License

This project is licensed under the MIT License. See the LICENSE file for details.
