Solution for DS Team

Project description

utilsds

Utilsds is a library of classes and functions commonly used in data science projects, including:

  • algorithm:

    • Algorithm: Base class for fitting and training machine learning models and retrieving their hyperparameters.
  • data_ops:

    • DataOperations: Handle data operations locally and with Google Cloud services (BigQuery and Cloud Storage).
    • BigQuery operations:
      • load_bq_data: Load data from tables, views, and SQL files.
      • save_bq_view, save_bq_table: Save views and tables.
      • load_bq_procedure: Execute stored procedures.
      • load_bq_details: Get table/view details and schema.
      • delete_bq_data: Delete data with safety confirmations.
      • dry_run: Perform dry runs to estimate query costs.
    • Cloud Storage operations:
      • save_gcs_bucket: Create buckets.
      • save_gcs_file, load_gcs_file: Save and load files (.pkl, .json, .csv, .html, .sql).
    • Local file operations:
      • save_local_file, load_local_file: Save and load files (.pkl, .json, .csv, .html, .sql).
  • data_processing:

    • SkewnessTransformer: Transform skewed data using various methods (IHS, neglog, Yeo-Johnson, quantile).
    • NullReplacer: Replace null values in specified columns with configurable strategies.
    • ColumnDropper: Drop specified columns from a DataFrame.
    • OutliersCleaner: Clean outliers by clipping values outside specified percentile ranges.
    • CategoricalMapper: Map values in categorical columns according to a specified mapping scheme.
    • NumericalMapper: Convert numerical columns to categorical by binning.
    • Encoder: One-hot encode categorical columns in the data.
    • Normalizer: Normalize numerical columns using a provided scaler.
  • data_split:

    • train_test_validation_split: Split data into training, testing, and validation sets.
    • resample_X_y: Resample the training data and target column.
  • ds_statistics:

    • test_kruskal_wallis: Perform the Kruskal-Wallis statistical test.
    • test_agosto_pearsona: Test for normality using the D'Agostino-Pearson test.
  • evaluate:

    • ModelEvaluator: Evaluate models and generate plots for diagnostics.
    • ShapExplainer: Explain model predictions using SHAP values.
  • experiments:

    • VertexExperiment: Manage experiments with Vertex AI.
  • optuna:

    • Optuna: Optimize hyperparameters using Optuna.
  • metrics:

    • Metrics: Calculate metrics for both classification and regression models.
  • modeling:

    • Modeling: Manage modeling, metrics, and logging with Vertex AI.
  • Supervised:

    • LazyClassifier: A classifier that automatically trains and evaluates multiple models.
    • LazyRegressor: A regressor that automatically trains and evaluates multiple models.
    • get_card_split: Split categorical columns into low- and high-cardinality groups.
    • adjusted_rsquared: Calculate adjusted R-squared for regression models.
  • visualization:

    • MetricsPlot: Compare metrics for different parameter values.
    • Radar: Create radar plots for visualizing data.
    • cluster_characteristics: Analyze cluster characteristics.
    • comparison_density: Compare density distributions.
    • elbow_visualisation: Visualize the elbow method for clustering.
    • describe_clusters_metrics: Describe metrics for clusters.
    • category_null_variables: Visualize null variables in categorical data.
    • normal_distr_plots: Visualize normal distribution plots.
    • distplot_limitations: Visualize limitations of distplot.
    • boxplot_limitations: Visualize limitations of boxplot.
    • violinplot_limitations: Visualize limitations of violinplot.
    • countplot_limitations: Visualize limitations of countplot.
    • categorical_variable_perc: Visualize percentage of categorical variables.
    • spearman_correlation: Visualize Spearman correlation.
    • calculate_crammers_v: Calculate Cramér's V.
  • what_if_streamlit:

    • ShapSaver: Save SHAP explainer components for lazy loading in what-if analysis.
    • ColumnMetadataGenerator: Generate column metadata from a DataFrame or CSV file.
  • monitoring:

    • mapping: Create column mapping from configuration file for Evidently.
    • test_data: Test data for issues using Evidently test suites.
    • check_data_drift: Check data for drift using Evidently metrics.
    • send_email_with_table: Send email notifications with HTML tables for monitoring alerts.
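Some of the helpers above are simple closed-form statistics. For example, adjusted_rsquared (listed under Supervised) is the standard adjusted R-squared, which penalizes R² for the number of predictors. A minimal sketch, assuming a plain (r2, n, p) signature, which may differ from the library's actual interface:

```python
def adjusted_rsquared(r2: float, n: int, p: int) -> float:
    """Adjusted R-squared for n observations and p predictors.

    Formula: 1 - (1 - R^2) * (n - 1) / (n - p - 1).
    Penalizes R^2 as predictors are added, so it only rises
    when a new predictor improves the fit more than chance.
    """
    return 1 - (1 - r2) * (n - 1) / (n - p - 1)

# With 100 rows and 5 predictors, R^2 = 0.90 adjusts slightly downward.
adj = adjusted_rsquared(0.90, n=100, p=5)
```

Note that a perfect fit (R² = 1) stays at 1 regardless of p, while any imperfect fit is pulled down as predictors are added.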

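The IHS option of SkewnessTransformer refers to the inverse hyperbolic sine, arcsinh(x) = log(x + sqrt(x² + 1)), which compresses heavy right tails like a log transform but is also defined for zero and negative values. A self-contained sketch of the idea; the class interface shown here is hypothetical, not the library's API:

```python
import numpy as np
import pandas as pd


class SkewnessTransformer:
    """Hypothetical sketch: reduce skew in selected columns with IHS (arcsinh)."""

    def __init__(self, columns):
        self.columns = columns

    def transform(self, df: pd.DataFrame) -> pd.DataFrame:
        # arcsinh behaves like log for large values but passes through zero,
        # so zero and negative entries need no shifting.
        out = df.copy()
        out[self.columns] = np.arcsinh(out[self.columns])
        return out


df = pd.DataFrame({"income": [0.0, 100.0, 10_000.0, 1_000_000.0]})
transformed = SkewnessTransformer(["income"]).transform(df)
```

The transform is monotone, so row ordering by the column is preserved while the spread between small and large values shrinks.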
Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

utilsds-2.0.0.tar.gz (48.8 kB)


Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

utilsds-2.0.0-py3-none-any.whl (52.3 kB)


File details

Details for the file utilsds-2.0.0.tar.gz.

File metadata

  • Download URL: utilsds-2.0.0.tar.gz
  • Upload date:
  • Size: 48.8 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for utilsds-2.0.0.tar.gz
  • SHA256: 482b15ffe0759cc13750f6d129883ab4b7ebca09cafea65720c8c15f21fb3393
  • MD5: 33e1ad40eb0fb1fdea8b000f47ac4416
  • BLAKE2b-256: 738458edfa9f79fb4f8c3b6cf4a773f64c21fb63e85a99d8b162738a14484a11

See more details on using hashes here.

File details

Details for the file utilsds-2.0.0-py3-none-any.whl.

File metadata

  • Download URL: utilsds-2.0.0-py3-none-any.whl
  • Upload date:
  • Size: 52.3 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.3

File hashes

Hashes for utilsds-2.0.0-py3-none-any.whl
  • SHA256: 143b5308ede3843eda072ad0775927d84590994291499bb43b68ca793934cbee
  • MD5: bd889dca8b0cea367ff12f2b4679ccb7
  • BLAKE2b-256: c318adf3470c202a35a4315ed33a4dfef3bb6128349a53d97797e289896fae71

See more details on using hashes here.
