
ePump Inspector

A Python library and CLI tool for inspecting and analyzing ePump.app Firestore database data. Designed for both command-line usage and programmatic access for machine learning and data analysis tasks.

Features

  • ๐Ÿ” Database Inspection: Complete read-only access to Firestore database
  • ๐Ÿ“Š ML/Analysis Ready: Core library designed for machine learning workflows
  • ๐Ÿ–ฅ๏ธ Command Line Interface: Convenient CLI for quick database queries
  • ๐ŸŽฏ Type Safety: Full type hints and structured configuration
  • ๐Ÿ” Security First: Comprehensive .gitignore patterns for credentials
  • ๐Ÿ“ˆ Performance: Efficient querying with configurable limits
  • ๐Ÿ Python 3.8+: Modern Python with Poetry package management

Installation

Using Poetry (Recommended)

# Install dependencies
poetry install

# Run CLI commands
poetry run epump-inspector users

# Or activate the virtual environment
# (on Poetry 2.x, `poetry shell` requires the poetry-plugin-shell plugin)
poetry shell
epump-inspector users

Using pip

pip install -e .
epump-inspector users

Configuration

Environment Variables

Create a .env file in the project root:

FIREBASE_PROJECT_ID=epump-e3713
FIREBASE_CREDENTIALS=/path/to/service-account.json
DEFAULT_LIMIT=10
LOG_LEVEL=warning
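As an illustration of how these variables map onto configuration at runtime, here is a minimal sketch. The helper below is hypothetical (it is not part of the library's public API); it only shows the variable names and the defaults this README documents.

```python
import os

# Illustrative helper (not the library's actual loader) mapping the
# environment variables above onto InspectorConfig-style settings,
# with the documented defaults.
def load_inspector_env(environ=None):
    env = os.environ if environ is None else environ
    return {
        "project_id": env.get("FIREBASE_PROJECT_ID"),
        "credentials_path": env.get("FIREBASE_CREDENTIALS"),
        "default_limit": int(env.get("DEFAULT_LIMIT", "10")),
        "log_level": env.get("LOG_LEVEL", "warning"),
    }

settings = load_inspector_env({
    "FIREBASE_PROJECT_ID": "epump-e3713",
    "FIREBASE_CREDENTIALS": "/path/to/service-account.json",
})
print(settings)
```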

Firebase Setup

  1. Go to Firebase Console
  2. Select your project (epump-e3713)
  3. Go to Project Settings → Service Accounts
  4. Generate new private key
  5. Save JSON file and set FIREBASE_CREDENTIALS path

Google Colab Setup

For Google Colab environments, use the specialized authentication helpers:

from epump_inspector import DatabaseInspector, InspectorConfig
from epump_inspector.colab_auth import setup_colab_auth, colab_quick_start

# Show setup instructions
colab_quick_start()

# Option 1: Paste service account JSON
service_account_json = '''{"type": "service_account", ...}'''
config_dict = setup_colab_auth(service_account_json)
config = InspectorConfig(**config_dict)

# Option 2: Upload service account file
from google.colab import files
from epump_inspector.colab_auth import setup_colab_from_file

uploaded = files.upload()
file_name = list(uploaded.keys())[0]
config_dict = setup_colab_from_file(file_name)
config = InspectorConfig(**config_dict)

# Use the inspector
inspector = DatabaseInspector(config)
with inspector.connection():
    users = inspector.get_users(limit=5)
    print(f"Found {len(users)} users")

CLI Usage

Basic Commands

# Get all users
epump-inspector users

# Get comprehensive user data (perfect for ML)
epump-inspector user-data --user-email user@example.com

# Get system summary
epump-inspector summary --limit 5

# Get workouts for specific user
epump-inspector workouts --user-email user@example.com

# Get API keys status
epump-inspector api-keys

# Pretty print JSON output
epump-inspector users --pretty

Advanced Commands

# Get specific workout analysis
epump-inspector workout-analyses --user-email user@example.com --workout-id WORKOUT_ID

# Get AI feedback data
epump-inspector ai-feedback --user-email user@example.com

# Check migration status
epump-inspector migration-status --user-email user@example.com

# Generic collection access
epump-inspector collection --path "users/user@example.com/workout_sessions"
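The generic `collection` command takes slash-separated Firestore paths like the one above. A tiny hypothetical helper shows the path scheme (the subcollection names are taken from this README's examples, not an exhaustive list):

```python
# Hypothetical helper for building the slash-separated Firestore paths
# accepted by the generic `collection` command.
def user_subcollection_path(user_email: str, subcollection: str) -> str:
    return f"users/{user_email}/{subcollection}"

print(user_subcollection_path("user@example.com", "workout_sessions"))
# prints: users/user@example.com/workout_sessions
```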

Library Usage

Basic Example

from epump_inspector import DatabaseInspector, InspectorConfig

# Using environment configuration
inspector = DatabaseInspector()

# Or custom configuration
config = InspectorConfig(
    project_id="epump-e3713",
    credentials_path="/path/to/credentials.json",
    default_limit=20,
    log_level="info"
)
inspector = DatabaseInspector(config)

# Use context manager for automatic connection handling
with inspector.connection():
    users = inspector.get_users(limit=10)
    workouts = inspector.get_workouts(user_email="user@example.com")
    
    print(f"Found {len(users)} users and {len(workouts)} workouts")

ML/Data Analysis Example

from epump_inspector import DatabaseInspector

inspector = DatabaseInspector()

with inspector.connection():
    # Get comprehensive user data for ML analysis
    user_data = inspector.get_user_data(
        user_email="user@example.com",
        include_analyses=True  # Include workout analyses and AI suggestions
    )
    
    # Extract training data for AI models
    ai_feedback = inspector.get_ai_feedback(limit=1000)
    workout_analyses = inspector.get_workout_analyses(limit=1000)
    
    # Get system overview
    summary = inspector.get_all_users_summary(limit=100)
    
    # User data contains:
    # - workouts, supplements, diets
    # - health_goals, workout_goals, unified_goals
    # - ai_feedback, workout_analyses, ai_suggestions
    # - fitness_profile, active_diet
    # - goal_migration_completed status
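Once `get_user_data` returns, a common next step is flattening the nested dict into numeric features. The sketch below assumes record shapes (field names like `duration_min` and `rating` are illustrative, not the library's guaranteed schema):

```python
# Illustrative feature extraction from a user_data-style dict.
# The sample below assumes plausible record shapes; the actual
# Firestore documents may differ.
sample_user_data = {
    "workouts": [{"duration_min": 45}, {"duration_min": 30}],
    "ai_feedback": [{"rating": 4}, {"rating": 5}],
    "goal_migration_completed": True,
}

def extract_features(user_data: dict) -> dict:
    workouts = user_data.get("workouts", [])
    ratings = [f.get("rating", 0) for f in user_data.get("ai_feedback", [])]
    return {
        "workout_count": len(workouts),
        "avg_feedback_rating": sum(ratings) / len(ratings) if ratings else 0.0,
        "migrated": int(user_data.get("goal_migration_completed", False)),
    }

print(extract_features(sample_user_data))
# prints: {'workout_count': 2, 'avg_feedback_rating': 4.5, 'migrated': 1}
```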

Error Handling

from epump_inspector import DatabaseInspector, InspectorError, DataNotFoundError

inspector = DatabaseInspector()

try:
    with inspector.connection():
        # This will raise DataNotFoundError if not found
        workout = inspector.get_workout("user@example.com", "nonexistent-id")
        
except DataNotFoundError as e:
    print(f"Data not found: {e}")
except InspectorError as e:
    print(f"Inspector error: {e}")
except Exception as e:
    print(f"Unexpected error: {e}")

API Reference

Core Classes

DatabaseInspector

Main class for database operations.

Key Methods:

  • get_users(limit=10) - Get all users
  • get_workouts(user_email=None, limit=10) - Get workouts
  • get_user_data(user_email, include_analyses=True) - Comprehensive user data
  • get_all_users_summary(limit=10) - System overview

InspectorConfig

Configuration management.

Properties:

  • project_id - Firebase project ID
  • credentials_path - Path to service account JSON
  • default_limit - Default query limit
  • log_level - Logging level

Available Data Types

  • Users: User profiles and metadata
  • Workouts: Exercise sessions and workout data
  • Supplements: Supplement tracking data
  • Diets: Diet plans and active diet settings
  • Goals: Health goals, workout goals, unified goals
  • AI Data: AI suggestions, feedback, and workout analyses
  • API Keys: System API keys and tokens
  • Migration: Goal migration status

Development

Setup Development Environment

# Clone and install
git clone <repository>
cd tool
poetry install

# Run tests
poetry run pytest

# Code formatting
poetry run black .
poetry run flake8 .
poetry run mypy .

# Build package
poetry build

Project Structure

tool/
├── epump_inspector/          # Core library package
│   ├── __init__.py           # Package exports
│   ├── core.py               # Main DatabaseInspector class
│   ├── config.py             # Configuration management
│   ├── cli.py                # Command-line interface
│   └── exceptions.py         # Custom exceptions
├── database/                 # Database abstraction layer
│   ├── __init__.py           # Abstract DAO interface
│   └── firestore_dao.py      # Firestore implementation
├── inspector.py              # Legacy CLI (maintained for compatibility)
├── cli.py                    # Standalone CLI wrapper
├── example_ml_usage.py       # ML usage examples
├── pyproject.toml            # Poetry configuration
├── .env.example              # Environment template
└── README.md                 # This file

Security

  • All sensitive files are excluded via .gitignore
  • Credentials are loaded from environment variables
  • No hardcoded API keys or secrets
  • Type-safe configuration validation
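The credential-exclusion patterns referred to above would look something like this (an illustrative fragment, not the project's actual .gitignore):

```gitignore
# Environment files (keep the committed template)
.env
.env.*
!.env.example

# Service account credentials
service-account*.json
*credentials*.json
```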

Examples

Local Examples

See example_ml_usage.py for comprehensive ML/data analysis examples including:

  • User workout pattern analysis
  • Multi-user performance comparison
  • AI training data extraction
  • System health reporting

Google Colab Examples

See colab_example.py for a complete Google Colab notebook example that includes:

  • Step-by-step Colab authentication setup
  • Data collection and analysis workflows
  • Visualization examples with matplotlib/seaborn
  • ML feature extraction and export
  • Data download for further analysis

Quick Colab Start:

# In Google Colab
!git clone https://github.com/your-repo/epump.app.git
%cd epump.app/tool
!pip install -e .

from epump_inspector.colab_auth import colab_quick_start
colab_quick_start()  # Shows detailed setup instructions

Support

For issues and questions:

  1. Check the examples in example_ml_usage.py
  2. Review the CLI help: epump-inspector --help
  3. Check logs with --log-level debug

License

MIT License - see LICENSE file for details.
