
Python client for GCP Resource Analysis with security, compliance & optimization insights


GCP Resource Analysis

๐Ÿ” Comprehensive Google Cloud Platform resource analysis for security, compliance, and optimization

A Python package that provides Azure Resource Graph-equivalent functionality for Google Cloud Platform, enabling deep analysis of your GCP resources via Cloud Asset Inventory.

License: MIT | Python 3.8+ | Google Cloud

🎯 Features

📊 Comprehensive Analysis

  • Storage Analysis: Cloud Storage, Cloud SQL, BigQuery, Persistent Disks
  • Compute Analysis: Compute Engine, GKE, Cloud Run, App Engine
  • Network Analysis: VPC, Firewall Rules, Load Balancers
  • IAM Analysis: Service Accounts, Roles, Permissions
  • Container Analysis: GKE clusters, Cloud Run services, Artifact Registry

๐Ÿ›ก๏ธ Security & Compliance

  • Encryption method detection (CMEK vs Google-managed)
  • Public access configuration analysis
  • Network security assessment
  • IAM privilege escalation detection
  • Compliance scoring with detailed findings

💰 Cost Optimization

  • Unused resource identification
  • Right-sizing recommendations
  • Storage class optimization
  • Reserved instance opportunities

📈 Reporting & Analytics

  • Application-based compliance summaries
  • Risk-based resource prioritization
  • CSV/JSON export capabilities
  • HTML compliance reports

🚀 Quick Start

Installation

pip install gcp-resource-analysis

Basic Usage

from gcp_resource_analysis import GCPResourceAnalysisClient

# Initialize client
client = GCPResourceAnalysisClient(
    project_ids=["your-project-id-1", "your-project-id-2"]
)

# Run comprehensive analysis
results = client.query_comprehensive_storage_analysis()

# View high-risk resources
for resource in results['storage_security']:
    if resource.is_high_risk:
        print(f"⚠️ {resource.storage_resource}: {resource.compliance_risk}")

# Get compliance summary
summaries = client.get_storage_compliance_summary()
for summary in summaries:
    print(f"📊 {summary.application}: {summary.compliance_score}% compliance")

Command Line Interface

# Run storage analysis
gcp-analysis storage --projects your-project-id --export-csv

# Run comprehensive analysis
gcp-analysis comprehensive --projects project1,project2 --output report.html

# Get compliance summary
gcp-analysis compliance --projects your-project-id --format json

📋 Prerequisites

1. Authentication Setup

Option A: Service Account (Recommended)

# Create service account
gcloud iam service-accounts create gcp-resource-analyzer

# Grant permissions
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
    --member="serviceAccount:gcp-resource-analyzer@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/cloudasset.viewer"

# Create and download key
gcloud iam service-accounts keys create ~/gcp-analyzer-key.json \
    --iam-account=gcp-resource-analyzer@YOUR_PROJECT_ID.iam.gserviceaccount.com

Option B: User Account

gcloud auth application-default login

2. Enable Required APIs

gcloud services enable cloudasset.googleapis.com
gcloud services enable storage.googleapis.com  
gcloud services enable sqladmin.googleapis.com
gcloud services enable compute.googleapis.com

3. IAM Permissions

Your service account or user needs these roles:

  • roles/cloudasset.viewer - View resource inventory
  • roles/storage.objectViewer - Analyze storage resources
  • roles/cloudsql.viewer - Analyze Cloud SQL instances
  • roles/compute.viewer - Analyze compute resources
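
All four bindings can be granted in one pass. A small Python helper that prints the corresponding gcloud commands (YOUR_PROJECT_ID and the analyzer service-account name are placeholders, matching the setup above — substitute your own values, or pipe the output to a shell):

```python
# Generate the gcloud commands that grant each role required by the analyzer.
# PROJECT_ID is a placeholder; the member matches the service account
# created in the Authentication Setup step above.
PROJECT_ID = "YOUR_PROJECT_ID"
MEMBER = f"serviceAccount:gcp-resource-analyzer@{PROJECT_ID}.iam.gserviceaccount.com"

ROLES = [
    "roles/cloudasset.viewer",
    "roles/storage.objectViewer",
    "roles/cloudsql.viewer",
    "roles/compute.viewer",
]

for role in ROLES:
    print(
        f"gcloud projects add-iam-policy-binding {PROJECT_ID} "
        f"--member={MEMBER} --role={role}"
    )
```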

📚 Documentation

Core Components

๐Ÿ—๏ธ GCPResourceAnalysisClient

Main client class providing all analysis functionality.

client = GCPResourceAnalysisClient(
    project_ids=["project-1", "project-2"],
    credentials_path="/path/to/service-account.json"  # Optional
)

📦 Storage Analysis

# Security analysis
storage_resources = client.query_storage_analysis()

# Access control analysis  
access_results = client.query_storage_access_control()

# Backup analysis
backup_results = client.query_storage_backup_analysis()

# Cost optimization
optimization_results = client.query_storage_optimization()

# Compliance summary
compliance_summaries = client.get_storage_compliance_summary()

💻 Compute Analysis

# VM and compute security
compute_resources = client.query_compute_analysis()

# Security configurations
security_results = client.query_compute_security()

# Right-sizing opportunities
optimization_results = client.query_compute_optimization()

๐ŸŒ Network Analysis

# Network security
network_resources = client.query_network_analysis()

# Firewall rule analysis
firewall_results = client.query_firewall_analysis()

# Load balancer security
lb_results = client.query_load_balancer_analysis()

Data Models

Storage Resource Model

@dataclass
class GCPStorageResource:
    application: str                 # Application name from labels
    storage_resource: str           # Resource name
    storage_type: str              # Cloud Storage, Cloud SQL, etc.
    encryption_method: str         # CMEK, Google-managed, etc.
    security_findings: str         # Security configuration details
    compliance_risk: str           # Risk level and description
    resource_group: str            # Project ID
    location: str                  # GCP region/zone
    additional_details: str        # Extra configuration info
    resource_id: str              # Full GCP resource identifier

Compliance Summary Model

@dataclass  
class GCPStorageComplianceSummary:
    application: str                    # Application name
    total_storage_resources: int        # Total resource count
    storage_bucket_count: int          # Cloud Storage buckets
    persistent_disk_count: int         # Persistent disks
    cloud_sql_count: int              # Cloud SQL instances
    bigquery_dataset_count: int       # BigQuery datasets
    encrypted_resources: int           # Encrypted resource count
    secure_transport_resources: int    # HTTPS/TLS enabled
    network_secured_resources: int     # Network restrictions
    resources_with_issues: int         # Resources with problems
    compliance_score: float           # Score 0-100
    compliance_status: str            # Status description
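
The README does not spell out how compliance_score is computed. One plausible derivation from the counts above, assuming the score is simply the share of resources without issues (the package's actual formula may weight findings differently):

```python
# Hypothetical derivation of compliance_score from the summary counts.
def compliance_score(total: int, with_issues: int) -> float:
    """Percentage of resources with no outstanding findings (0-100)."""
    if total == 0:
        return 100.0  # no resources means nothing is out of compliance
    return round(100.0 * (total - with_issues) / total, 1)

print(compliance_score(45, 3))  # 45 resources, 3 with issues → 93.3
```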

๐Ÿ” Analysis Examples

Security Analysis

# Find publicly accessible storage
access_results = client.query_storage_access_control()
public_buckets = [r for r in access_results if r.allows_public_access]

# Find unencrypted resources
storage_results = client.query_storage_analysis()
unencrypted = [r for r in storage_results if not r.is_encrypted]

# High-risk configurations
high_risk = [r for r in storage_results if r.is_high_risk]

Cost Optimization

# Find unused resources
optimization_results = client.query_storage_optimization()
unused = [r for r in optimization_results if "unused" in r.utilization_status.lower()]

# High savings potential
high_savings = [r for r in optimization_results if r.has_high_savings_potential]

# Storage class optimization
storage_class_opps = [r for r in optimization_results 
                     if "lifecycle" in r.optimization_recommendation.lower()]

Compliance Reporting

from gcp_resource_analysis.utils import create_compliance_report

# Generate HTML compliance report
summaries = client.get_storage_compliance_summary()
create_compliance_report(summaries, "compliance_report.html")

# Export to CSV
from gcp_resource_analysis.utils import export_to_csv
export_to_csv(storage_results, "storage_analysis.csv")

๐Ÿ› ๏ธ Advanced Usage

Multi-Project Analysis

# Analyze across multiple projects
client = GCPResourceAnalysisClient(project_ids=[
    "production-project",
    "staging-project", 
    "development-project"
])

results = client.query_comprehensive_analysis()

Custom Filtering

# Filter by application
app_resources = [r for r in storage_results if r.application == "critical-app"]

# Filter by risk level
critical_issues = [r for r in storage_results 
                  if r.compliance_risk.startswith("High")]

# Filter by location
us_resources = [r for r in storage_results 
               if r.location.startswith("us-")]
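
Filters like these compose naturally with grouping for per-application reporting. A self-contained sketch using a hypothetical stand-in dataclass in place of real query results (the actual result objects carry many more fields):

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical stand-in for the result objects returned by the client,
# used only to illustrate grouping.
@dataclass
class Resource:
    application: str
    compliance_risk: str

results = [
    Resource("critical-app", "High - public bucket"),
    Resource("critical-app", "Low"),
    Resource("billing", "Medium"),
]

# Group results by their application label.
by_app = defaultdict(list)
for r in results:
    by_app[r.application].append(r)

print({app: len(items) for app, items in by_app.items()})
# → {'critical-app': 2, 'billing': 1}
```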

Rate Limiting Configuration

# Custom rate limiting
client.rate_limiter.max_requests_per_minute = 50

# Manual rate limit check
if client.rate_limiter.can_make_request():
    results = client.query_storage_analysis()
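
The rate_limiter attribute is not documented beyond the two members shown above. A minimal sliding-window sketch matching that interface (the package's internal implementation may differ):

```python
import time
from collections import deque

# Minimal sliding-window rate limiter matching the interface used above:
# max_requests_per_minute and can_make_request(). Illustrative only.
class RateLimiter:
    def __init__(self, max_requests_per_minute: int = 60):
        self.max_requests_per_minute = max_requests_per_minute
        self._timestamps: deque = deque()

    def can_make_request(self) -> bool:
        """Allow and record a request if fewer than the cap were made
        in the last 60 seconds."""
        now = time.monotonic()
        # Drop timestamps that have aged out of the one-minute window.
        while self._timestamps and now - self._timestamps[0] >= 60:
            self._timestamps.popleft()
        if len(self._timestamps) < self.max_requests_per_minute:
            self._timestamps.append(now)
            return True
        return False

limiter = RateLimiter(max_requests_per_minute=2)
print(limiter.can_make_request(), limiter.can_make_request(), limiter.can_make_request())
# → True True False
```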

📊 Sample Output

Storage Analysis Results

📦 Storage Resources Found: 45
├── 🪣 Cloud Storage Buckets: 23
├── 💾 Persistent Disks: 12
├── 🗄️ Cloud SQL Instances: 7
└── 📈 BigQuery Datasets: 3

🔍 Security Analysis:
├── ✅ Encrypted Resources: 42/45 (93%)
├── 🔐 CMEK Encrypted: 15/45 (33%)
├── 🌐 Network Secured: 40/45 (89%)
└── ⚠️ High-Risk Issues: 3

💰 Cost Optimization:
├── 💡 High Savings Potential: 5 resources
├── 📊 Unused Resources: 2 disks
└── 🔄 Lifecycle Opportunities: 8 buckets

📈 Compliance Summary:
├── 🟢 Excellent (95-100%): 2 applications
├── 🟡 Good (85-94%): 3 applications
├── 🟠 Needs Improvement (70-84%): 1 application
└── 🔴 Critical Issues (<70%): 0 applications
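
The four status tiers in the summary reduce to a simple threshold lookup. A sketch using the score boundaries shown above (the labels are illustrative; the package may word its statuses differently):

```python
# Map a 0-100 compliance score to the tiers from the sample output.
def compliance_status(score: float) -> str:
    if score >= 95:
        return "Excellent"
    if score >= 85:
        return "Good"
    if score >= 70:
        return "Needs Improvement"
    return "Critical Issues"

print(compliance_status(93.3))  # → Good
```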

🧪 Testing

Run Tests

# Install development dependencies
pip install -e ".[dev]"

# Run all tests
pytest

# Run specific test categories
pytest -m unit              # Unit tests only
pytest -m integration       # Integration tests (requires GCP credentials)
pytest -m gcp              # Tests requiring real GCP resources

# Run with coverage
pytest --cov=gcp_resource_analysis --cov-report=html

Test Configuration

# Set up test environment
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/test-service-account.json
export GCP_TEST_PROJECT_ID=your-test-project

# Run integration tests
pytest -m integration

๐Ÿค Contributing

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Make your changes
  4. Add tests for new functionality
  5. Run the test suite (pytest)
  6. Commit your changes (git commit -m 'Add amazing feature')
  7. Push to the branch (git push origin feature/amazing-feature)
  8. Open a Pull Request

Development Setup

# Clone repository
git clone https://github.com/your-org/gcp-resource-analysis.git
cd gcp-resource-analysis

# Create virtual environment
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\activate

# Install in development mode
pip install -e ".[dev]"

# Install pre-commit hooks
pre-commit install

# Run tests
pytest

📄 License

This project is licensed under the MIT License - see the LICENSE file for details.

🆚 Azure Resource Graph Equivalent

This package provides GCP functionality equivalent to Azure Resource Graph:

Azure Resource Graph     GCP Resource Analysis
Resource Graph Query     Cloud Asset Inventory
Azure Storage Account    Cloud Storage Bucket
Azure Managed Disk       Persistent Disk
Azure SQL Database       Cloud SQL Instance
Azure Cosmos DB          BigQuery/Spanner
Azure Key Vault          Cloud KMS
Azure Resource Groups    GCP Projects
KQL Queries              Python-based Analysis

🔗 Related Projects

📞 Support


Made with โค๏ธ for cloud security and governance

Download files

Download the file for your platform.

Source Distribution

gcp_resource_analysis-1.0.9.tar.gz (128.6 kB)

Uploaded Source

Built Distribution


gcp_resource_analysis-1.0.9-py3-none-any.whl (91.9 kB)

Uploaded Python 3

File details

Details for the file gcp_resource_analysis-1.0.9.tar.gz.

File metadata

  • Download URL: gcp_resource_analysis-1.0.9.tar.gz
  • Size: 128.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.1.0 CPython/3.12.0

File hashes

Hashes for gcp_resource_analysis-1.0.9.tar.gz
Algorithm Hash digest
SHA256 f01f9efb17677630dd28705221d9448e7ad574d4452633610cc08a83a6a2bc72
MD5 417f27e25c3c98770941851435771ed1
BLAKE2b-256 66843d2fcd2ca4b17c93f16cb7a8514425971b782363010972342e0171a168da


File details

Details for the file gcp_resource_analysis-1.0.9-py3-none-any.whl.

File hashes

Hashes for gcp_resource_analysis-1.0.9-py3-none-any.whl
Algorithm Hash digest
SHA256 7dc2b7eedd44158c8d806a1a77df017682edd27f3fd0c44f70ab1d6ad3ace197
MD5 a26a7a63e5f417e5d3e313ad7be391dc
BLAKE2b-256 f5222f1cb2bd1ec3e7f9d7fe5199f6999d228ab97d6c3afd82511f318a82abb2

