Spartan CLI for Spartan Serverless Framework
Spartan CLI is the official command-line interface for the Spartan Serverless Framework, the Swiss Army knife for serverless development. The CLI provides powerful commands and utilities to scaffold, manage, and operate serverless applications built with the Spartan Framework on AWS, GCP, and (soon) Azure.
What is Spartan CLI?
Spartan CLI is the companion tool for the Spartan Serverless Framework, a Python-based scaffold and library for building scalable, consistent serverless applications (APIs, workflows, ETL, microservices, and more). The CLI lets you generate, configure, deploy, and manage these applications from the command line, automating common tasks and ensuring best practices.
Use Spartan CLI to:
- Generate handler and workflow code compatible with the Spartan Framework
- Manage cloud resources and deployments
- Validate and inspect your project configuration
- Run, test, and debug serverless workflows and functions
- Integrate with CI/CD and automate your serverless development workflow
Note: Spartan CLI is tightly integrated with the Spartan Framework and is the recommended way to interact with Spartan-based projects.
Why Spartan?
Spartan (Framework + CLI) is often called "the Swiss Army knife for serverless development." It simplifies the creation of serverless applications on popular cloud providers by generating Python code, scaffolding best practices, and providing a unified developer experience. Spartan streamlines your development process, saving you time and ensuring code consistency in your serverless projects.
🚀 Quick Start
Installation
# Clone the repository
git clone <repository-url>
cd spartan-cli
# Install dependencies
make dev-install
# Run the CLI
poetry run spartan --help
Basic Usage
# Show available commands
spartan --help
# Workflow operations (AWS Step Functions / GCP Workflows)
spartan workflow --help
spartan workflow list
spartan workflow view my-workflow
spartan workflow run my-workflow
# S3 operations
spartan s3 --help
# Job operations
spartan job --help
# Parquet operations
spartan parquet --help
📝 Configuration
Spartan CLI supports project-level configuration through a .spartan configuration file. This allows you to specify cloud provider preferences and other settings on a per-project basis.
Configuration File Format
The .spartan file uses INI format with a [default] section:
[default]
provider = aws
Supported Providers:
- aws - Amazon Web Services
- gcp - Google Cloud Platform (default)
Configuration File Discovery
Spartan automatically discovers your configuration file by:
- Starting from your current working directory
- Traversing upward through parent directories
- Using the first .spartan file found
- Stopping at your home directory or filesystem root
This means you can run Spartan commands from any subdirectory in your project, and it will automatically find and use your project's configuration.
Example Directory Structure:
my-project/
├── .spartan          # Project configuration
├── src/
│   └── handlers/     # Run commands from here
└── tests/            # Or from here
Default Behavior
If no .spartan file is found, Spartan uses these defaults:
- Provider: gcp (Google Cloud Platform)
- No warnings or errors are displayed
This allows you to use Spartan immediately without any configuration setup.
Creating a Configuration File
Create a .spartan file in your project root:
# Create configuration file
cat > .spartan << EOF
[default]
provider = aws
EOF
Or manually create the file with your preferred editor:
[default]
provider = aws
Configuration Commands
Spartan provides commands to manage and validate your configuration:
# Validate your configuration file
spartan config --validate
# Show current configuration
spartan config --show
Validation Output:
$ spartan config --validate
✅ Configuration is valid
Provider: aws
Config file: /path/to/project/.spartan
Show Configuration Output:
$ spartan config --show
Current Configuration:
Provider: aws
Config file: /path/to/project/.spartan
Configuration Errors
Spartan provides clear error messages when configuration issues occur:
Invalid Provider:
$ spartan config --validate
❌ Configuration Error: Invalid provider 'azure' in configuration.
Valid options are: 'aws', 'gcp'.
Invalid INI Syntax:
$ spartan config --validate
❌ Configuration Error: Invalid configuration file syntax at line 2: ...
Please check INI format.
Permission Issues:
$ spartan config --validate
❌ Configuration Error: Cannot read configuration file '.spartan': Permission denied.
Please check file permissions.
Example Configurations
AWS Configuration:
[default]
provider = aws
GCP Configuration:
[default]
provider = gcp
Empty File (uses defaults):
# Empty file or no provider key defaults to GCP
[default]
🔄 Workflow Commands
Spartan CLI provides comprehensive workflow management commands for orchestrating serverless workflows on both AWS Step Functions and GCP Workflows. The commands automatically adapt based on your configured cloud provider.
Overview
The workflow commands allow you to:
- List and discover available workflows
- View workflow definitions and metadata
- Execute workflows with custom input
- Monitor execution history
- View detailed execution logs
Provider Support
- AWS: Manages AWS Step Functions state machines
- GCP: Manages Google Cloud Platform Workflows
The CLI automatically uses the provider configured in your .spartan file, or you can override it using the --provider flag.
Available Commands
List Workflows
List all available workflows in your cloud account:
# List all workflows (uses configured provider)
spartan workflow list
# List workflows with specific provider
spartan workflow list --provider aws
spartan workflow list --provider gcp
# Filter by status
spartan workflow list --status ACTIVE
# Different output formats
spartan workflow list --output json
spartan workflow list --output yaml
spartan workflow list --output table # default
Output Example:
┌──────────────────────┬────────┬─────────────────────┐
│ Name                 │ Status │ Created             │
├──────────────────────┼────────┼─────────────────────┤
│ my-workflow          │ ACTIVE │ 2024-01-15 10:30:00 │
│ data-processor       │ ACTIVE │ 2024-01-10 14:20:00 │
└──────────────────────┴────────┴─────────────────────┘
View Workflow Definition
View the complete definition and metadata for a specific workflow:
# View workflow definition
spartan workflow view my-workflow
# View with specific provider
spartan workflow view my-workflow --provider aws
# Output in different formats
spartan workflow view my-workflow --output json
spartan workflow view my-workflow --output yaml
AWS Output Example:
{
"name": "my-workflow",
"arn": "arn:aws:states:us-east-1:123456789012:stateMachine:my-workflow",
"status": "ACTIVE",
"type": "STANDARD",
"definition": {
"Comment": "My workflow definition",
"StartAt": "FirstState",
"States": {
"FirstState": {
"Type": "Task",
"Resource": "arn:aws:lambda:...",
"End": true
}
}
}
}
GCP Output Example:
name: my-workflow
createTime: '2024-01-15T10:30:00Z'
revisionId: '000001-abc'
state: ACTIVE
sourceContents: |
- step1:
call: http.get
args:
url: https://api.example.com/data
result: apiResponse
- step2:
return: ${apiResponse.body}
Run Workflow
Execute a workflow with optional input:
# Run workflow without input
spartan workflow run my-workflow
# Run with JSON input
spartan workflow run my-workflow --input '{"key": "value"}'
# Run with input from file
spartan workflow run my-workflow --input-file input.json
# Skip confirmation prompt (for automation)
spartan workflow run my-workflow --yes
# Run with specific provider
spartan workflow run my-workflow --provider gcp
Output Example:
✅ Workflow execution started successfully
Execution ID: abc123-def456-ghi789
Resource: arn:aws:states:us-east-1:123456789012:execution:my-workflow:abc123-def456-ghi789
Status: RUNNING
List Executions
View execution history for a workflow:
# List recent executions (default: 10)
spartan workflow executions my-workflow
# List more executions
spartan workflow executions my-workflow --limit 50
# Filter by status
spartan workflow executions my-workflow --status SUCCEEDED
spartan workflow executions my-workflow --status FAILED
# Different output formats
spartan workflow executions my-workflow --output json
Output Example:
┌───────────────────────┬───────────┬─────────────────────┬─────────────────────┐
│ Execution ID          │ Status    │ Start Time          │ End Time            │
├───────────────────────┼───────────┼─────────────────────┼─────────────────────┤
│ abc123-def456         │ SUCCEEDED │ 2024-01-15 10:30:00 │ 2024-01-15 10:31:00 │
│ xyz789-uvw012         │ FAILED    │ 2024-01-15 09:15:00 │ 2024-01-15 09:16:00 │
│ mno345-pqr678         │ RUNNING   │ 2024-01-15 10:35:00 │ -                   │
└───────────────────────┴───────────┴─────────────────────┴─────────────────────┘
View Execution Logs
View detailed logs and state transitions for a specific execution:
# View execution logs
spartan workflow execution logs abc123-def456
# View with specific provider
spartan workflow execution logs abc123-def456 --provider aws
# Stream logs in real-time (for running executions)
spartan workflow execution logs abc123-def456 --follow
# Output in different formats
spartan workflow execution logs abc123-def456 --output json
Output Example:
Execution: abc123-def456
Status: SUCCEEDED
Duration: 1m 23s
State Transitions:
┌─────────────────────┬────────────────┬─────────────────────┐
│ State               │ Event Type     │ Timestamp           │
├─────────────────────┼────────────────┼─────────────────────┤
│ ExecutionStarted    │ START          │ 2024-01-15 10:30:00 │
│ FirstState          │ TASK_STARTED   │ 2024-01-15 10:30:01 │
│ FirstState          │ TASK_SUCCEEDED │ 2024-01-15 10:30:45 │
│ SecondState         │ TASK_STARTED   │ 2024-01-15 10:30:46 │
│ SecondState         │ TASK_SUCCEEDED │ 2024-01-15 10:31:20 │
│ ExecutionSucceeded  │ END            │ 2024-01-15 10:31:23 │
└─────────────────────┴────────────────┴─────────────────────┘
Authentication Setup
AWS Authentication
Configure AWS credentials using one of these methods:
Option 1: AWS CLI Configuration
# Configure AWS credentials
aws configure
# Or set environment variables
export AWS_ACCESS_KEY_ID=your_access_key
export AWS_SECRET_ACCESS_KEY=your_secret_key
export AWS_DEFAULT_REGION=us-east-1
Option 2: AWS Profile
# Use a specific AWS profile
spartan workflow list --profile my-profile
Required AWS Permissions:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"states:ListStateMachines",
"states:DescribeStateMachine",
"states:StartExecution",
"states:ListExecutions",
"states:DescribeExecution",
"states:GetExecutionHistory"
],
"Resource": "*"
}
]
}
GCP Authentication
Configure GCP credentials using one of these methods:
Option 1: Application Default Credentials
# Login with your user account
gcloud auth application-default login
# Or set service account credentials
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json
Option 2: Service Account Key
# Download service account key from GCP Console
# Set environment variable
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json
# Set project ID
export GOOGLE_CLOUD_PROJECT=your-project-id
Required GCP Permissions:
- workflows.workflows.list
- workflows.workflows.get
- workflows.executions.create
- workflows.executions.list
- workflows.executions.get
GCP IAM Role:
# Grant Workflows Admin role to service account
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
--role="roles/workflows.admin"
Configuration Examples
AWS Configuration
# .spartan file for AWS
[default]
provider = aws
# Set AWS region (optional)
export AWS_DEFAULT_REGION=us-east-1
# Use specific AWS profile (optional)
export AWS_PROFILE=my-profile
GCP Configuration
# .spartan file for GCP
[default]
provider = gcp
# Set GCP project (required)
export GOOGLE_CLOUD_PROJECT=my-project-id
# Set GCP location (optional, defaults to us-central1)
export GOOGLE_CLOUD_LOCATION=us-central1
# Set service account credentials (if not using ADC)
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json
Troubleshooting
Common Issues
1. Credentials Not Configured
AWS Error:
Error: AWS credentials not configured
Suggestion: Run 'aws configure' to set up your AWS credentials.
Solution:
aws configure
# Or set environment variables
export AWS_ACCESS_KEY_ID=your_key
export AWS_SECRET_ACCESS_KEY=your_secret
GCP Error:
Error: GCP credentials not configured
Suggestion: Run 'gcloud auth application-default login' or set GOOGLE_APPLICATION_CREDENTIALS.
Solution:
gcloud auth application-default login
# Or set service account key
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json
2. Permission Denied
Error:
Error: Permission denied: User lacks required permissions
Solution:
- AWS: Ensure your IAM user/role has the required Step Functions permissions
- GCP: Ensure your service account has the Workflows Admin role or equivalent permissions
3. Workflow Not Found
Error:
Error: Workflow 'my-workflow' not found
Solution:
- Verify the workflow name is correct
- Check you're using the correct provider (--provider aws or --provider gcp)
- Verify you're in the correct region/project
- List all workflows to see available names:
spartan workflow list
4. Invalid JSON Input
Error:
Error: Invalid JSON input: Expecting property name enclosed in double quotes
Solution:
- Ensure JSON is properly formatted with double quotes
- Use a JSON validator to check your input
- Consider using --input-file for complex JSON
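A quick way to avoid this error in scripts is to validate the JSON before handing it to the CLI. The helper below is illustrative, not part of Spartan:

```python
import json

def check_json_input(raw):
    """Return (ok, error) for a candidate --input string."""
    try:
        json.loads(raw)
        return True, None
    except json.JSONDecodeError as exc:
        return False, f"Invalid JSON input: {exc.msg}"

# Single-quoted keys are not valid JSON and trigger the error above.
ok, err = check_json_input("{'key': 'value'}")
```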
5. Rate Limiting
Error:
Error: Rate limit exceeded. Please retry after a delay.
Solution:
- Wait a few seconds and retry
- Reduce the frequency of API calls
- Consider implementing exponential backoff in automation scripts
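For automation scripts, the suggested exponential backoff might look like the following. This is a generic sketch: `retry_with_backoff` and the use of `RuntimeError` as the retryable exception are illustrative, not Spartan APIs:

```python
import random
import time

def retry_with_backoff(call, retries=5, base_delay=1.0, max_delay=30.0):
    """Retry `call` with exponentially growing, jittered delays."""
    for attempt in range(retries):
        try:
            return call()
        except RuntimeError:  # substitute your rate-limit error here
            if attempt == retries - 1:
                raise  # out of retries: propagate the failure
            delay = min(max_delay, base_delay * 2 ** attempt)
            time.sleep(delay * random.uniform(0.5, 1.0))  # add jitter
```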
Debug Mode
Enable verbose output for troubleshooting:
# Set log level to debug
export SPARTAN_LOG_LEVEL=DEBUG
# Run command with verbose output
spartan workflow list -v
Getting Help
# Show help for workflow commands
spartan workflow --help
# Show help for specific command
spartan workflow list --help
spartan workflow run --help
Advanced Usage
Automation Scripts
#!/bin/bash
# Example: Run workflow and wait for completion
# Run workflow
EXECUTION_ID=$(spartan workflow run my-workflow \
--input '{"data": "value"}' \
--yes \
--output json | jq -r '.execution_id')
echo "Started execution: $EXECUTION_ID"
# Poll for completion
while true; do
STATUS=$(spartan workflow execution logs $EXECUTION_ID \
--output json | jq -r '.status')
if [[ "$STATUS" == "SUCCEEDED" ]]; then
echo "Execution completed successfully"
break
elif [[ "$STATUS" == "FAILED" ]]; then
echo "Execution failed"
exit 1
fi
sleep 5
done
Multi-Provider Workflows
# Run same workflow on both providers
spartan workflow run my-workflow --provider aws --input '{"env": "aws"}'
spartan workflow run my-workflow --provider gcp --input '{"env": "gcp"}'
# Compare execution times
spartan workflow executions my-workflow --provider aws --limit 1
spartan workflow executions my-workflow --provider gcp --limit 1
CI/CD Integration
# GitHub Actions example
name: Deploy Workflow
on:
push:
branches: [main]
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Setup Python
uses: actions/setup-python@v2
with:
python-version: '3.11'
- name: Install Spartan
run: pip install python-spartan
- name: Configure AWS
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
run: |
spartan workflow run deployment-workflow \
--input-file deploy-config.json \
--yes
🔧 Handler Commands
Spartan CLI provides comprehensive serverless function management commands for both AWS Lambda and GCP Cloud Functions. The commands automatically adapt based on your configured cloud provider, allowing you to create, list, describe, and download serverless functions with a unified interface.
Overview
The handler commands allow you to:
- Create handler files from templates with appropriate boilerplate
- List deployed serverless functions with filtering and sorting
- View detailed function configuration and metadata
- Download function source code for local development
- Delete local handler files
Provider Support
- AWS: Manages AWS Lambda functions
- GCP: Manages Google Cloud Platform Cloud Functions (1st and 2nd generation)
The CLI automatically uses the provider configured in your .spartan file.
Available Commands
Create Handler File
Create a new handler file from a template with appropriate boilerplate code:
# Create a basic handler (uses configured provider)
spartan handler create my-handler
# Create handler with HTTP trigger (GCP)
spartan handler create api-handler --subscribe http
# Create handler with Pub/Sub trigger (GCP)
spartan handler create event-handler --subscribe pubsub
# Create handler with Cloud Storage trigger (GCP)
spartan handler create storage-handler --subscribe storage
# Create handler with SQS trigger (AWS)
spartan handler create queue-handler --subscribe sqs
# Create handler with SNS trigger (AWS)
spartan handler create notification-handler --subscribe sns
Output Example:
✅ Handler file created successfully
File: handlers/my_handler.py
Template: GCP HTTP trigger
Available Triggers:
GCP Triggers:
- http - HTTP/HTTPS requests
- pubsub - Pub/Sub messages
- storage - Cloud Storage events
- firestore - Firestore document changes
- scheduler - Cloud Scheduler jobs
AWS Triggers:
- sqs - SQS queue messages
- sns - SNS topic notifications
- s3 - S3 bucket events
- api - API Gateway requests
- And many more (see AWS Lambda documentation)
List Handlers
List all deployed serverless functions with filtering and sorting:
# List all functions (uses configured provider)
spartan handler list
# List with specific provider
spartan handler list --provider aws
spartan handler list --provider gcp
# Filter by name prefix
spartan handler list --prefix my-
# Filter by regex pattern
spartan handler list --match "^api-.*"
# Filter by runtime
spartan handler list --runtime python311
# Sort by different fields
spartan handler list --sort name --order asc
spartan handler list --sort memory --order desc
spartan handler list --sort modified --order desc
# Different output formats
spartan handler list --output json
spartan handler list --output yaml
spartan handler list --output table # default
spartan handler list --output markdown
spartan handler list --output csv
# Limit results
spartan handler list --limit 10
# Show applied filters
spartan handler list --prefix api- --show-filters
# Save output to file
spartan handler list --output json --save-to functions.json
Output Example (Table Format):
┌───────────────────┬───────────┬─────────┬─────────┬─────────────────────┐
│ Name              │ Runtime   │ Memory  │ Timeout │ Last Modified       │
├───────────────────┼───────────┼─────────┼─────────┼─────────────────────┤
│ api-handler       │ python311 │ 256 MB  │ 60s     │ 2024-01-15 10:30:00 │
│ event-processor   │ python311 │ 512 MB  │ 120s    │ 2024-01-14 15:20:00 │
│ data-transformer  │ nodejs20  │ 1024 MB │ 300s    │ 2024-01-13 09:45:00 │
└───────────────────┴───────────┴─────────┴─────────┴─────────────────────┘
Output Example (JSON Format):
[
{
"name": "api-handler",
"runtime": "python311",
"memory": 256,
"timeout": 60,
"modified": "2024-01-15 10:30:00",
"description": "API request handler",
"handler": "main.handler",
"arn": "projects/my-project/locations/us-central1/functions/api-handler"
}
]
Describe Handler
View detailed information about a specific function:
# Describe a function (uses configured provider)
spartan handler describe my-function
# Describe with specific provider
spartan handler describe my-function --provider aws
spartan handler describe my-function --provider gcp
# Different output formats
spartan handler describe my-function --output json
spartan handler describe my-function --output yaml
spartan handler describe my-function --output table # default
spartan handler describe my-function --output text
spartan handler describe my-function --output markdown
# GCP-specific: specify project and location
spartan handler describe my-function --project-id my-project --location us-central1
# AWS-specific: specify region and profile
spartan handler describe my-function --region us-east-1 --profile production
Output Example (AWS Lambda):
Function Details: my-function
─────────────────────────────────────────────────────────────────────────────
Basic Information:
Name: my-function
ARN: arn:aws:lambda:us-east-1:123456789012:function:my-function
Runtime: python3.11
Handler: index.handler
Description: My Lambda function
Configuration:
Memory: 256 MB
Timeout: 60 seconds
Role: arn:aws:iam::123456789012:role/lambda-role
Environment Variables:
DATABASE_URL: ********
API_KEY: ********
LOG_LEVEL: INFO
Trigger Configuration:
Type: API Gateway
Method: POST
Path: /api/handler
Output Example (GCP Cloud Function):
Function Details: my-function
─────────────────────────────────────────────────────────────────────────────
Basic Information:
Name: my-function
Resource: projects/my-project/locations/us-central1/functions/my-function
Runtime: python311
Entry Point: handler
Generation: 2nd gen
State: ACTIVE
Configuration:
Memory: 256 MB
Timeout: 60 seconds
Service Acct: my-function@my-project.iam.gserviceaccount.com
Environment Variables:
DATABASE_URL: ********
API_KEY: ********
LOG_LEVEL: INFO
Trigger Configuration:
Type: HTTP
URL: https://us-central1-my-project.cloudfunctions.net/my-function
Ingress: ALLOW_ALL
VPC Configuration:
VPC Connector: projects/my-project/locations/us-central1/connectors/my-connector
Download Function
Download function source code for local development:
# Download function code (uses configured provider)
spartan handler download --name my-function
# Download with specific provider
spartan handler download --name my-function --provider aws
spartan handler download --name my-function --provider gcp
# Specify output path
spartan handler download --name my-function --output ./downloads/my-function.zip
# Extract ZIP file after download
spartan handler download --name my-function --extract
# Verify download integrity
spartan handler download --name my-function --check-integrity
# Save function configuration
spartan handler download --name my-function --include-config
# All options combined
spartan handler download \
--name my-function \
--output ./downloads/my-function.zip \
--extract \
--check-integrity \
--include-config
# GCP-specific: specify project and location
spartan handler download \
--name my-function \
--project-id my-project \
--location us-central1 \
--extract
# AWS-specific: specify version and region
spartan handler download \
--name my-function \
--version '$LATEST' \
--region us-east-1 \
--extract
Output Example:
Downloading function: my-function
─────────────────────────────────────────────────────────────────────────────
✅ Function code downloaded
File: my-function.zip
Size: 2.4 MB
✅ Integrity check passed
SHA256: a1b2c3d4e5f6...
✅ Code extracted
Directory: my-function/
Files: 15
✅ Configuration saved
File: my-function-config.json
Delete Handler File
Delete a local handler file:
# Delete handler file
spartan handler delete my-handler
# The command is provider-agnostic (deletes local file only)
Output Example:
✅ Handler file deleted successfully
File: handlers/my_handler.py
Authentication Setup
AWS Authentication
Configure AWS credentials using one of these methods:
Option 1: AWS CLI Configuration
# Configure AWS credentials
aws configure
# Or set environment variables
export AWS_ACCESS_KEY_ID=your_access_key
export AWS_SECRET_ACCESS_KEY=your_secret_key
export AWS_DEFAULT_REGION=us-east-1
Option 2: AWS Profile
# Use a specific AWS profile
spartan handler list --profile my-profile
Required AWS Permissions:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"lambda:ListFunctions",
"lambda:GetFunction",
"lambda:GetFunctionConfiguration",
"lambda:InvokeFunction"
],
"Resource": "*"
}
]
}
GCP Authentication
Configure GCP credentials using one of these methods:
Option 1: Application Default Credentials
# Login with your user account
gcloud auth application-default login
# Or set service account credentials
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json
Option 2: Service Account Key
# Download service account key from GCP Console
# Set environment variable
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json
# Set project ID
export GOOGLE_CLOUD_PROJECT=your-project-id
# Set location (optional, defaults to us-central1)
export GOOGLE_CLOUD_REGION=us-central1
Required GCP Permissions:
- cloudfunctions.functions.list
- cloudfunctions.functions.get
- cloudfunctions.functions.sourceCodeGet
- cloudfunctions.functions.call
GCP IAM Role:
# Grant Cloud Functions Developer role to service account
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
--role="roles/cloudfunctions.developer"
Configuration Examples
AWS Configuration
# .spartan file for AWS
[default]
provider = aws
# Set AWS region (optional)
export AWS_DEFAULT_REGION=us-east-1
# Use specific AWS profile (optional)
export AWS_PROFILE=my-profile
GCP Configuration
# .spartan file for GCP
[default]
provider = gcp
# Set GCP project (required)
export GOOGLE_CLOUD_PROJECT=my-project-id
# Set GCP location (optional, defaults to us-central1)
export GOOGLE_CLOUD_REGION=us-central1
# Set service account credentials (if not using ADC)
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json
Handler File Templates
Spartan provides pre-built templates for different trigger types:
GCP Templates
HTTP Trigger (--subscribe http):
import functions_framework
from flask import Request
@functions_framework.http
def handler(request: Request):
"""HTTP Cloud Function entry point."""
request_json = request.get_json(silent=True)
# Your handler logic here
return {'status': 'success', 'message': 'Hello from GCP'}
Pub/Sub Trigger (--subscribe pubsub):
import base64
import functions_framework
@functions_framework.cloud_event
def handler(cloud_event):
"""Pub/Sub Cloud Function entry point."""
message_data = base64.b64decode(
cloud_event.data["message"]["data"]
).decode()
# Your handler logic here
print(f"Received message: {message_data}")
Cloud Storage Trigger (--subscribe storage):
import functions_framework
@functions_framework.cloud_event
def handler(cloud_event):
"""Cloud Storage Cloud Function entry point."""
data = cloud_event.data
bucket = data["bucket"]
name = data["name"]
# Your handler logic here
print(f"File {name} in bucket {bucket}")
AWS Templates
SQS Trigger (--subscribe sqs):
def handler(event, context):
"""AWS Lambda SQS trigger handler."""
for record in event['Records']:
message_body = record['body']
# Your handler logic here
print(f"Processing message: {message_body}")
return {'statusCode': 200, 'body': 'Success'}
API Gateway Trigger (--subscribe api):
import json
def handler(event, context):
"""AWS Lambda API Gateway handler."""
body = json.loads(event.get('body', '{}'))
# Your handler logic here
return {
'statusCode': 200,
'body': json.dumps({'message': 'Hello from Lambda'})
}
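The sns trigger listed earlier is not shown among the templates; a handler in the same style might look like the sketch below. This is a hedged illustration based on the standard SNS event shape, not necessarily the exact boilerplate Spartan generates for `--subscribe sns`:

```python
def handler(event, context):
    """AWS Lambda SNS trigger handler (illustrative template)."""
    for record in event['Records']:
        # SNS delivers the published message as a string payload.
        message = record['Sns']['Message']
        # Your handler logic here
        print(f"Received notification: {message}")
    return {'statusCode': 200, 'body': 'Success'}
```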
Troubleshooting
Common Issues
1. Credentials Not Configured
AWS Error:
Error: AWS credentials not configured
Suggestion: Run 'aws configure' to set up your AWS credentials.
Solution:
aws configure
# Or set environment variables
export AWS_ACCESS_KEY_ID=your_key
export AWS_SECRET_ACCESS_KEY=your_secret
GCP Error:
Error: GCP credentials not configured
Suggestion: Run 'gcloud auth application-default login' or set GOOGLE_APPLICATION_CREDENTIALS.
Solution:
gcloud auth application-default login
# Or set service account key
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json
2. Project ID Not Set (GCP)
Error:
Error: GCP project ID not provided
Suggestion: Set GOOGLE_CLOUD_PROJECT environment variable or pass --project-id parameter.
Solution:
export GOOGLE_CLOUD_PROJECT=my-project-id
# Or pass as parameter
spartan handler list --project-id my-project-id
3. Permission Denied
Error:
Error: Permission denied: User lacks required permissions
Solution:
- AWS: Ensure your IAM user/role has the required Lambda permissions
- GCP: Ensure your service account has the Cloud Functions Developer role or equivalent permissions
4. Function Not Found
Error:
Error: Function 'my-function' not found
Solution:
- Verify the function name is correct
- Check you're using the correct provider (--provider aws or --provider gcp)
- Verify you're in the correct region/location
- List all functions to see available names:
spartan handler list
5. Invalid Trigger Type
Error:
Error: Invalid trigger type 'invalid-trigger'
Solution:
- Check the list of valid triggers for your provider
- GCP: http, pubsub, storage, firestore, scheduler
- AWS: sqs, sns, s3, api, and many more
Debug Mode
Enable verbose output for troubleshooting:
# Set log level to debug
export SPARTAN_LOG_LEVEL=DEBUG
# Run command with verbose output
spartan handler list -v
Getting Help
# Show help for handler commands
spartan handler --help
# Show help for specific command
spartan handler create --help
spartan handler list --help
spartan handler describe --help
spartan handler download --help
Advanced Usage
Automation Scripts
#!/bin/bash
# Example: Download all functions for backup
# Get list of functions
FUNCTIONS=$(spartan handler list --output json | jq -r '.[].name')
# Download each function
for func in $FUNCTIONS; do
echo "Downloading $func..."
spartan handler download \
--name "$func" \
--output "backups/${func}.zip" \
--include-config \
--check-integrity
done
echo "Backup complete!"
Multi-Provider Development
# Create handlers for both providers
cat > .spartan << EOF
[default]
provider = aws
EOF
spartan handler create aws-handler --subscribe sqs
cat > .spartan << EOF
[default]
provider = gcp
EOF
spartan handler create gcp-handler --subscribe pubsub
# List functions from both providers
spartan handler list --provider aws
spartan handler list --provider gcp
CI/CD Integration
# GitHub Actions example
name: Deploy Handler
on:
push:
branches: [main]
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Setup Python
uses: actions/setup-python@v2
with:
python-version: '3.11'
- name: Install Spartan
run: pip install python-spartan
- name: Configure GCP
env:
GOOGLE_APPLICATION_CREDENTIALS: ${{ secrets.GCP_SA_KEY }}
GOOGLE_CLOUD_PROJECT: ${{ secrets.GCP_PROJECT_ID }}
run: |
# List deployed functions
spartan handler list
# Download function for testing
spartan handler download --name my-function --extract
Migration Guide
Migrating from AWS Lambda to GCP Cloud Functions
1. Update Configuration:
   # Change provider in .spartan
   cat > .spartan << EOF
   [default]
   provider = gcp
   EOF
2. Create GCP Handler:
   # Create new handler with GCP template
   spartan handler create my-function --subscribe http
3. Adapt Handler Code:
   - Change from Lambda signature to Cloud Functions signature
   - Update event handling for GCP trigger types
   - Adjust environment variable access
   - Update dependencies in requirements.txt
4. Test Locally:
   # Use Functions Framework for local testing
   pip install functions-framework
   functions-framework --target=handler --debug
5. Deploy:
   # Deploy using gcloud CLI
   gcloud functions deploy my-function \
     --runtime python311 \
     --trigger-http \
     --entry-point handler \
     --source .
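The signature change involved in adapting handler code can be sketched side by side. Function bodies are illustrative; the key difference is that Lambda receives an explicit (event, context) pair while an HTTP Cloud Function receives a single Flask request object:

```python
# AWS Lambda: (event, context) in, API Gateway-style dict out.
def lambda_handler(event, context):
    params = event.get("queryStringParameters") or {}
    return {"statusCode": 200, "body": f"Hello, {params.get('name', 'world')}"}

# GCP Cloud Functions (HTTP): a Flask request object in,
# a (body, status) tuple out.
def gcp_handler(request):
    name = request.args.get("name", "world")
    return f"Hello, {name}", 200
```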
Migrating from GCP Cloud Functions to AWS Lambda
1. Update Configuration:
   # Change provider in .spartan
   cat > .spartan << EOF
   [default]
   provider = aws
   EOF
2. Create AWS Handler:
   # Create new handler with AWS template
   spartan handler create my-function --subscribe api
3. Adapt Handler Code:
   - Change from Cloud Functions signature to Lambda signature
   - Update event handling for AWS trigger types
   - Adjust environment variable access
   - Update dependencies in requirements.txt
4. Test Locally:
   # Use SAM CLI for local testing
   sam local invoke MyFunction
5. Deploy:
   # Deploy using AWS CLI or SAM
   sam deploy --guided
📋 Table of Contents
- Configuration
- Workflow Commands
- Handler Commands
- Development Setup
- Available Commands
- Code Quality Tools
- Testing
- Multi-Environment Testing with Tox
- Documentation
- Configuration Details
- Development Workflow
- CI/CD Integration
- Contributing
🛠️ Development Setup
Prerequisites
- Python 3.11+
- Poetry (for dependency management)
- Git (for version control)
Initial Setup
# 1. Install dependencies
make dev-install
# 2. Setup pre-commit hooks (recommended)
make setup-hooks
# 3. Verify installation
make demo
Environment Information
# Show environment details
make env-info
# Show dependency tree
make deps-tree
# Generate requirements.txt
make requirements
🎯 Available Commands
Development Workflow
| Command | Description |
|---|---|
| `make dev-install` | Install package in development mode |
| `make run` | Run the CLI application |
| `make demo` | Run demo commands to showcase functionality |
Code Quality & Formatting
| Command | Description |
|---|---|
| `make format` | Format code with black and isort |
| `make lint` | Run linting with flake8 |
| `make check-format` | Check formatting without changes |
| `make quality` | Run all code quality checks |
Testing
| Command | Description |
|---|---|
| `make test` | Run tests with pytest |
| `make test-cov` | Run tests with coverage |
| `make test-fast` | Run tests without coverage (faster) |
| `make test-watch` | Run tests in watch mode |
| `make test-specific FILE=test_file.py` | Run a specific test file |
Tox Multi-Environment Testing
| Command | Description |
|---|---|
| `make tox` | Run all tox environments |
| `make tox-format` | Format code via tox |
| `make tox-lint` | Run linting via tox |
| `make tox-security` | Run security checks via tox |
| `make tox-type-check` | Run type checking via tox |
| `make tox-docs` | Build documentation via tox |
| `make tox-coverage` | Run coverage analysis via tox |
| `make tox-clean` | Clean tox environments |
| `make tox-list` | List all tox environments |
Security & Audit
| Command | Description |
|---|---|
| `make security` | Run security checks |
| `make audit` | Audit dependencies for vulnerabilities |
Build & Install
| Command | Description |
|---|---|
| `make install-local` | Install package locally |
Git & CI/CD
| Command | Description |
|---|---|
| `make pre-commit` | Run pre-commit checks |
| `make pre-commit-run` | Run pre-commit hooks on all files |
| `make setup-hooks` | Setup git pre-commit hooks |
| `make ci` | Run full CI pipeline |
| `make ci-fast` | Run fast CI pipeline |
Cleanup
| Command | Description |
|---|---|
| `make clean` | Clean build artifacts |
| `make clean-all` | Clean everything including virtual environment |
Documentation
| Command | Description |
|---|---|
| `make docs` | Build Sphinx documentation |
| `make docs-serve` | Serve documentation locally |
| `make docs-clean` | Clean generated documentation |
Utility & Info
| Command | Description |
|---|---|
| `make size` | Show project size information |
| `make list-todos` | List TODO items in code |
| `make help` | Show all available commands |
🔧 Code Quality Tools
The project uses comprehensive code quality tools configured in pyproject.toml:
Code Formatting
- Black: Code formatting (88 character line length, Python 3.11 target)
- isort: Import sorting (black-compatible profile)
Linting & Type Checking
- Flake8: Style guide enforcement with plugins:
  - `flake8-docstrings`: Documentation style
  - `flake8-bugbear`: Bug detection
  - `flake8-comprehensions`: Comprehension improvements
- MyPy: Static type checking (lenient settings for gradual adoption)
Security & Documentation
- Bandit: Security vulnerability scanning (practical exclusions for development)
- pydocstyle: Documentation style (Google convention, lenient for gradual adoption)
Tool Configuration
All tools are configured consistently in pyproject.toml:
[tool.black]
line-length = 88
target-version = ['py311']
[tool.isort]
profile = "black"
known_first_party = ["spartan"]
[tool.mypy]
# Lenient settings for gradual adoption
ignore_missing_imports = true
disallow_untyped_defs = false
[tool.bandit]
# Security checks with practical exclusions
exclude_dirs = ["tests", "docs"]
skips = ["B101", "B110", "B311", "B324", "B404", "B603", "B607"]
[tool.pydocstyle]
convention = "google"
# Missing docstrings allowed for gradual adoption
add-ignore = ["D100", "D101", "D102", "D103", "D104", "D105", "D107"]
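As an illustration of the Google docstring convention that pydocstyle enforces, here is a hypothetical function (not part of Spartan's codebase) with a compliant docstring:

```python
def scale(values, factor=2):
    """Scale each value by a constant factor.

    Args:
        values: Iterable of numbers to scale.
        factor: Multiplier applied to each value. Defaults to 2.

    Returns:
        A list with each input multiplied by ``factor``.
    """
    return [v * factor for v in values]
```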
🧪 Testing
Test Directory Structure
Tests are organized into two main categories for better maintainability and targeted test execution:
tests/
├── __init__.py
├── conftest.py          # Shared fixtures for all tests
├── unit/                # Unit tests
│   ├── __init__.py
│   └── test_*.py        # Individual component tests
└── integration/         # Integration tests
    ├── __init__.py
    └── test_*.py        # Multi-component and CLI tests
Unit Tests vs Integration Tests
Unit Tests (tests/unit/)
- Test individual components in isolation
- Mock external dependencies (file system, network, databases)
- Execute quickly (milliseconds)
- Focus on specific methods and functions
- Example: Testing a single service class with mocked dependencies
Integration Tests (tests/integration/)
- Test interactions between multiple components
- Test CLI initialization and command execution
- Test end-to-end workflows
- May use real file system operations (in temp directories)
- Example: Testing CLI command execution with real config files
Running Tests by Category
# Run all tests
make test-fast
# Run only unit tests
pytest tests/unit
# Run only integration tests
pytest tests/integration
# Run specific test file
pytest tests/unit/test_config_service.py
# Run with verbose output
pytest tests/unit -v
# Run with coverage
make test-cov
# Run unit tests with coverage
pytest tests/unit --cov=spartan --cov-report=term-missing
Test Configuration
Tests are configured in pyproject.toml with pytest:
[tool.pytest.ini_options]
minversion = "6.0"
addopts = "-ra -q --strict-markers"
testpaths = ["tests"]
markers = [
"slow: marks tests as slow (deselect with '-m \"not slow\"')",
"integration: marks tests as integration tests",
"unit: marks tests as unit tests",
]
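With these markers, tests can be tagged and then selected or deselected at run time (e.g. `pytest -m "not slow"`). A hypothetical example, assuming pytest is installed:

```python
import pytest


@pytest.mark.unit
def test_normalize_name():
    # Fast, isolated check: a good fit for the "unit" marker.
    assert "my-function".replace("-", "_") == "my_function"


@pytest.mark.slow
@pytest.mark.integration
def test_full_workflow():
    # A slower end-to-end check; deselect with: pytest -m "not slow"
    pass
```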
Coverage Configuration
Coverage is configured to focus on source code:
[tool.coverage.run]
source = ["spartan"]
omit = [
"*/tests/*",
"*/test_*",
"*/__pycache__/*",
"*/venv/*",
"*/.venv/*",
]
Running Tests
# Quick test run (all tests)
make test-fast
# Full tests with coverage
make test-cov
# Run only unit tests
pytest tests/unit
# Run only integration tests
pytest tests/integration
# Specific test file
make test-specific FILE=test_example.py
# Watch mode for development
make test-watch
# Run with verbose output
pytest -v
# Run unit tests with coverage
pytest tests/unit --cov=spartan --cov-report=html
Writing New Tests
When creating a new test, place it in the appropriate directory:
Place in tests/unit/ if:
- Testing a single component/class in isolation
- Mocking all external dependencies
- Testing utility functions or helper methods
- Testing data models and validation logic
Place in tests/integration/ if:
- Testing CLI command execution
- Testing interactions between multiple services
- Testing file system operations with real files
- Testing configuration loading and initialization
- Testing end-to-end workflows
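The points above can be sketched as a unit test that mocks its external dependency. The `DeployService` class here is a toy example for illustration only; real Spartan services differ:

```python
from unittest.mock import MagicMock


class DeployService:
    """Toy service that uploads an artifact through an injected cloud client."""

    def __init__(self, client):
        self.client = client

    def deploy(self, name):
        self.client.upload(name)
        return f"deployed:{name}"


def test_deploy_calls_client():
    # Mock the external dependency so the test stays fast and isolated.
    fake_client = MagicMock()
    service = DeployService(fake_client)
    assert service.deploy("my-fn") == "deployed:my-fn"
    fake_client.upload.assert_called_once_with("my-fn")
```

Because the client is injected, the unit test never touches a real cloud API; an integration test would exercise the same service against real configuration.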
For more details on testing guidelines, see .kiro/steering/testing-guidelines.md.
🏗️ Multi-Environment Testing with Tox
Tox provides isolated testing environments for comprehensive validation:
Available Tox Environments
# List all environments
make tox-list
Testing Environments:
- `py311`, `py312`, `py313`: Python version testing
- `lint`: Flake8 linting with all plugins
- `format`: Black and isort formatting
- `format-check`: Check formatting without changes
- `security`: Bandit security scanning
- `type-check`: MyPy static type checking
- `coverage`: Test coverage reporting
- `docs`: Sphinx documentation building
- `pre-commit`: Run all pre-commit hooks
- `clean`: Clean build artifacts
Directory Exclusions
All tox environments properly exclude build and virtual environment directories:
- `.venv`, `.tox`, `.git`
- `__pycache__`, `build`, `dist`
- `.eggs`, `*.egg-info`
Tox Usage Examples
# Run all environments
make tox
# Run specific environment
poetry run tox -e lint
# Run linting with proper exclusions
make tox-lint
# Check code formatting
make tox-format-check
# Run security scanning
make tox-security
📚 Documentation
Building Documentation
The project uses Sphinx for documentation generation:
# Build documentation
make docs
# Serve documentation locally (http://localhost:8000)
make docs-serve
# Clean documentation
make docs-clean
# Build via tox (isolated environment)
make tox-docs
Documentation Dependencies
# Documentation tools in pyproject.toml
sphinx = "^8.1.3"
sphinx-rtd-theme = "^3.0.2"
sphinx-autoapi = "^3.3.3"
myst-parser = "^4.0.0"
⚙️ Configuration Details
Project Structure
spartan/
├── spartan/              # Main package
│   ├── __init__.py
│   ├── main.py           # CLI entry point
│   └── services/         # Service modules
├── tests/                # Test suite
├── docs/                 # Sphinx documentation
├── pyproject.toml        # Main configuration
├── tox.ini               # Multi-environment testing
├── Makefile              # Development commands
└── README.md             # This file
Dependencies
Core Dependencies:
- `typer`: CLI framework
- `rich`: Rich terminal output
- `boto3`: AWS SDK
- `pandas`: Data manipulation
- `pyarrow`: Parquet support
Development Dependencies:
- Testing: `pytest`, `pytest-cov`, `pytest-mock`, `faker`
- Code Quality: `black`, `isort`, `flake8`, `mypy`, `bandit`
- Documentation: `sphinx`, `sphinx-rtd-theme`
- Tools: `pre-commit`, `commitizen`, `tox`
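To illustrate the core CLI stack (Typer for commands, Rich for terminal output), here is a minimal sketch; the names are illustrative only and this is not Spartan's actual CLI code:

```python
import typer
from rich.console import Console

app = typer.Typer()
console = Console()


@app.command()
def greet(name: str = "world") -> None:
    """Print a colorful greeting."""
    # Rich markup renders as color in a terminal, plain text when piped.
    console.print(f"[bold green]Hello, {name}![/bold green]")


if __name__ == "__main__":
    app()
```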
Configuration Files Alignment
All configuration files are aligned for consistency:
| Tool | Configuration File | Settings Source |
|---|---|---|
| Black | `pyproject.toml` | `[tool.black]` |
| isort | `pyproject.toml` | `[tool.isort]` |
| Flake8 | `pyproject.toml` | Via tox/make commands |
| MyPy | `pyproject.toml` | `[tool.mypy]` |
| Bandit | `pyproject.toml` | `[tool.bandit]` |
| Pytest | `pyproject.toml` | `[tool.pytest.ini_options]` |
| Coverage | `pyproject.toml` | `[tool.coverage.*]` |
| Commitizen | `pyproject.toml` | `[tool.commitizen]` |
🚀 Development Workflow
Daily Development
1. Initial Setup (first time):
make dev-install
make setup-hooks
2. Code Development:
- Write code in `spartan/`
- Write tests in `tests/`
- Pre-commit hooks run automatically on commit
3. Manual Quality Checks:
# Quick quality check
make quality
# Run tests
make test-fast
# Full quality + tests
make pre-commit
4. Multi-Environment Validation:
# Test across Python versions
make tox
# Specific environment testing
make tox-lint
make tox-security
Command Equivalence
You can achieve the same results through different paths:
# Direct via Poetry
poetry run black spartan tests
poetry run flake8 spartan tests
poetry run mypy spartan
# Via tox (isolated environment)
poetry run tox -e format
poetry run tox -e lint
poetry run tox -e type-check
# Via make (convenient aliases)
make format
make lint
make tox-type-check
🔄 CI/CD Integration
CI Pipeline Options
# Fast CI pipeline (for PRs)
make ci-fast
# Includes: dev-install + quality + test-fast
# Full CI pipeline (for main branch)
make ci
# Includes: dev-install + quality + test-cov
# Comprehensive testing (for releases)
make tox
# Tests across Python 3.11, 3.12, 3.13
Pre-commit Integration
All quality tools are integrated into pre-commit hooks:
# Setup hooks (run once)
make setup-hooks
# Manual run of all hooks
make pre-commit-run
# Hooks run automatically on commit
git commit -m "Your commit message"
Benefits of the Setup
- ✅ Consistency: All tools use the same configuration
- ✅ Flexibility: Run tools via Poetry, tox, or make
- ✅ Isolation: Tox provides clean environments
- ✅ CI Integration: Multiple testing strategies
- ✅ Developer Experience: Simple, memorable commands
- ✅ Gradual Adoption: Lenient settings for incremental improvement
🤝 Contributing
- Fork the repository
- Create a feature branch: `git checkout -b feature-name`
- Set up the development environment: `make dev-install && make setup-hooks`
- Make your changes with tests
- Run quality checks: `make quality`
- Run tests: `make test-cov`
- Optional: run `make tox` for comprehensive testing
- Commit your changes (pre-commit hooks will run automatically)
- Push to your fork: `git push origin feature-name`
- Create a Pull Request
Code Quality Requirements
- All code must pass `make quality` (formatting + linting)
- All tests must pass: `make test-cov`
- Coverage should be maintained or improved
- Follow the existing code style and patterns
- Add tests for new functionality
- Update documentation as needed
Optional but Recommended
- Run `make tox` for multi-environment testing
- Check security with `make tox-security`
- Validate types with `make tox-type-check`
Note: This project follows a comprehensive development workflow with multiple layers of quality assurance. The configuration is designed to be both strict enough to ensure quality and flexible enough to support gradual adoption of best practices.