# pgsql_upserter

PostgreSQL upsert engine using temp tables and automatic conflict resolution.

A production-ready PostgreSQL upsert utility with automatic schema introspection and intelligent conflict resolution. Ideal for serverless ETL pipelines and data-integration workflows.
## 🚀 Key Features

- **Zero Configuration**: Automatic schema detection and column matching
- **Intelligent Conflict Resolution**: Automatically detects primary keys and unique constraints
- **Production Tested**: Handles deduplication, data validation, and error recovery
- **Flexible Input**: Supports both direct data (API responses) and CSV files
## 📦 Installation

```bash
pip install pgsql-upserter
```
## 🎯 Quick Start

### Serverless ETL (Recommended)

Perfect for AWS Lambda, Google Cloud Functions, or any API-driven ETL:

```python
from pgsql_upserter import execute_upsert_workflow, create_connection_from_env

# Your API response data (Facebook Ads, Google Ads, etc.)
api_data = [
    {
        'account_id': '123456789',
        'campaign_id': 'camp_001',
        'impressions': 1000,
        'clicks': 50,
        'spend': 25.50,
        'date_start': '2025-08-31'
    },
    # ... more records
]

# One function call does everything
connection = create_connection_from_env()
result = execute_upsert_workflow(
    connection=connection,
    data=api_data,  # Direct API data
    target_table='ads_metrics'
)

print(f"✅ {result.total_affected} rows processed")
print(f"📈 {result.rows_inserted} inserted, {result.rows_updated} updated")
```
### CSV File Processing

```python
# Automatic CSV processing
result = execute_upsert_workflow(
    connection=connection,
    data='path/to/data.csv',  # File path
    target_table='ads_metrics'
)
```
## 🔧 Environment Setup

Set your PostgreSQL connection via environment variables:

```bash
export PGHOST=your-host
export PGPORT=5432
export PGDATABASE=your-db
export PGUSER=your-user
export PGPASSWORD=your-password
```

Or use a connection string:

```bash
export DATABASE_URL=postgresql://user:pass@host:port/dbname
```
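These are the standard libpq variable names, so any psycopg2-based helper can pick them up directly. As a rough sketch of how an env-based connection helper might assemble its arguments (the function `connection_kwargs_from_env` is illustrative, not the library's actual implementation):

```python
import os

def connection_kwargs_from_env():
    """Build psycopg2-style connection kwargs from standard PG*
    environment variables, preferring DATABASE_URL when it is set.
    (Illustrative sketch; not the library's actual implementation.)"""
    url = os.environ.get('DATABASE_URL')
    if url:
        # psycopg2.connect() accepts a libpq connection URI directly
        return {'dsn': url}
    return {
        'host': os.environ.get('PGHOST', 'localhost'),
        'port': int(os.environ.get('PGPORT', 5432)),
        'dbname': os.environ.get('PGDATABASE'),
        'user': os.environ.get('PGUSER'),
        'password': os.environ.get('PGPASSWORD'),
    }
```

The resulting dict can be splatted into `psycopg2.connect(**kwargs)`; `DATABASE_URL` wins when both forms are present, mirroring common twelve-factor conventions.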
## 🧠 How It Works

1. **Schema Introspection**: Analyzes your table structure automatically
2. **Column Matching**: Maps your data columns to table columns
3. **Conflict Detection**: Finds primary keys and unique constraints
4. **Data Deduplication**: Removes duplicates using the detected conflict resolution strategy
5. **Intelligent Upsert**: Executes PostgreSQL's native `INSERT ... ON CONFLICT`
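The final step produces a standard PostgreSQL upsert statement. A minimal sketch of composing such a statement (the helper `build_upsert_sql` and the column names are illustrative, not the library's API):

```python
def build_upsert_sql(table, columns, conflict_columns):
    """Compose an INSERT ... ON CONFLICT DO UPDATE statement of the
    general shape the workflow executes (illustrative sketch only)."""
    col_list = ', '.join(columns)
    placeholders = ', '.join(['%s'] * len(columns))
    conflict = ', '.join(conflict_columns)
    # Non-conflict columns are overwritten from the incoming row
    updates = ', '.join(
        f'{c} = EXCLUDED.{c}' for c in columns if c not in conflict_columns
    )
    return (
        f'INSERT INTO {table} ({col_list}) VALUES ({placeholders}) '
        f'ON CONFLICT ({conflict}) DO UPDATE SET {updates}'
    )

sql = build_upsert_sql(
    'ads_metrics',
    ['account_id', 'campaign_id', 'impressions', 'clicks'],
    ['account_id', 'campaign_id'],
)
```

`EXCLUDED` is PostgreSQL's alias for the row that would have been inserted, which is what makes "update with the newest values on conflict" a one-statement operation.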
## 🎯 Perfect For

- **API Data Ingestion**: Facebook Ads, Google Ads, LinkedIn Ads APIs
- **Serverless ETL**: AWS Lambda, Google Cloud Functions, Azure Functions
- **Data Warehousing**: Loading data into analytics databases
- **Real-time Sync**: Keeping databases in sync with external sources
- **Batch Processing**: Traditional CSV and file-based workflows
## 📊 Automatic Conflict Resolution

The library automatically chooses the best upsert strategy:

- **Primary Key**: Uses the table's primary key if its columns are present in the data
- **Unique Constraints**: Combines all unique constraints for conflict detection
- **Insert Only**: Falls back to a plain insert if no conflict target is possible
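This priority order can be sketched as a small selection function. A simplified version (the real library combines unique constraints; this sketch just takes the first constraint fully covered by the data, and `choose_conflict_target` is an illustrative name, not the library's API):

```python
def choose_conflict_target(data_columns, primary_key, unique_constraints):
    """Pick conflict-target columns in the priority order described
    above; return None when only a plain INSERT is possible.
    (Simplified illustrative sketch, not the library's selection code.)"""
    cols = set(data_columns)
    # 1. Prefer the primary key when every PK column appears in the data
    if primary_key and set(primary_key) <= cols:
        return list(primary_key)
    # 2. Otherwise use a unique constraint fully covered by the data
    for constraint in unique_constraints:
        if set(constraint) <= cols:
            return list(constraint)
    # 3. No usable conflict target: fall back to insert-only
    return None
```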
## 🔍 Advanced Usage

### Data Processing Before Upsert

```python
from pgsql_upserter import UpsertResult

# Read and process CSV data
csv_data = UpsertResult.read_csv_to_dict_list('data.csv')

# Filter or transform data
filtered_data = [row for row in csv_data if float(row.get('spend', 0)) > 10.0]

# Upsert processed data
result = execute_upsert_workflow(
    connection=connection,
    data=filtered_data,
    target_table='ads_metrics'
)
```
### Custom Connection

```python
import psycopg2
from pgsql_upserter import execute_upsert_workflow

connection = psycopg2.connect(
    host="localhost",
    database="mydb",
    user="user",
    password="password"
)

result = execute_upsert_workflow(
    connection=connection,
    data=your_data,
    target_table='your_table',
    schema_name='public'  # optional, defaults to 'public'
)
```
## 🛡️ Error Handling

The library provides comprehensive error handling and validation:

```python
from pgsql_upserter import execute_upsert_workflow, PgsqlUpserterError

try:
    result = execute_upsert_workflow(connection, data, 'my_table')
    print(f"Success: {result.total_affected} rows processed")
except PgsqlUpserterError as e:
    print(f"Upsert failed: {e}")
```
## 📋 Requirements

- Python 3.11-3.14
- PostgreSQL 12+
- psycopg2-binary
## 🤝 Contributing

Issues and pull requests are welcome! Please see our contributing guidelines.

## 📄 License

MIT License - see LICENSE file for details.