
Postgres Database - Manning Capital

A Python package containing SQLAlchemy ORM models for a PostgreSQL database that powers a personal quantitative trading and investment analysis platform.

Overview

This package provides SQLAlchemy ORM models and database utilities for managing financial data, trading strategies, portfolio analytics, and market research. The database serves as the backbone for a personal "quant hedge fund" project, storing everything from market data to content data.

All models are defined in src/mc_postgres_db/models.py. See the model definitions for detailed field descriptions and relationships.

Installation

From PyPI

pip install mc-postgres-db

From Source

# Clone the repository
git clone <repository-url>
cd mc-postgres-db

# Install using uv (recommended)
uv sync

Testing Dependencies

For testing, you'll also need Docker installed and running:

# Check if Docker is installed and running
docker --version
docker ps

Database Setup

  1. PostgreSQL Setup: Ensure PostgreSQL is installed and running.
  2. Environment Variables: Set your database connection string:
    export SQLALCHEMY_DATABASE_URL="postgresql://username:password@localhost:5432/mc_trading_db"
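
Before handing the connection string to SQLAlchemy, it can be worth a quick sanity check. The helper below is a minimal sketch; the function name and the fallback URL are illustrative and not part of this package:

```python
import os
from urllib.parse import urlsplit

def get_database_url(default: str = "postgresql://localhost:5432/mc_trading_db") -> str:
    """Read the connection string from the environment, with a fallback."""
    url = os.environ.get("SQLALCHEMY_DATABASE_URL", default)
    scheme = urlsplit(url).scheme
    # SQLAlchemy also accepts dialect+driver schemes such as postgresql+psycopg2.
    if not scheme.startswith("postgresql"):
        raise ValueError(f"expected a PostgreSQL URL, got scheme {scheme!r}")
    return url
```

The returned string is what you would pass to sqlalchemy.create_engine(...).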
    

Usage Examples

Basic Queries

from sqlalchemy import create_engine, select
from sqlalchemy.orm import Session
from mc_postgres_db.models import Asset, Provider, ProviderAssetMarket

# Create database connection
url = "postgresql://username:password@localhost:5432/mc_trading_db"
engine = create_engine(url)

# Query assets
with Session(engine) as session:
    stmt = select(Asset).where(Asset.is_active)
    assets = session.scalars(stmt).all()
    for asset in assets:
        print(f"{asset.id}: {asset.name}")

# Query market data
with Session(engine) as session:
    stmt = (
        select(ProviderAssetMarket)
        .where(
            ProviderAssetMarket.from_asset_id == 1,
            ProviderAssetMarket.to_asset_id == 2,
            ProviderAssetMarket.provider_id == 3,
        )
        .order_by(ProviderAssetMarket.timestamp.desc())
        .limit(10)
    )
    market_data = session.scalars(stmt).all()
    for data in market_data:
        print(f"Timestamp: {data.timestamp}, Close: {data.close}, Volume: {data.volume}")
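
Rows fetched this way feed directly into analysis code. As a small, package-independent illustration, period-over-period simple returns can be computed from a chronological list of close prices (note the query above sorts descending, so reverse the rows first):

```python
def simple_returns(closes: list[float]) -> list[float]:
    """Period-over-period simple returns from a chronological price series."""
    if len(closes) < 2:
        return []
    return [(curr - prev) / prev for prev, curr in zip(closes, closes[1:])]

# With the market_data result above (newest first), the input would be built as:
# closes = [row.close for row in reversed(market_data)]
```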

Efficient Relationship Loading

The ORM models define relationships that can be loaded efficiently with SQLAlchemy's joinedload, avoiding N+1 query patterns:

from sqlalchemy.orm import Session, joinedload
from mc_postgres_db.models import PortfolioTransaction, TransactionStatus

with Session(engine) as session:
    transaction = session.query(PortfolioTransaction).options(
        joinedload(PortfolioTransaction.transaction_type),
        joinedload(PortfolioTransaction.portfolio),
        joinedload(PortfolioTransaction.statuses).joinedload(TransactionStatus.transaction_status_type)
    ).filter_by(id=1).first()
    
    print(f"Transaction: {transaction.transaction_type.name}")
    print(f"Portfolio: {transaction.portfolio.name}")
    print("Status History:")
    for status in transaction.statuses:
        print(f"  {status.timestamp}: {status.transaction_status_type.name}")

Creating Records

from sqlalchemy.orm import Session
from mc_postgres_db.models import Portfolio, TransactionType, PortfolioTransaction
from datetime import datetime, timezone

with Session(engine) as session:
    # Create a portfolio
    portfolio = Portfolio(
        name="Main Trading Portfolio",
        description="Primary portfolio for active trading strategies",
        is_active=True
    )
    session.add(portfolio)
    session.flush()  # flush so portfolio.id is populated

    # Create transaction type
    buy_type = TransactionType(
        symbol="BUY",
        name="Buy",
        description="Purchase of an asset",
        is_active=True
    )
    session.add(buy_type)
    session.flush()  # flush so buy_type.id is populated

    # Create a transaction
    transaction = PortfolioTransaction(
        timestamp=datetime.now(timezone.utc),  # timezone-aware timestamp
        transaction_type_id=buy_type.id,
        portfolio_id=portfolio.id,
        from_asset_id=2,  # USD (cash)
        to_asset_id=1,    # Bitcoin
        quantity=0.5,
        price=50000.0
    )
    session.add(transaction)
    session.commit()

Testing Utilities

This package provides a robust testing harness for database-related tests using a temporary PostgreSQL database in Docker.

Using postgres_test_harness

The postgres_test_harness context manager creates a temporary PostgreSQL database and initializes all ORM models. It can integrate with Prefect or be used independently.

Key features:

  • Creates a fresh database for each test (ephemeral storage)
  • Integrates with Prefect (optional) - all get_engine() calls use the test DB
  • Comprehensive safety checks to prevent accidental connection to production
  • Automatic cleanup after tests

Usage with Prefect

import pytest
from mc_postgres_db.testing.utilities import postgres_test_harness

@pytest.fixture(scope="function", autouse=True)
def postgres_harness():
    with postgres_test_harness():
        yield

def test_my_prefect_flow():
    # Any Prefect task that calls get_engine() will use the PostgreSQL test DB
    ...

Usage without Prefect

import pytest
from sqlalchemy import Engine, text
from sqlalchemy.orm import Session
from mc_postgres_db.testing.utilities import postgres_test_harness
from mc_postgres_db.models import AssetType

@pytest.fixture
def db_engine():
    """Fixture that provides a database engine without Prefect."""
    with postgres_test_harness(use_prefect=False) as engine:
        yield engine

def test_create_asset_type(db_engine: Engine):
    """Test creating an asset type."""
    with Session(db_engine) as session:
        asset_type = AssetType(
            name="Test Asset Type",
            description="Test Description"
        )
        session.add(asset_type)
        session.commit()
        
        assert asset_type.id is not None
        assert asset_type.is_active is True

Test Organization

Tests are organized into two directories:

  • tests/with_prefect/: Tests that use Prefect
  • tests/no_prefect/: Tests that don't use Prefect

Development

Setting up Development Environment

# Install development dependencies
uv sync --dev

# Run tests
uv run pytest

# Run linting
uv run ruff check
uv run ruff format

Database Migrations

This project uses Alembic for database migrations.

Creating a new migration:

# Generate new migration from model changes
uv run alembic revision --autogenerate -m "Description of changes"

# Or create an empty migration
uv run alembic revision -m "Description of changes"

Applying migrations:

# Apply all pending migrations
uv run alembic upgrade head

# Apply migrations one at a time
uv run alembic upgrade +1

# Rollback one migration
uv run alembic downgrade -1

# Rollback to a specific revision
uv run alembic downgrade <revision_id>

Best practices:

  • Always review auto-generated migrations before committing
  • Test migrations on a copy of production data when possible
  • Include both upgrade() and downgrade() functions
  • Add descriptive comments to migration files
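
A migration file following these practices looks roughly like the skeleton below; the revision identifiers, table name, and column are placeholders, not taken from this project's migration history:

```python
"""Add is_active flag to asset_type."""
import sqlalchemy as sa
from alembic import op

# Revision identifiers used by Alembic (placeholders).
revision = "abc123"
down_revision = "def456"
branch_labels = None
depends_on = None

def upgrade() -> None:
    # A server default keeps existing rows valid under the NOT NULL constraint.
    op.add_column(
        "asset_type",
        sa.Column("is_active", sa.Boolean(), nullable=False, server_default=sa.true()),
    )

def downgrade() -> None:
    # Mirror upgrade() so the migration can be rolled back cleanly.
    op.drop_column("asset_type", "is_active")
```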

Contributing

This is a personal project, but suggestions and improvements are welcome:

  1. Fork the repository
  2. Create a feature branch
  3. Make your changes with tests
  4. Ensure migrations are properly created and tested
  5. Submit a pull request

License

This project is for personal use and learning purposes.

Disclaimer

This software is for educational and personal use only. It is not intended for production trading or investment advice. Use at your own risk.
