
ISAF Logger

Instruction Stack Audit Framework - Automatic compliance logging for AI systems

Python 3.8+ · MIT License

Add 3 lines of code, get EU AI Act-ready documentation with cryptographic verification.

Quick Start

import isaf
import pandas as pd

# Initialize (one line)
isaf.init()

# Add decorators to your training functions
@isaf.log_data(source="customer_data", version="3.2.1")
def load_training_data():
    return pd.read_csv("data.csv")

@isaf.log_objective(name="binary_crossentropy", constraints=["fairness < 0.05"])
def train_model(data):
    model = create_model()
    model.fit(data)
    return model

# Log inference with human oversight (EU AI Act Article 14)
@isaf.log_inference(threshold=0.5, human_oversight=True, model_version="1.0.0")
def predict(input_data):
    return model.predict(input_data)

# Run training and inference as normal
data = load_training_data()
model = train_model(data)
predictions = predict(test_data)

# Export compliance report (one line)
isaf.export("compliance_report.json")

Installation

pip install isaf-logger

Features

  • 3 Lines of Code: Minimal integration with existing ML pipelines
  • Full Stack Coverage: Logs Layers 6-9 (Framework, Data, Objectives, Deployment)
  • Cryptographic Verification: SHA-256 hash chains prove lineage integrity
  • Compliance Ready: Maps to EU AI Act, NIST AI RMF, ISO 42001, Colorado AI Act
  • Framework Agnostic: Works with PyTorch, TensorFlow, JAX, scikit-learn
  • Flexible Storage: SQLite for local, MLflow for production
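
The hash-chain idea behind the "Cryptographic Verification" feature can be sketched in a few lines; this is an illustrative toy (the record format and chaining scheme here are assumptions, not ISAF's internal format):

```python
import hashlib
import json

def chain_hash(entries):
    """Illustrative SHA-256 hash chain: each entry's hash covers its payload
    plus the previous hash, so tampering with any earlier entry invalidates
    every later link."""
    prev = "0" * 64  # genesis value
    chained = []
    for entry in entries:
        payload = json.dumps(entry, sort_keys=True) + prev
        prev = hashlib.sha256(payload.encode()).hexdigest()
        chained.append({"entry": entry, "hash": prev})
    return chained

def verify_chain(chained):
    """Recompute every link and compare against the stored hashes."""
    prev = "0" * 64
    for link in chained:
        payload = json.dumps(link["entry"], sort_keys=True) + prev
        if hashlib.sha256(payload.encode()).hexdigest() != link["hash"]:
            return False
        prev = link["hash"]
    return True

log = chain_hash([{"layer": 7, "source": "customer_data"},
                  {"layer": 8, "loss": "binary_crossentropy"}])
```

Because each hash depends on all earlier entries, a verifier can prove lineage integrity by replaying the chain from the genesis value.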

What Gets Logged

Layer 6: ML Framework

  • Framework versions (PyTorch, TensorFlow, etc.)
  • CUDA availability and configuration
  • Default parameters and numerical precision
  • System environment (Python version, OS, processor)
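
Capturing the Layer 6 facts above amounts to querying the runtime; a minimal sketch (the field names are assumptions, not ISAF's actual schema):

```python
import platform
import sys

def capture_framework_environment():
    """Illustrative Layer 6 snapshot: Python version, OS, processor,
    and (if installed) ML framework version and CUDA availability."""
    env = {
        "python_version": sys.version.split()[0],
        "os": platform.system(),
        "processor": platform.processor() or "unknown",
    }
    try:
        import torch  # optional framework probe
        env["torch_version"] = torch.__version__
        env["cuda_available"] = torch.cuda.is_available()
    except ImportError:
        pass
    return env
```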

Layer 7: Training Data

  • Data source and version
  • Dataset shape, dtypes, missing values
  • Data hash for provenance tracking
  • Preprocessing operations
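
A provenance hash of the kind logged at Layer 7 can be computed over the raw dataset bytes; this sketch hashes a file in chunks so large datasets never need to fit in memory (ISAF's exact hashing scheme may differ):

```python
import hashlib

def dataset_fingerprint(path, chunk_size=1 << 20):
    """Illustrative provenance hash: SHA-256 over the raw file bytes,
    read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Re-running the fingerprint later detects any silent change to the training data file.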

Layer 8: Objective Function

  • Loss function name and mathematical form
  • Constraints and regularization terms
  • Hyperparameters (learning rate, batch size, etc.)
  • Business justification
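
Constraint strings like the `"fairness < 0.05"` from the Quick Start can be checked against measured metrics with a small evaluator. This sketch assumes a simple `metric OP bound` syntax and is not ISAF's actual constraint engine:

```python
import operator

def check_constraints(constraints, metrics):
    """Evaluate constraint strings such as "fairness < 0.05" against a
    dict of measured metric values; returns {constraint: bool}."""
    ops = {"<=": operator.le, ">=": operator.ge,
           "<": operator.lt, ">": operator.gt}
    results = {}
    for spec in constraints:
        # check two-character operators first so "<=" is not read as "<"
        for symbol in ("<=", ">=", "<", ">"):
            if symbol in spec:
                name, bound = (part.strip() for part in spec.split(symbol))
                results[spec] = ops[symbol](metrics[name], float(bound))
                break
    return results
```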

Layer 9: Deployment/Inference (NEW)

  • Decision thresholds and confidence cutoffs
  • Human oversight configuration
  • Model version and deployment environment
  • Inference mode (single, batch, streaming)
  • Fallback actions and escalation rules
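
The threshold, oversight, and fallback settings above combine into a decision policy. This is a sketch of what such routing can look like; the parameter names mirror the `@isaf.log_inference` decorator, but the logic here is an illustration, not ISAF's implementation:

```python
def route_prediction(confidence, threshold=0.5, review_threshold=0.7,
                     fallback_action="flag"):
    """Illustrative Layer 9 routing policy."""
    if confidence < threshold:
        return fallback_action      # too uncertain to act at all
    if confidence < review_threshold:
        return "human_review"       # EU AI Act Article 14 oversight path
    return "auto_decision"          # confident enough to proceed
```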

Compliance Mappings

ISAF automatically maps your logged data to regulatory requirements:

  • EU AI Act: Article 10 (Data Governance), Article 11 (Technical Documentation)
  • NIST AI RMF: MEASURE-2.2, GOVERN-1.1
  • ISO 42001: Section 8.4 (Control of externally provided AI)
  • Colorado AI Act: SB24-205 (Impact Assessment Documentation)

CLI Tools

# Inspect lineage file
isaf inspect compliance_report.json

# Verify cryptographic integrity
isaf verify compliance_report.json

# Export from database
isaf export-from-db lineage.db --output report.json

# List sessions
isaf list-sessions lineage.db

Advanced Usage

Custom Storage Backend

# SQLite (default)
isaf.init(backend='sqlite', db_path='my_lineage.db')

# MLflow
isaf.init(backend='mlflow', tracking_uri='http://localhost:5000')

# Memory only (testing)
isaf.init(backend='memory')

Compliance Export

# Export with compliance mappings
isaf.export(
    'compliance_report.json',
    include_hash_chain=True,
    compliance_mappings=['eu_ai_act', 'nist_ai_rmf', 'iso_42001']
)

Verification

# Verify lineage integrity
verified = isaf.verify_lineage('compliance_report.json')
print(f"Verification: {'PASSED' if verified else 'FAILED'}")

Examples

PyTorch Example

import torch
import isaf

isaf.init()

@isaf.log_data(source='internal', version='1.0')
def load_data():
    # X and y are assumed to be pre-loaded feature and label tensors
    return torch.utils.data.TensorDataset(X, y)

@isaf.log_objective(name='cross_entropy')
def train(model, data):
    optimizer = torch.optim.Adam(model.parameters())
    for epoch in range(10):
        # training loop
        pass
    return model

model = torch.nn.Linear(20, 2)  # replace with your model
data = load_data()
model = train(model, data)
isaf.export('pytorch_lineage.json')

scikit-learn Example

from sklearn.ensemble import RandomForestClassifier
import isaf

isaf.init()

@isaf.log_data(source='synthetic', version='1.0')
def load_data():
    from sklearn.datasets import make_classification
    return make_classification(n_samples=1000, n_features=20)

@isaf.log_objective(name='gini_impurity', constraints=['max_depth=10'])
def train_model(X, y):
    model = RandomForestClassifier(max_depth=10)
    model.fit(X, y)
    return model

X, y = load_data()
model = train_model(X, y)
isaf.export('sklearn_lineage.json')

Inference with Human Oversight

import isaf

isaf.init()

# Log inference with EU AI Act Article 14 compliance
@isaf.log_inference(
    threshold=0.5,
    human_oversight=True,
    review_threshold=0.7,  # Flag for human review below this confidence
    model_version="2.0.0",
    model_name="loan_classifier",
    fallback_action="flag"  # What to do when confidence is low
)
def classify_loan_application(application_data):
    prediction = model.predict(application_data)
    confidence = model.predict_proba(application_data).max()
    return {'prediction': prediction, 'confidence': confidence}

# Multi-class thresholds
@isaf.log_inference(
    thresholds={'approve': 0.8, 'deny': 0.9, 'review': 0.5},
    human_oversight=True,
    inference_mode='batch'
)
def batch_classify(applications):
    return model.predict(applications)

result = classify_loan_application(new_application)
isaf.export('inference_lineage.json', compliance_mappings=['eu_ai_act'])

Contributing

Contributions welcome! Please read CONTRIBUTING.md first.

License

MIT License - see LICENSE file for details.

Citation

If you use ISAF Logger in your research, please cite:

@software{isaf_logger,
  title = {ISAF Logger: Instruction Stack Audit Framework},
  author = {HAIEC Lab},
  year = {2025},
  url = {https://github.com/haiec/isaf-logger}
}


Built by HAIEC - Human AI Ethics & Compliance
