
Project description

ISAF Logger

Instruction Stack Audit Framework - Automatic compliance logging for AI systems

Requires Python 3.8+ | License: MIT

Add 3 lines of code, get EU AI Act-ready documentation with cryptographic verification.

Quick Start

import pandas as pd

import isaf

# Initialize (one line)
isaf.init()

# Add decorators to your training functions
@isaf.log_data(source="customer_data", version="3.2.1")
def load_training_data():
    return pd.read_csv("data.csv")

@isaf.log_objective(name="binary_crossentropy", constraints=["fairness < 0.05"])
def train_model(data):
    model = create_model()
    model.fit(data)
    return model

# Run training as normal
data = load_training_data()
model = train_model(data)

# Export compliance report (one line)
isaf.export("compliance_report.json")

Installation

pip install isaf-logger

Features

  • 3 Lines of Code: Minimal integration with existing ML pipelines
  • Full Stack Coverage: Logs Layer 6 (Framework), Layer 7 (Data), Layer 8 (Objectives)
  • Cryptographic Verification: SHA-256 hash chains prove lineage integrity
  • Compliance Ready: Maps to EU AI Act, NIST AI RMF, ISO 42001, Colorado AI Act
  • Framework Agnostic: Works with PyTorch, TensorFlow, JAX, scikit-learn
  • Flexible Storage: SQLite for local, MLflow for production
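The hash-chain idea behind the Cryptographic Verification bullet can be sketched generically: each log entry is hashed together with the previous entry's hash, so altering any record invalidates every hash that follows it. This is a minimal illustration with Python's standard library, not ISAF's actual implementation:

```python
import hashlib
import json

def chain_hash(records):
    """Return a SHA-256 hash chain: each entry hashes the record
    together with the previous hash, so editing any record changes
    every hash after it."""
    prev = "0" * 64  # genesis value
    hashes = []
    for rec in records:
        payload = json.dumps(rec, sort_keys=True) + prev
        prev = hashlib.sha256(payload.encode()).hexdigest()
        hashes.append(prev)
    return hashes

records = [{"layer": 6, "framework": "torch"},
           {"layer": 7, "source": "data.csv"}]
original = chain_hash(records)

records[1]["source"] = "tampered.csv"  # simulate tampering
tampered = chain_hash(records)
```

Because record 0 is untouched, its hash is unchanged, but the tampered record's hash (and any after it) no longer matches — which is what `isaf verify` checks for.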

What Gets Logged

Layer 6: ML Framework

  • Framework versions (PyTorch, TensorFlow, etc.)
  • CUDA availability and configuration
  • Default parameters and numerical precision
  • System environment (Python version, OS, processor)

Layer 7: Training Data

  • Data source and version
  • Dataset shape, dtypes, missing values
  • Data hash for provenance tracking
  • Preprocessing operations

Layer 8: Objective Function

  • Loss function name and mathematical form
  • Constraints and regularization terms
  • Hyperparameters (learning rate, batch size, etc.)
  • Business justification
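For orientation, a Layer 7 entry might look roughly like the sketch below. The field names are illustrative guesses based on the bullets above, not the library's actual schema:

```python
# Illustrative sketch of a Layer 7 (Training Data) entry.
# Field names are hypothetical, not ISAF's actual schema.
layer7_entry = {
    "layer": 7,
    "source": "customer_data",      # from @isaf.log_data(source=...)
    "version": "3.2.1",             # from @isaf.log_data(version=...)
    "shape": (10000, 42),           # rows x columns of the dataset
    "dtypes": {"age": "int64", "income": "float64"},
    "missing_values": 17,           # count of NaN/null cells
    "data_hash": "sha256:…",        # provenance hash of the raw data
    "preprocessing": ["dropna", "standard_scale"],
}
```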

Compliance Mappings

ISAF automatically maps your logged data to regulatory requirements:

  • EU AI Act: Article 10 (Data Governance), Article 11 (Technical Documentation)
  • NIST AI RMF: MEASURE-2.2, GOVERN-1.1
  • ISO 42001: Section 8.4 (Control of externally provided AI)
  • Colorado AI Act: SB24-205 (Impact Assessment Documentation)

CLI Tools

# Inspect lineage file
isaf inspect compliance_report.json

# Verify cryptographic integrity
isaf verify compliance_report.json

# Export from database
isaf export-from-db lineage.db --output report.json

# List sessions
isaf list-sessions lineage.db

Advanced Usage

Custom Storage Backend

# SQLite (default)
isaf.init(backend='sqlite', db_path='my_lineage.db')

# MLflow
isaf.init(backend='mlflow', tracking_uri='http://localhost:5000')

# Memory only (testing)
isaf.init(backend='memory')

Automation Rules

# Create an automation rule (must run inside an async function;
# `ks` is the automation client handle obtained at setup)
await ks.createRule({
    'metricName': 'accuracy',
    'thresholdValue': 0.70,
    'thresholdOperator': '<',
    'layer1Action': 'throttle_50',
    'minDurationSeconds': 30
})

Compliance Export

# Export with compliance mappings
isaf.export(
    'compliance_report.json',
    include_hash_chain=True,
    compliance_mappings=['eu_ai_act', 'nist_ai_rmf', 'iso_42001']
)

Verification

# Verify lineage integrity
verified = isaf.verify_lineage('compliance_report.json')
print(f"Verification: {'PASSED' if verified else 'FAILED'}")

Examples

PyTorch Example

import torch
import isaf

isaf.init()

@isaf.log_data(source='internal', version='1.0')
def load_data():
    X = torch.randn(100, 4)          # placeholder features
    y = torch.randint(0, 2, (100,))  # placeholder labels
    return torch.utils.data.TensorDataset(X, y)

@isaf.log_objective(name='cross_entropy')
def train(model, data):
    optimizer = torch.optim.Adam(model.parameters())
    for epoch in range(10):
        # training loop
        pass
    return model

model = torch.nn.Linear(4, 2)
data = load_data()
model = train(model, data)
isaf.export('pytorch_lineage.json')

scikit-learn Example

from sklearn.ensemble import RandomForestClassifier
import isaf

isaf.init()

@isaf.log_data(source='synthetic', version='1.0')
def load_data():
    from sklearn.datasets import make_classification
    return make_classification(n_samples=1000, n_features=20)

@isaf.log_objective(name='gini_impurity', constraints=['max_depth=10'])
def train_model(X, y):
    model = RandomForestClassifier(max_depth=10)
    model.fit(X, y)
    return model

X, y = load_data()
model = train_model(X, y)
isaf.export('sklearn_lineage.json')

Contributing

Contributions welcome! Please read CONTRIBUTING.md first.

License

MIT License - see LICENSE file for details.

Citation

If you use ISAF Logger in your research, please cite:

@software{isaf_logger,
  title = {ISAF Logger: Instruction Stack Audit Framework},
  author = {HAIEC Lab},
  year = {2025},
  url = {https://github.com/haiec/isaf-logger}
}


Built by HAIEC - Human AI Ethics & Compliance


Download files

Download the file for your platform.

Source Distribution

isaf_logger-0.1.0.tar.gz (21.5 kB)

Built Distribution

isaf_logger-0.1.0-py3-none-any.whl (22.7 kB)

File details

Details for the file isaf_logger-0.1.0.tar.gz.

File metadata

  • Download URL: isaf_logger-0.1.0.tar.gz
  • Upload date:
  • Size: 21.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.10

File hashes

Hashes for isaf_logger-0.1.0.tar.gz
  • SHA256: 80d29a78539a0362e27ed2c6bcf54ee866655d8d5047676948074650cef575c1
  • MD5: adeb15a7b76b1573433a29bcf48362c2
  • BLAKE2b-256: 8338b09eb4bc96557eb5911a1cf2b1cb09fc653f924e26d6ed78cf43d979bcac

File details

Details for the file isaf_logger-0.1.0-py3-none-any.whl.

File metadata

  • Download URL: isaf_logger-0.1.0-py3-none-any.whl
  • Upload date:
  • Size: 22.7 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/6.2.0 CPython/3.12.10

File hashes

Hashes for isaf_logger-0.1.0-py3-none-any.whl
  • SHA256: 9ef8757be481ba5b7fa824e5cda3f78f18347b52fede6a1375b165be289af831
  • MD5: 222505e0bfaedb67c9465cd85b6874ba
  • BLAKE2b-256: 79a9311e7f31f357d522357afdeb662c382ec860d65db4f403442a3f1f17b41c
