
A Python SDK for Deep Learning Backtrace


AryaXai-Backtrace

A Backtrace module for generating explainability on deep learning models built with TensorFlow or PyTorch.


Overview

The Backtrace Module is a powerful and patent-pending algorithm developed by AryaXAI for enhancing the explainability of AI models, particularly in the context of complex techniques like deep learning.

Features

  • Explainability: Gain deep insights into your AI models with the Backtrace algorithm, which provides multiple explanations for their decisions.

  • Consistency: Ensure consistent and accurate explanations across different scenarios and use cases.

  • Mission-Critical Support: Tailored for mission-critical AI use cases where transparency is paramount.

Installation

To integrate the Backtrace Module into your project, install it from PyPI:

pip install dl-backtrace
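As an optional sanity check (a minimal sketch, not part of the official instructions), you can confirm the installed version from Python:

from importlib.metadata import version

# "dl-backtrace" is the distribution name used with pip above.
print(version("dl-backtrace"))  # e.g. 0.0.19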

Usage

TensorFlow-Keras based models

from dl_backtrace.tf_backtrace import Backtrace as B

PyTorch based models

from dl_backtrace.pytorch_backtrace import Backtrace as B

Evaluating using Backtrace:

  1. Step 1: Initialize a Backtrace object using your model:

     backtrace = B(model=model)

  2. Step 2: Calculate layer-wise outputs using a data instance:

     layer_outputs = backtrace.predict(test_data[0])

  3. Step 3: Calculate layer-wise relevance using eval:

     relevance = backtrace.eval(layer_outputs, mode='default', scaler=1, thresholding=0.5, task="binary-classification")

Depending on the task, eval accepts several attributes for the relevance calculation:

Attribute | Description | Values
mode | Evaluation mode of the algorithm | {default, contrastive}
scaler | Total / starting relevance assigned at the last layer | Integer (default: None, preferred: 1)
thresholding | Threshold on the model prediction used to select the pixels predicting the actual class (only applies to segmentation tasks) | Default: 0.5
task | The task of the model | {binary-classification, multi-class classification, bbox-regression, binary-segmentation}
model-type | Type of the model | {Encoder, Encoder_Decoder}
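
Putting the three steps together, the snippet below is a minimal, self-contained sketch for TensorFlow-Keras. The toy model and random data are hypothetical and exist only to make the sketch runnable; the Backtrace calls themselves are exactly those shown in the steps above.

import numpy as np
import tensorflow as tf
from dl_backtrace.tf_backtrace import Backtrace as B

# Hypothetical toy data and binary classifier (Dense layers only).
x_train = np.random.rand(64, 10).astype("float32")
y_train = np.random.randint(0, 2, size=(64, 1))
model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, activation="relu", input_shape=(10,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")
model.fit(x_train, y_train, epochs=1, verbose=0)

test_data = np.random.rand(5, 10).astype("float32")

backtrace = B(model=model)                       # Step 1: wrap the trained model
layer_outputs = backtrace.predict(test_data[0])  # Step 2: layer-wise outputs for one instance
relevance = backtrace.eval(                      # Step 3: layer-wise relevance
    layer_outputs,
    mode="default",
    scaler=1,
    thresholding=0.5,
    task="binary-classification",
)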

Example Notebooks:

TensorFlow-Keras:

Name | Task | Link
Backtrace Loan Classification Tabular Dataset | Binary Classification | Colab Link
Backtrace Image FMNIST Dataset | Multi-Class Classification | Colab Link
Backtrace CUB Bounding Box Regression Image Dataset | Single Object Detection | Colab Link
Backtrace Next Word Generation Textual Dataset | Next Word Generation | Colab Link
Backtrace ImDB Sentiment Classification Textual Dataset | Sentiment Classification | Colab Link
Backtrace Binary Classification Textual Dataset | Binary Classification | Colab Link
Backtrace Multi-Class NewsGroup20 Classification Textual Dataset | Multi-Class Classification | Colab Link
Backtrace CVC-ClinicDB Colonoscopy Binary Segmentation | Organ Segmentation | Colab Link
Backtrace CamVid Road Car Binary Segmentation | Binary Segmentation | Colab Link
Backtrace Transformer Encoder for Sentiment Analysis | Binary Classification | Colab Link
Backtrace Transformer Encoder-Decoder Model for Neural Machine Translation | Neural Machine Translation | Colab Link
Backtrace Transformer Encoder-Decoder Model for Text Summarization | Text Summarization | Colab Link

PyTorch:

Name | Task | Link
Backtrace Tabular Dataset | Binary Classification | Colab Link
Backtrace Image Dataset | Multi-Class Classification | Colab Link
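
The PyTorch workflow mirrors the TensorFlow-Keras one. The sketch below is a hypothetical minimal example (toy model, random data) under that assumption; the exact input format expected by predict may differ slightly, so consult the Colab notebooks above for a verified end-to-end run.

import torch
import torch.nn as nn
from dl_backtrace.pytorch_backtrace import Backtrace as B

# Hypothetical toy binary classifier built from supported layers (Linear).
model = nn.Sequential(
    nn.Linear(10, 16),
    nn.ReLU(),
    nn.Linear(16, 1),
    nn.Sigmoid(),
)
model.eval()

test_data = torch.rand(5, 10)

backtrace = B(model=model)                       # Step 1: wrap the model
layer_outputs = backtrace.predict(test_data[0])  # Step 2: layer-wise outputs
relevance = backtrace.eval(                      # Step 3: layer-wise relevance
    layer_outputs,
    mode="default",
    scaler=1,
    task="binary-classification",
)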

For more detailed examples and use cases, check out our documentation.

Supported Layers and Future Work:

TensorFlow-Keras:

  • Dense (Fully Connected) Layer
  • Convolutional Layer (Conv2D, Conv1D)
  • Transpose Convolutional Layer (Conv2DTranspose, Conv1DTranspose)
  • Reshape Layer
  • Flatten Layer
  • Global Max Pooling (2D & 1D) Layer
  • Global Average Pooling (2D & 1D) Layer
  • Max Pooling (2D & 1D) Layer
  • Average Pooling (2D & 1D) Layer
  • Concatenate Layer
  • Add Layer
  • Long Short-Term Memory (LSTM) Layer
  • Dropout Layer
  • Embedding Layer
  • TextVectorization Layer
  • Self-Attention Layer
  • Cross-Attention Layer
  • Feed-Forward Layer
  • Pooler Layer
  • Decoder LM (Language Model) Head
  • Other Custom Layers
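
To illustrate how several of the layers above combine in practice, the following hypothetical sketch wraps a small Keras CNN (Conv2D, MaxPooling2D, Flatten, Dense) with Backtrace, reusing the same three-step workflow from the Usage section; the model and data are illustrative only.

import numpy as np
import tensorflow as tf
from dl_backtrace.tf_backtrace import Backtrace as B

# Hypothetical multi-class image classifier built from supported layers.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(8, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

test_data = np.random.rand(4, 28, 28, 1).astype("float32")

backtrace = B(model=model)
layer_outputs = backtrace.predict(test_data[0])
relevance = backtrace.eval(
    layer_outputs,
    mode="default",
    scaler=1,
    task="multi-class classification",  # value as listed in the attribute table above
)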

PyTorch:

(Note: currently only binary and multi-class classification are supported in PyTorch; segmentation and single object detection will be supported in the next release.)

  • Linear (Fully Connected) Layer
  • Convolutional Layer (Conv2d)
  • Reshape Layer
  • Flatten Layer
  • Global Average Pooling 2D Layer (AdaptiveAvgPool2d)
  • Max Pooling 2D Layer (MaxPool2d)
  • Average Pooling 2D Layer (AvgPool2d)
  • Concatenate Layer
  • Add Layer
  • Long Short-Term Memory (LSTM) Layer
  • Dropout Layer
  • Embedding Layer
  • EmbeddingBag Layer
  • 1d Convolution Layer (Conv1d)
  • 1d Pooling Layers (AvgPool1d, MaxPool1d, AdaptiveAvgPool1d, AdaptiveMaxPool1d)
  • Transpose Convolution Layers (ConvTranspose2d, ConvTranspose1d)
  • Global Max Pooling 2D Layer (AdaptiveMaxPool2d)
  • Other Custom Layers

Getting Started

If you are new to Backtrace, head over to our Getting Started Guide to quickly set up and use the module in your projects.

Contributing

We welcome contributions from the community. To contribute, please follow our Contribution Guidelines.

License

This project is licensed under the MIT License - see the LICENSE file for details.

Contact

For any inquiries or support, please contact AryaXAI Support.

