
An LLM explains the stack trace for every exception: it automatically takes your code and the error message and tries to suggest a solution.

Project description

LLM Exceptions

LLM Exceptions is a Python package that enhances your debugging experience by automatically explaining stack traces for all exceptions. It leverages large language models to provide explanations and potential solutions for errors in your code. This tool is particularly useful for Python beginners who may feel overwhelmed by the many errors they encounter.

Features

  • Automatic Error Analysis: Automatically captures stack traces from exceptions and provides detailed explanations and potential solutions.
  • Easy Integration: Simple to set up in Jupyter Notebooks or Google Colab.
  • Explained by LLMs: Uses large language models to generate accurate and helpful explanations.

Installation

You can install the package using pip:

pip install llm-exceptions

Setup

To use the package, you'll need to set the HF_TOKEN environment variable with your Hugging Face API token. This token is required to access the large language models. The default model is meta-llama/Meta-Llama-3-8B-Instruct.

You can set the environment variable in your terminal like this:

export HF_TOKEN='your_hugging_face_token'

Or, set it directly in your Python script or Jupyter Notebook:

import os
os.environ['HF_TOKEN'] = 'your_hugging_face_token'
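Hardcoding a token in a notebook risks leaking it if the notebook is shared. A safer pattern is to read the token from the environment and prompt for it only when it is missing (a generic sketch, not part of the package's API):

```python
import os
from getpass import getpass

def get_hf_token() -> str:
    """Return the Hugging Face token, prompting only when it is not already set."""
    token = os.environ.get("HF_TOKEN")
    if not token:
        token = getpass("Hugging Face token: ")  # input is not echoed to the screen
        os.environ["HF_TOKEN"] = token
    return token
```

This keeps the secret out of the notebook's saved cells while still making it available to the extension through the `HF_TOKEN` environment variable.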

Usage

To use LLM Exceptions in a Jupyter Notebook or Google Colab, load the extension with the following magic command:

%load_ext llm_exceptions

Once the extension is loaded, simply run your code as usual. If an exception occurs, LLM Exceptions will automatically analyze the stack trace and provide a detailed explanation along with potential solutions.
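Under the hood, tools like this hook into Python's exception handling. A minimal, hypothetical sketch of the idea using the standard `sys.excepthook` (the actual package registers itself through IPython's extension mechanism, and `explain_with_llm` here is a stand-in for a real model call):

```python
import sys
import traceback

def explain_with_llm(trace: str) -> str:
    # Stand-in for a call to a hosted LLM; a real implementation would
    # send `trace` to a model and return its answer.
    return "LLM Explanation: review the last frame of the traceback above."

def llm_excepthook(exc_type, exc_value, exc_tb):
    # Format the full traceback, print it as usual, then append the explanation.
    trace = "".join(traceback.format_exception(exc_type, exc_value, exc_tb))
    sys.__excepthook__(exc_type, exc_value, exc_tb)
    print(explain_with_llm(trace))

sys.excepthook = llm_excepthook  # every uncaught exception now gets an explanation
```

The extension's magic command plays the same role as the last line: it installs a handler once, after which no further changes to your code are needed.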

Example

  1. Load the extension in your notebook:

    %load_ext llm_exceptions
    
  2. Run some code that produces an error:

    def divide(a, b):
        return a / b
    
    divide(5, 0)
    
  3. When the error occurs, LLM Exceptions will provide an explanation:

    ZeroDivisionError: division by zero
    

    LLM Explanation: The error ZeroDivisionError occurs because you are attempting to divide a number by zero, which is mathematically undefined. To fix this error, ensure that the divisor b is not zero before performing the division.
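The suggested fix amounts to guarding the divisor before dividing (a generic sketch of the guard the explanation describes):

```python
def divide(a, b):
    # Check the divisor first, as the explanation suggests.
    if b == 0:
        raise ValueError("divisor b must be non-zero")
    return a / b

divide(5, 2)  # returns 2.5; divide(5, 0) now raises a clear ValueError
```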

Citations

If you use LLM Exceptions in your research or project, please consider citing it as follows:

@software{llm_exceptions,
author = {Davood Wadi},
title = {LLM Exceptions: Automatic Stack Trace Analysis and Solutions},
year = {2024},
url = {https://github.com/davoodwadi/llm_exceptions},
note = {Version 0.0.4}
}

License

This project is licensed under the Apache-2.0 license. See the LICENSE file for details.

Contributing

Contributions are welcome! Please feel free to submit a Pull Request or open an issue if you encounter any problems or have suggestions for improvements.

Acknowledgements

This package uses large language models provided by Hugging Face. Make sure to sign up on their platform to obtain the necessary API token.
