An LLM explains the stack trace for every exception: it automatically takes your input code and the error and tries to suggest a solution.
LLM Exceptions
LLM Exceptions is a Python package that enhances your debugging experience by automatically explaining stack traces for all exceptions. It leverages large language models to provide explanations and potential solutions for errors in your code. This tool is particularly useful for Python beginners who are daunted by the countless errors they face.
Features
- Automatic Error Analysis: Automatically captures stack traces from exceptions and provides detailed explanations and potential solutions.
- Easy Integration: Simple to set up in Jupyter Notebooks or Google Colab.
- Explained by LLMs: Uses large language models to generate accurate and helpful explanations.
Installation
You can install the package using pip:
pip install llm-exceptions
Setup
To use the package, you'll need to set the HF_TOKEN environment variable with your Hugging Face API token. This token is required to access the large language models. The default model is meta-llama/Meta-Llama-3-8B-Instruct.
You can set the environment variable in your terminal like this:
export HF_TOKEN='your_hugging_face_token'
Or, set it directly in your Python script or Jupyter Notebook:
import os
os.environ['HF_TOKEN'] = 'your_hugging_face_token'
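Since the extension needs this token to reach the model, it can help to fail fast if it is missing. The helper below is a hypothetical convenience, not part of the package:

```python
import os

def ensure_hf_token():
    """Raise early if HF_TOKEN is not set, instead of failing later
    when the extension tries to call the Hugging Face API."""
    token = os.environ.get("HF_TOKEN")
    if not token:
        raise RuntimeError(
            "HF_TOKEN is not set; export it in your shell or assign "
            "os.environ['HF_TOKEN'] before loading llm_exceptions."
        )
    return token
```

Call it once at the top of your notebook, before `%load_ext llm_exceptions`.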
Usage
To use LLM Exceptions in a Jupyter Notebook or Google Colab, load the extension with the following magic command:
%load_ext llm_exceptions
Once the extension is loaded, simply run your code as usual. If an exception occurs, LLM Exceptions will automatically analyze the stack trace and provide a detailed explanation along with potential solutions.
Example
1. Load the extension in your notebook:
%load_ext llm_exceptions
2. Run some code that produces an error:
def divide(a, b):
    return a / b

divide(5, 0)
3. When the error occurs, LLM Exceptions will provide an explanation:
ZeroDivisionError: division by zero
LLM Explanation: The error ZeroDivisionError occurs because you are attempting to divide a number by zero, which is mathematically undefined. To fix this error, ensure that the divisor b is not zero before performing the division.
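One way to apply the suggested fix is to validate the divisor before dividing. This is a sketch of the repair, not code generated by the package:

```python
def divide(a, b):
    # Guard against the ZeroDivisionError above by rejecting a zero
    # divisor explicitly, with a clearer error message.
    if b == 0:
        raise ValueError("divisor b must be non-zero")
    return a / b
```

With this check, `divide(5, 0)` raises a descriptive ValueError instead of an unguarded ZeroDivisionError.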
Citations
If you use LLM Exceptions in your research or project, please consider citing it as follows:
@software{llm_exceptions,
author = {Davood Wadi},
title = {LLM Exceptions: Automatic Stack Trace Analysis and Solutions},
year = {2024},
url = {https://github.com/davoodwadi/llm_exceptions},
note = {Version 0.0.4}
}
License
This project is licensed under the Apache-2.0 license. See the LICENSE file for details.
Contributing
Contributions are welcome! Please feel free to submit a Pull Request or open an issue if you encounter any problems or have suggestions for improvements.
Acknowledgements
This package uses large language models provided by Hugging Face. Make sure to sign up on their platform to obtain the necessary API token.