
Owlsight

Owlsight is a command-line tool that combines Python programming with open-source language models. It offers an interactive interface that lets you execute Python code, run shell commands, and issue natural language queries in one unified environment. This tool is ideal for those who want to integrate Python with language model capabilities.

Why owlsight?

Picture this: you are someone who dabbles in Python occasionally, or a seasoned Pythonista. You frequently use Generative AI to accelerate your workflow, especially for generating code. But this often involves a tedious process: copying and pasting code between ChatGPT and your IDE, repeatedly switching contexts.

What if you could eliminate this friction?

Owlsight brings Python development and Generative AI together, streamlining your workflow by integrating them into a single, unified platform. No more toggling between windows, no more manual code transfers. With Owlsight, you get the full power of Python and AI in one place, simplifying your process and boosting productivity.

Generate code directly from model prompts and access that code from the Python interpreter, or augment model prompts with Python expressions. With this functionality, open-source models not only generate more accurate responses by executing Python code directly, but can also solve far more complex problems.

Features

  • Interactive CLI: Choose from multiple commands such as Python, shell, and AI model queries.
  • Python Integration: Switch to a Python interpreter and use Python expressions in language model queries.
  • Model Flexibility: Supports models in PyTorch, ONNX, and GGUF formats.
  • Customizable Configuration: Easily modify model and generation settings.

Installation

You can install Owlsight using pip:

pip install owlsight

By default, only the transformers library is installed for working with language models.

To add GGUF functionality:

pip install owlsight[gguf]

To add ONNX functionality:

pip install owlsight[onnx]
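
Both extras can also be installed together using standard pip extras syntax (quoting the requirement prevents bracket expansion in some shells, such as zsh):

pip install "owlsight[gguf,onnx]"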

Usage

After installation, launch Owlsight in the terminal by running the following command:

owlsight

This will present you with the main menu:

Make a choice:
> how can I assist you?
shell
python
config: main
save
load
clear history
quit

Go to config > model and set a model_id to load a model locally or from https://huggingface.co/

Available Commands

  • How can I assist you: Ask a question or give an instruction.
  • shell : Execute shell commands.
  • python : Enter a Python interpreter.
  • config: main : Modify the main, model or generate configuration settings.
  • save/load : Save or load a configuration file.
  • clear history : Clear the chat history and the Python interpreter history.
  • quit : Exit the application.

Example Workflow

You can combine Python variables with language models in Owlsight. For example:

python > a = 42
How can I assist you? > How much is {{a}} * 5?
answer -> 210
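
Another illustrative exchange in the same style (the exact answer text depends on the loaded model):

python > numbers = [3, 5, 7]
How can I assist you? > What is the sum of {{numbers}}?
answer -> 15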

Additionally, you can ask a model to write Python code and access that code in the Python interpreter. All defined objects are saved in the global namespace of the Python interpreter for the remainder of the active session. This is a powerful feature that allows you to build as you go across a wide range of tasks.

Example:

How can I assist you? > Can you write a function which reads an Excel file?

-> model writes a function called read_excel

python > excel_data = read_excel("path/to/excel")
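
For illustration only, the generated read_excel function might resemble the minimal sketch below. This assumes a pandas-based implementation; the actual code depends entirely on the model and the prompt.

import pandas as pd

def read_excel(path: str) -> pd.DataFrame:
    """Read an Excel file into a pandas DataFrame (illustrative sketch)."""
    return pd.read_excel(path)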

Configurations

Owlsight uses a JSON configuration file to adjust various parameters. Here is an example of what the configuration might look like:

{
    "main": {
        "max_retries_on_error": 5,
        "prompt_code_execution": true,
        "extra_index_url": ""
    },
    "model": {
        "model_id": "Orenguteng/Llama-3.1-8B-Lexi-Uncensored-V2-GGUF",
        "save_history": true,
        "system_prompt": "# ROLE:\nYou are an advanced problem-solving AI with expert-level knowledge in various programming languages, particularly Python.\n\n# TASK:\n- Prioritize Python solutions when appropriate.\n- Present code in markdown format.\n- Clearly state when non-Python solutions are necessary.\n- Break down complex problems into manageable steps and think through the solution step-by-step.\n- Adhere to best coding practices, including error handling and consideration of edge cases.\n- Acknowledge any limitations in your solutions.\n- Always aim to provide the best solution to the user's problem, whether it involves Python or not.",
        "transformers__device": null,
        "transformers__quantization_bits": null,
        "gguf__filename": "Llama-3.1-8B-Lexi-Uncensored_V2_Q4.gguf",
        "gguf__verbose": true,
        "gguf__n_ctx": 16384,
        "onnx__tokenizer": "",
        "onnx__verbose": true,
        "onnx__num_threads": 1
    },
    "generate": {
        "stopwords": [],
        "max_new_tokens": 1024,
        "temperature": 0.0,
        "generation_kwargs": {}
    }
}

Configuration files can be saved and loaded through the main menu.

Changing configurations

To update a configuration, simply modify the desired value and press ENTER to confirm the change. Please note that only one configuration setting can be updated at a time, and the change will only take effect once ENTER has been pressed.

Temporary environment

During an Owlsight session, a temporary environment is created within the "site-packages" directory of the active (virtual) environment. Any packages installed during the session are removed when the session ends, ensuring your environment remains clean. If you want to persist installed packages, simply install them outside of Owlsight.

Error Handling and Auto-Fix

Owlsight automatically tries to fix and retry any code that encounters a ModuleNotFoundError by installing the required package and re-executing the code. It can also attempt to fix errors in its own generated code. This feature can be controlled by the max_retries_on_error parameter in the configuration file.
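
As a hypothetical illustration (the package name is arbitrary), code like the snippet below would normally fail with a ModuleNotFoundError when tabulate is not installed. Owlsight installs the missing package, re-executes the code, and retries up to max_retries_on_error times.

import tabulate  # raises ModuleNotFoundError if the package is missing

# After the missing package has been installed, re-execution succeeds:
print(tabulate.tabulate([["owl", 1], ["sight", 2]]))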

RELEASE NOTES

1.0.2

  • Enhanced cross-platform compatibility.
  • Introduced the generate_stream method to all TextGenerationProcessor classes.
  • Various minor bug fixes.
  • Enabled modular imports of individual components from the owlsight library, allowing direct usage of specific functionalities in Python scripts and applications.

1.1.0

  • Added Retrieval Augmented Generation (RAG) for enriching prompts with documentation from Python libraries. This option has also been added to the configuration.
  • History with autocompletion is now also available when writing prompts. Prompts can be autocompleted with TAB.
