Python REPL with LLM integration
pai: A Python REPL with built-in LLM support
Features
- Supports ChatGPT and LlamaCpp
- Generate code in the REPL by typing pai: <prompt>
- Uses your REPL history as context for the LLM prompt
- Lets you review, edit, and confirm all generated code before it runs
- Code executes on your machine
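The history-as-context feature can be pictured as follows. This is an illustrative sketch only; `record` and `build_prompt` are hypothetical names, not pai's internals:

```python
# Hypothetical sketch of history-as-context prompting; not pai's
# actual implementation.
history = []  # (source, result_repr) pairs from prior REPL lines

def record(source, result):
    """Remember an executed statement and its result."""
    history.append((source, repr(result)))

def build_prompt(instruction):
    """Prepend prior REPL activity to the new instruction."""
    context = "\n".join(f">>> {src}\n{out}" for src, out in history)
    return f"{context}\nTask: {instruction}\nRespond with Python code only."

record("nums = [1,2,3,4,5,6,7,8]", [1, 2, 3, 4, 5, 6, 7, 8])
prompt = build_prompt("calc mean, median and mode")
```

Because earlier inputs and outputs travel with the prompt, later requests can refer to variables such as nums without restating them.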
Installation & Usage
$ pip install pai-repl
$ export OPENAI_API_KEY=<api key>
$ pai --chat-gpt gpt-4
Inp [0]>
Examples
Access to REPL history as context
Inp [0]> nums = [1,2,3,4,5,6,7,8]
Inp [2]> ai: calc mean, median and mode. assign each to a var
Gen [2]> from statistics import mean, median, mode
...>
...> mean_val = mean(nums)
...> median_val = median(nums)
...> try:
...>     mode_val = mode(nums)
...> except:
...>     mode_val = 'No mode'
...>
...> mean_val, median_val, mode_val
Out [2]> (4.5, 4.5, 1)
Inp [3]> mean_val
Out [3]> 4.5
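The generated try/except guards statistics.mode, which raised StatisticsError before Python 3.8 when no single value occurred most often; on Python 3.8 and later it returns the first of the equally common values, which is why the session above shows 1. The same computation as a standalone script:

```python
from statistics import mean, median, mode

nums = [1, 2, 3, 4, 5, 6, 7, 8]
# On Python 3.8+ mode() returns the first of equally common values
# (1 here); before 3.8 it raised StatisticsError for this input.
result = (mean(nums), median(nums), mode(nums))
```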
Local filesystem access
Inp [0]> ai: list markdown files in the current directory
Gen [0]> import os
...>
...> # List all markdown files in the current directory
...> markdown_files = [file for file in os.listdir() if file.endswith('.md')]
...>
...> markdown_files
Out [0]> ['README.md']
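Because generated code executes locally, the review-and-confirm step is the safety boundary. A minimal sketch of such a gate, assuming a simple prompt-then-exec flow (confirm_and_run is hypothetical, not pai's actual API):

```python
# Hypothetical confirm-before-execute gate; not pai's actual API.
def confirm_and_run(code, namespace, ask=input):
    """Show generated code and execute it only on explicit approval."""
    print(code)
    if ask("Run this code? [y/N] ").strip().lower() == "y":
        exec(code, namespace)  # runs on the local machine
        return True
    return False

# Example with a canned approval instead of interactive input:
ns = {}
ran = confirm_and_run("total = sum([1, 2, 3])", ns, ask=lambda _: "y")
```

Anything short of an explicit "y" leaves the namespace untouched, which is the behavior the "review, edit and confirm" feature describes.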