Blue Shell is a chat shell for local AI service endpoints
Blue Shell
Blue Shell is an AI chat shell for local AI services. Version 0.0.1 supports Ollama.
Installation
pip install blueshell
Usage
At its simplest, run:
python -m blueshell.shell -m "codellama"
If Ollama isn't listening on the default port (for example, it's on 11435 instead), you can pass a --url parameter like this:
python -m blueshell.shell -m "codellama" --url http://127.0.0.1:11435
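Under the hood, a shell like this talks to Ollama's HTTP API at the given URL. As a rough sketch (the field names follow Ollama's documented /api/generate endpoint; the helper function is illustrative, not blueshell's internals), the request body looks like:

```python
import json

def build_generate_payload(model, prompt, system=None):
    # Minimal request body for Ollama's /api/generate endpoint.
    # "stream": False asks for one complete JSON reply instead of a stream.
    payload = {"model": model, "prompt": prompt, "stream": False}
    if system:
        payload["system"] = system  # optional system prompt
    return payload

body = json.dumps(build_generate_payload("codellama", "Hello"))
```

The same payload works against any host/port you point --url at, which is why the port override is the only change needed for a non-default Ollama.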
To see all options, run with --help:
$ python -m blueshell.shell --help
usage: Blue Shell [-h] [--url URL] [-p PROMPT] [-m MODEL]
[-f {markdown,plain,json}] [-s SYSTEM]
A AI assistant for local ai service
options:
-h, --help show this help message and exit
--url URL
-p PROMPT, --prompt PROMPT
-m MODEL, --model MODEL
-f {markdown,plain,json}, --format {markdown,plain,json}
-s SYSTEM, --system SYSTEM
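The option set above maps naturally onto argparse; a minimal sketch that reproduces it (the default values here are assumptions for illustration, not necessarily blueshell's actual defaults):

```python
import argparse

def make_parser():
    # Mirrors the --help output shown above.
    parser = argparse.ArgumentParser(
        prog="Blue Shell",
        description="An AI assistant for local AI services",
        epilog="Powered By Python",
    )
    parser.add_argument("--url", default="http://127.0.0.1:11434")  # assumed default port
    parser.add_argument("-p", "--prompt")
    parser.add_argument("-m", "--model")
    parser.add_argument("-f", "--format",
                        choices=["markdown", "plain", "json"],
                        default="markdown")  # assumed default
    parser.add_argument("-s", "--system")
    return parser

args = make_parser().parse_args(["-m", "codellama", "-f", "json"])
```

Unknown flags or an invalid -f choice make argparse exit with a usage message, matching the behavior you'd see from the real CLI.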
Powered By Python
You can list all models available in Ollama:
$ python -m blueshell.list
The list command also accepts a --url option:
$ python -m blueshell.list --url http://127.0.0.1:11435
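Listing models corresponds to Ollama's /api/tags endpoint, which returns JSON of the form {"models": [{"name": ...}, ...]}. A sketch of extracting the names from such a response (the function name is illustrative; only the response shape comes from Ollama's API):

```python
def model_names(tags_response):
    # Ollama's /api/tags returns {"models": [{"name": "...", ...}, ...]};
    # pull out just the model names.
    return [m["name"] for m in tags_response.get("models", [])]

sample = {"models": [{"name": "codellama:latest"}, {"name": "llama2:latest"}]}
names = model_names(sample)
```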
What's New
0.0.1
support Ollama
0.0.2
fixed documentation typos
0.0.3
print responses as Markdown
0.0.4
fixed missing dependencies
0.0.5
- add list command
- add format argument
- Ctrl-C interrupts the current response and returns to the REPL
- improved user experience
0.0.6
pretty-print JSON output
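The JSON pretty-printing added in 0.0.6 can be done entirely with the standard library; a sketch of the likely approach (not necessarily blueshell's exact code):

```python
import json

def pretty(raw):
    # Re-serialize with indentation; fall back to the raw text
    # if the response isn't valid JSON.
    try:
        return json.dumps(json.loads(raw), indent=2, ensure_ascii=False)
    except json.JSONDecodeError:
        return raw
```

The fallback matters in a chat shell: model output selected for -f json may still be plain prose, and it should pass through untouched rather than crash the REPL.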
Download files
Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution
blueshell-0.0.6.tar.gz (4.8 kB)

Built Distribution
blueshell-0.0.6-py3-none-any.whl
Hashes for blueshell-0.0.6-py3-none-any.whl

Algorithm | Hash digest
---|---
SHA256 | 056b685a424e6469fd36657e3dc93f42a7b5949c788b097253bca7a3c57c5da6
MD5 | 4351b2bad55309c5bb1924b7e8c11f39
BLAKE2b-256 | 1bac3a10960e4b51674cd2470c2b8e673702066f1c5cacddcaf58317cd38803f