HCLI hai
HCLI hai is a python package wrapper that contains an HCLI sample application (hai); hai is an HCLI for interacting with Anthropic’s Claude models via terminal input and output streams.
HCLI hai wraps hai (an HCLI) and is intended to be used with an HCLI Client [1] as presented via an HCLI Connector [2].
You can find out more about HCLI at hcli.io [3].
[1] https://github.com/cometaj2/huckle
[2] https://github.com/cometaj2/hcli_core
[3] http://hcli.io
Installation
HCLI hai requires a supported version of Python and pip.
You’ll need an HCLI Connector to run hai. For example, you can use HCLI Core (https://github.com/cometaj2/hcli_core), a WSGI server such as Green Unicorn (https://gunicorn.org/), and an HCLI Client like Huckle (https://github.com/cometaj2/huckle).
pip install hcli-hai
pip install hcli-core
pip install huckle
pip install gunicorn
gunicorn --workers=1 --threads=1 -b 127.0.0.1:8000 "hcli_core:connector(\"`hcli_hai path`\")"
Usage
Open a different shell window.
Set up the huckle env eval in your .bash_profile (or other bash configuration) to avoid having to execute eval every time you want to invoke HCLIs by name (e.g. hai).
Note that no CLI is actually installed by Huckle. Huckle reads the HCLI semantics exposed by the API via the HCLI Connector and ends up behaving like the CLI it targets.
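As a sketch, the .bash_profile addition mentioned above can be as simple as the following (assuming huckle is already installed and on your PATH):

```shell
# ~/.bash_profile (or other bash configuration)
# Evaluate huckle's environment on shell startup so installed HCLIs
# (e.g. hai) can be invoked by name without running eval manually.
eval $(huckle env)
```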
huckle cli install http://127.0.0.1:8000
eval $(huckle env)
hai help
Versioning
This project makes use of semantic versioning (http://semver.org) and may make use of the “devx”, “prealphax”, “alphax”, “betax”, and “rcx” extensions, where x is a number (e.g. 0.3.0-prealpha1), on GitHub.
Supports
Chatting via input/output streams (e.g. via pipes).
.hai folder structure in a user’s home directory to help track hai configuration and contexts.
Creating, listing, deleting and changing conversation contexts.
Automatic title creation based on context.
Custom context naming to help organize contexts.
Behavior setting to allow for persistent chatbot behavior (e.g. the Do Anything Now (DAN) prompt).
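Since hai chats via its input and output streams, piping can be sketched as follows. This is illustrative only: it assumes the HCLI Connector from the Installation section is running and that huckle has installed hai; the exact invocation is whatever `hai help` reports.

```shell
# Illustrative: pipe a prompt into hai on stdin and read the reply on stdout.
# Consult `hai help` for the actual commands hai exposes.
echo "What is an HCLI?" | hai
```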
To Do
A memory layer for the AI HCLI (hai).
Automatic context switching via NLP on the received input stream.
Context blending to marry different contexts.
Automatic context compression to fit a more substantial effective memory into each context window.
A shell mode for the AI HCLI (hai) to enable shell CLI execution in pursuit of a sought goal.
Bugs
N/A
File details
Details for the file hcli_hai-2.1.1.tar.gz.
File metadata
- Download URL: hcli_hai-2.1.1.tar.gz
- Upload date:
- Size: 40.3 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 86d96e129ce8d565f031bd23a719b7b91405083d97278b6668b9c6229c305187 |
| MD5 | 05cd819dbc8d51d400cb43e423a35450 |
| BLAKE2b-256 | 82baaa1b1943ad5a9583d872800b6510368755c5e5ce830d0b45da7b97d9c82a |
File details
Details for the file hcli_hai-2.1.1-py2.py3-none-any.whl.
File metadata
- Download URL: hcli_hai-2.1.1-py2.py3-none-any.whl
- Upload date:
- Size: 18.9 kB
- Tags: Python 2, Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.1 CPython/3.11.10
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | 46ae2528cbfa22910cc522bb161af9678f8214d50de7ca0dc33e85cc66e26589 |
| MD5 | cdb0fe29d10de59f94f04ab0746ba8ae |
| BLAKE2b-256 | a162bde0942334f06cbec45810be333d8ec0cbf8e3dc1eccb94429bccfca79e1 |