
HCLI hai is a Python package wrapper that contains an HCLI sample application (hai); hai is an HCLI for interacting with Anthropic's Claude or OpenAI's GPT AI models via terminal input and output streams.

Project description


HCLI hai

HCLI hai is a Python package wrapper that contains an HCLI sample application (hai); hai is an HCLI for interacting with Anthropic's Claude via terminal input and output streams.


HCLI hai wraps hai (an HCLI) and is intended to be used with an HCLI Client [1] as presented via an HCLI Connector [2].

You can find out more about HCLI on hcli.io [3].

[1] https://github.com/cometaj2/huckle

[2] https://github.com/cometaj2/hcli_core

[3] http://hcli.io

Installation

HCLI hai requires a supported version of Python and pip.

You’ll need an HCLI Connector to run hai. For example, you can use HCLI Core (https://github.com/cometaj2/hcli_core), a WSGI server such as Green Unicorn (https://gunicorn.org/), and an HCLI Client like Huckle (https://github.com/cometaj2/huckle).

pip install hcli-hai
pip install hcli-core
pip install huckle
pip install gunicorn
gunicorn --workers=1 --threads=1 -b 127.0.0.1:8000 "hcli_core:connector(\"`hcli_hai path`\")"

Usage

Open a different shell window.

Set up the huckle env eval in your .bash_profile (or other bash configuration) to avoid having to execute eval every time you want to invoke HCLIs by name (e.g. hai).
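For example, a line like the following in ~/.bash_profile (a sketch, assuming a bash shell and that huckle is already installed and on your PATH) makes installed HCLIs callable by name in every new shell:

```shell
# ~/.bash_profile (or ~/.bashrc): expose installed HCLIs (e.g. hai) by name
eval $(huckle env)
```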

Note that no CLI is actually installed by Huckle. Huckle reads the HCLI semantics exposed by the API via HCLI Connector and ends up behaving like the CLI it targets.

huckle cli install http://127.0.0.1:8000
eval $(huckle env)
hai help

Versioning

This project makes use of semantic versioning (http://semver.org) and may make use of the "devx", "prealphax", "alphax", "betax", and "rcx" extensions, where x is a number (e.g. 0.3.0-prealpha1), on GitHub.

Supports

  • Chatting via input/output streams (e.g. via pipes).

  • .hai folder structure in a user's home directory to help track hai configuration and contexts.

  • Creating, listing, deleting and changing conversation contexts.

  • Behavior setting to allow for persistent chatbot behavior (e.g. the Do Anything Now (DAN) prompt).
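For instance, chatting through a pipe might look like the session below. This is an illustrative sketch only: the subcommand name is an assumption, not taken from this documentation; run hai help for the actual command tree.

```
echo "What is an HCLI?" | hai chat    # "chat" is hypothetical; see hai help
```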

To Do

  • A memory layer for the AI HCLI (hai).
    • Automatic context switching per NLP on received input stream.

    • Context blending to marry different contexts.

    • Automatic context compression to yield a more substantial effective memory per context window.

  • A shell mode for the AI HCLI (hai) to enable shell CLI execution per sought goal.

Bugs

N/A



Download files


Source Distribution

hcli_hai-2.1.0.tar.gz (40.3 kB, source)

Built Distribution


hcli_hai-2.1.0-py2.py3-none-any.whl (18.9 kB, Python 2/Python 3)

File details

Details for the file hcli_hai-2.1.0.tar.gz.

File metadata

  • Download URL: hcli_hai-2.1.0.tar.gz
  • Upload date:
  • Size: 40.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for hcli_hai-2.1.0.tar.gz

  • SHA256: 5a78db59832e7e4b7c80feb9d80e18a3eb5f3a5ccec7e0c5902dd0f5050eae06
  • MD5: 922ba5c488506da0cb71ab2ab0e422d9
  • BLAKE2b-256: 67082b49c1e5414b10625dec3bf573b160f47eb1fce955b642e7c7cf5f7edf34

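The published digests can be checked locally before installing. A minimal sketch in Python; the file path in the comment is illustrative (point it at whichever distribution you actually downloaded):

```python
import hashlib


def sha256_of(path, chunk_size=65536):
    """Compute the SHA256 hex digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


# Compare against the digest published on PyPI, e.g.:
# sha256_of("hcli_hai-2.1.0.tar.gz") == "5a78db59832e7e4b7c80feb9d80e18a3eb5f3a5ccec7e0c5902dd0f5050eae06"
```

pip can also enforce hashes itself via its hash-checking mode (pip install --require-hashes with a pinned requirements file).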

File details

Details for the file hcli_hai-2.1.0-py2.py3-none-any.whl.

File metadata

  • Download URL: hcli_hai-2.1.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 18.9 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for hcli_hai-2.1.0-py2.py3-none-any.whl

  • SHA256: 6dec34a16e3b57697fbeee1003555daa8a3f49bc153a49264817b7bc9c706716
  • MD5: a4d1505eb596ab72dcb3702b4b20890a
  • BLAKE2b-256: f14c9fbe0ec3229462f666e09ef5bbff937abdfab06a4d7af1e34f7ca4b9200c

