
Reason this release was yanked:

numpy and python version dependency issues

Project description


HCLI hai

HCLI hai is a Python package wrapper that contains an HCLI sample application (hai); hai is an HCLI for interacting with Anthropic’s Claude Sonnet (default) or OpenAI’s GPT-3.5-Turbo via terminal input and output streams.


HCLI hai wraps hai (an HCLI) and is intended to be used with an HCLI Client [1] as presented via an HCLI Connector [2].

You can find out more about HCLI on hcli.io [3].

[1] https://github.com/cometaj2/huckle

[2] https://github.com/cometaj2/hcli_core

[3] http://hcli.io

Installation

HCLI hai requires a supported version of Python and pip.

You’ll need an HCLI Connector to run hai. For example, you can use HCLI Core (https://github.com/cometaj2/hcli_core), a WSGI server such as Green Unicorn (https://gunicorn.org/), and an HCLI Client like Huckle (https://github.com/cometaj2/huckle).

pip install hcli-hai
pip install hcli-core
pip install huckle
pip install gunicorn
gunicorn --workers=1 --threads=1 -b 127.0.0.1:8000 "hcli_core:connector(\"`hcli_hai path`\")"

Usage

Open a different shell window.

Set up the huckle env eval in your .bash_profile (or other bash configuration) to avoid having to execute eval every time you want to invoke HCLIs by name (e.g. hai).

Note that no CLI is actually installed by Huckle. Huckle reads the HCLI semantics exposed by the API via HCLI Connector and ends up behaving like the CLI it targets.

huckle cli install http://127.0.0.1:8000
eval $(huckle env)
hai help

Versioning

This project makes use of semantic versioning (http://semver.org) and may make use of the “devx”, “prealphax”, “alphax”, “betax”, and “rcx” extensions, where x is a number (e.g. 0.3.0-prealpha1), on GitHub.
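To illustrate, the version strings described above can be recognized with a short pattern; this is a minimal sketch (the regex and parse helper are mine, not part of the project):

```python
import re

# MAJOR.MINOR.PATCH with an optional pre-release extension drawn from the
# names listed in the Versioning section: devx, prealphax, alphax, betax, rcx.
SEMVER = re.compile(
    r"^(\d+)\.(\d+)\.(\d+)"
    r"(?:-(dev|prealpha|alpha|beta|rc)(\d+))?$"
)

def parse(version):
    """Return (major, minor, patch, stage, n), or None if the string doesn't match."""
    m = SEMVER.match(version)
    if not m:
        return None
    major, minor, patch, stage, n = m.groups()
    return (int(major), int(minor), int(patch), stage, int(n) if n else None)

print(parse("1.0.0"))            # (1, 0, 0, None, None)
print(parse("0.3.0-prealpha1"))  # (0, 3, 0, 'prealpha', 1)
```

Note that "prealpha" is listed before "alpha" in the alternation so the longer name is tried first.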

Supports

  • Chatting by sending command line input streams (e.g. via pipes).

  • Getting and setting a context to setup a new conversation or to save a conversation.

  • Behavior setting to allow for persistent chatbot behavior (e.g. the Do Anything Now (DAN) prompt).

To Do

  • A memory layer for the AI HCLI (hai).
    • Automatic context switching per NLP on received input stream.

    • Context blending to marry different contexts.

    • Automatic context compression to fit more conversational memory within a given context window.

  • Additional commands to better save and restore conversations/contexts.

  • A shell mode for the AI HCLI (hai) to enable shell CLI execution per sought goal.

Bugs

N/A

Project details


Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hcli_hai-1.0.0.tar.gz (14.4 kB)

Uploaded: Source

Built Distribution

If you're not sure about the file name format, learn more about wheel file names.

hcli_hai-1.0.0-py2.py3-none-any.whl (15.5 kB)

Uploaded: Python 2, Python 3

File details

Details for the file hcli_hai-1.0.0.tar.gz.

File metadata

  • Download URL: hcli_hai-1.0.0.tar.gz
  • Upload date:
  • Size: 14.4 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for hcli_hai-1.0.0.tar.gz
Algorithm Hash digest
SHA256 6b1dc3ab5f015e58e5ba9ce72d2f36484580d608e8ea5b68ebea951291ee918c
MD5 cf1dff7eabd611f072e8e71630b6bea8
BLAKE2b-256 a5ca760f4bdcf2effbade3ded94c0f04e79c7e5213e449b6fc0962b22aebe1b2

See more details on using hashes here.
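The digests above can be checked against a downloaded file before installing from it. A minimal sketch using Python's hashlib (the local filename is an assumption about where the download was saved):

```python
import hashlib

def sha256_of(path, chunk_size=65536):
    """Compute the SHA256 hex digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the SHA256 listed above for the source distribution:
expected = "6b1dc3ab5f015e58e5ba9ce72d2f36484580d608e8ea5b68ebea951291ee918c"
# assert sha256_of("hcli_hai-1.0.0.tar.gz") == expected
```

Reading in chunks keeps memory use constant regardless of file size; for a 14.4 kB tarball it hardly matters, but the same helper works for arbitrarily large downloads.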

File details

Details for the file hcli_hai-1.0.0-py2.py3-none-any.whl.

File metadata

  • Download URL: hcli_hai-1.0.0-py2.py3-none-any.whl
  • Upload date:
  • Size: 15.5 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.1 CPython/3.11.10

File hashes

Hashes for hcli_hai-1.0.0-py2.py3-none-any.whl
Algorithm Hash digest
SHA256 6fd8c186abacb911ce4474c3e94574f686c6f203eec5f6ba778c06685cf1a2e5
MD5 e0e3f02517606eef66dd700d4882ce57
BLAKE2b-256 cf9da2e68415f8f5d1e40d297ec2706ec01026fa6a4236ea5439b929cb5a627a

See more details on using hashes here.
