
Project description


HCLI hai

HCLI hai is a Python package wrapper that contains an HCLI sample application (hai); hai is an HCLI for interacting with Anthropic’s Claude or OpenAI’s GPT AI models via terminal input and output streams.


HCLI hai wraps hai (an HCLI) and is intended to be used with an HCLI Client [1] as presented via an HCLI Connector [2].

You can find out more about HCLI at hcli.io [3].

[1] https://github.com/cometaj2/huckle

[2] https://github.com/cometaj2/hcli_core

[3] http://hcli.io

Installation

HCLI hai requires a supported version of Python and pip.

You’ll need an HCLI Connector to run hai. For example, you can use HCLI Core (https://github.com/cometaj2/hcli_core), a WSGI server such as Green Unicorn (https://gunicorn.org/), and an HCLI Client like Huckle (https://github.com/cometaj2/huckle).

pip install hcli-hai
pip install hcli-core
pip install huckle
pip install gunicorn
gunicorn --workers=1 --threads=1 -b 127.0.0.1:8000 "hcli_core:connector(\"`hcli_hai path`\")"

Usage

Open a different shell window.

Set up the huckle env eval in your .bash_profile (or other shell configuration) to avoid having to execute eval every time you want to invoke HCLIs by name (e.g. hai).

Note that no CLI is actually installed by Huckle. Huckle reads the HCLI semantics exposed by the API via HCLI Connector and ends up behaving like the CLI it targets.
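If you use bash, the eval can go directly in your startup file; a minimal sketch, assuming huckle is already installed and on your PATH:

```shell
# Append to ~/.bash_profile (or ~/.bashrc) so HCLIs installed via huckle
# (e.g. hai) resolve by name in every new shell session.
eval $(huckle env)
```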

huckle cli install http://127.0.0.1:8000
eval $(huckle env)
hai help

Versioning

This project makes use of semantic versioning (http://semver.org) and may make use of the “devx”, “prealphax”, “alphax”, “betax”, and “rcx” extensions, where x is a number (e.g. 0.3.0-prealpha1), on GitHub.
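As an illustration only (not part of the project’s tooling), version strings of the shape described above can be matched with a simple regular expression; the exact pattern below is an assumption about the format, inferred from the example 0.3.0-prealpha1:

```python
import re

# Matches versions like "1.5.2" or "0.3.0-prealpha1": three numeric
# components, optionally followed by one of the named pre-release
# extensions ("dev", "prealpha", "alpha", "beta", "rc") and its number.
VERSION_RE = re.compile(
    r"^(\d+)\.(\d+)\.(\d+)(?:-(dev|prealpha|alpha|beta|rc)(\d+))?$"
)

print(VERSION_RE.match("0.3.0-prealpha1").groups())
# ('0', '3', '0', 'prealpha', '1')
print(VERSION_RE.match("1.5.2").groups())
# ('1', '5', '2', None, None)
```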

Supports

  • Chatting via input/output streams (e.g. via pipes).

  • A .hai folder structure in a user’s home directory to help track hai configuration and contexts.

  • Creating, listing, deleting and changing conversation contexts.

  • Behavior setting to allow for persistent chatbot behavior (e.g. the Do Anything Now (DAN) prompt).

To Do

  • A memory layer for the AI HCLI (hai):
    • Automatic context switching per NLP on received input stream.

    • Context blending to marry different contexts.

    • Automatic context compression to fit a more substantial effective memory footprint within each context window.

  • A shell mode for the AI HCLI (hai) to enable shell CLI execution per sought goal.

Bugs

N/A
