
Reason this release was yanked:

Deprecating current use and reclaiming the namespace for a different HCLI use.

Project description


HCLI hg

HCLI hg is a Python package wrapper that contains an HCLI sample application (hg); hg is an HCLI for interacting with GPT-3.5-Turbo via terminal input and output streams.


HCLI hg wraps hg (an HCLI) and is intended to be used with an HCLI Client [1] as presented via an HCLI Connector [2].

You can find out more about HCLI on hcli.io [3].

[1] https://github.com/cometaj2/huckle

[2] https://github.com/cometaj2/hcli_core

[3] http://hcli.io

Installation

HCLI hg requires a supported version of Python and pip.

You’ll need an HCLI Connector to run hg. For example, you can use HCLI Core (https://github.com/cometaj2/hcli_core), a WSGI server such as Green Unicorn (https://gunicorn.org/), and an HCLI Client like Huckle (https://github.com/cometaj2/huckle).

pip install hcli-hg
pip install hcli-core
pip install huckle
pip install gunicorn
gunicorn --workers=1 --threads=1 -b 127.0.0.1:8000 --chdir `hcli_core path` "hcli_core:connector(\"`hcli_hg path`\")"
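
As a quick sanity check, assuming the bind address used above, you can confirm the connector is serving before configuring the client by requesting its root resource:

curl http://127.0.0.1:8000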

Usage

Open a different shell window.

Set up the huckle env eval in your .bash_profile (or other bash configuration) to avoid having to execute eval every time you want to invoke HCLIs by name (e.g. hg).

Note that no CLI is actually installed by Huckle. Huckle reads the HCLI semantics exposed by the API via HCLI Connector and ends up behaving like the CLI it targets.

huckle cli install http://127.0.0.1:8000
eval $(huckle env)
hg help
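
For example, assuming your shell reads ~/.bash_profile, you could persist the eval mentioned above like this (adjust the target file to your own shell configuration):

echo 'eval $(huckle env)' >> ~/.bash_profile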

Versioning

This project makes use of semantic versioning (http://semver.org) and may make use of the “devx”, “prealphax”, “alphax”, “betax”, and “rcx” extensions where x is a number (e.g. 0.3.0-prealpha1) on GitHub.

Supports

  • Chatting by sending command line input streams (e.g. via pipes); see the sketch after this list.

  • Getting and setting a context to set up a new conversation or to save a conversation.

  • Behavior setting to allow for persistent chatbot behavior (e.g. the Do Anything Now (DAN) prompt).
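
As a sketch of the piped chat flow, assuming hg accepts the prompt on standard input once the huckle setup from the Usage section is in place (hg help lists the actual commands and options):

echo "Summarize semantic versioning in one sentence." | hg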

To Do

  • A memory layer for the GPT-3.5-Turbo HCLI (hg).
    • Automatic context switching via NLP on the received input stream.

    • Context blending to marry different contexts.

    • Automatic context compression to fit a more substantial memory footprint within a context window.

  • Additional commands to better save and restore conversations/contexts.

  • A shell mode for the GPT-3.5-Turbo HCLI (hg) to enable shell CLI execution in pursuit of a sought goal.

Bugs

N/A

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

hcli_hg-0.1.6.tar.gz (11.2 kB)

Uploaded: Source

Built Distribution

hcli_hg-0.1.6-py2.py3-none-any.whl (11.4 kB)

Uploaded: Python 2, Python 3

File details

Details for the file hcli_hg-0.1.6.tar.gz.

File metadata

  • Download URL: hcli_hg-0.1.6.tar.gz
  • Upload date:
  • Size: 11.2 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.9

File hashes

Hashes for hcli_hg-0.1.6.tar.gz
  • SHA256: 18b4f7b3b52745d19c36f68006d6323fc3e8e2464b0da6576bcc92fb8f3452c4
  • MD5: 999ea8faa8e25f774cd079baa9614ded
  • BLAKE2b-256: 7c87628d558886332aa77d9d9652bacb289e3000c1dc99f3abda6d5359d5d22a
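
For example, assuming the sdist above has been downloaded to the current directory, you could compare it against the published SHA256 digest with a standard checksum tool (e.g. sha256sum on most Linux distributions, or shasum -a 256 on macOS):

sha256sum hcli_hg-0.1.6.tar.gz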

File details

Details for the file hcli_hg-0.1.6-py2.py3-none-any.whl.

File metadata

  • Download URL: hcli_hg-0.1.6-py2.py3-none-any.whl
  • Upload date:
  • Size: 11.4 kB
  • Tags: Python 2, Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.11.9

File hashes

Hashes for hcli_hg-0.1.6-py2.py3-none-any.whl
  • SHA256: a21be6671696063334b610746762a07cfd23d46d53c44c3c2091a90754b47464
  • MD5: 54e30a0ffd14e93be7509c41cfcd047e
  • BLAKE2b-256: a8d089e1ced681f4025e56a472489162f36d78deedba831f9a234ccaf3c8695d
