
An HCLI Connector that can be used to expose a REST API with a built-in CLI, via hypertext command line interface (HCLI) semantics.


HCLI Core implements an HCLI Connector, a type of Service Connector, as a WSGI application, and gives developers a way to expose a service-hosted CLI as a REST API via HCLI semantics. Such an API exposes a “built-in” CLI that any HCLI client can interact with dynamically. Up-to-date, in-band, man-page-style API/CLI documentation is readily available to help understand how to interact with the API.
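As background, HCLI Core's `connector()` is served like any other WSGI callable. The sketch below is not HCLI Core's code; it is a minimal PEP 3333 application, shown only to illustrate the interface a WSGI server such as gunicorn expects:

```python
# Minimal PEP 3333 WSGI application (illustration only; HCLI Core's
# connector() returns a full-featured WSGI app built on Falcon).
def app(environ, start_response):
    body = b"hello from a WSGI app"
    start_response("200 OK", [
        ("Content-Type", "text/plain"),
        ("Content-Length", str(len(body))),
    ])
    return [body]

# Any WSGI server can serve such a callable, in the same way gunicorn
# serves "hcli_core:connector()" in the install instructions below.
```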

Most, if not all, programming languages have a way to issue shell commands. With the help of a generic HCLI client, such as Huckle [1], APIs that make use of HCLI semantics are readily consumable anywhere via the familiar command line (CLI) mode of operation, without the need to write a custom, dedicated CLI for each specific HCLI API.
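For example, in Python, the standard library's subprocess module is enough to drive any CLI, including an HCLI exposed by Huckle. Here `echo` stands in for an installed HCLI; substituting an HCLI name such as `jsonf` assumes it has already been set up via Huckle on your machine:

```python
import subprocess

# Invoke a command line tool and capture its output. Once Huckle has
# installed an HCLI, the argument list could just as well be
# ["jsonf", "help"] (hypothetical; requires `huckle cli install ...`
# and `eval $(huckle env)` first).
result = subprocess.run(["echo", "hello"],
                        capture_output=True, text=True, check=True)
print(result.stdout.strip())  # -> hello
```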

You can find out more about HCLI on hcli.io [2].

The HCLI Internet-Draft [3] is a work in progress by the author, and the current implementation leverages hal+json alongside a static form of ALPS (semantic profile) [4] to help enable widespread cross-media-type support.
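As a purely hypothetical illustration (the actual resource layout is defined by the Internet-Draft and the bundled samples), a hal+json response carries its navigable state under a reserved `_links` member, which is what lets a generic client discover the next available commands at runtime:

```json
{
  "_links": {
    "self": { "href": "/example/resource" },
    "next": { "href": "/example/resource?page=2" }
  },
  "name": "example"
}
```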

Help shape HCLI and its ecosystem by raising issues on GitHub!

[1] https://github.com/cometaj2/huckle

[2] http://hcli.io

[3] https://github.com/cometaj2/I-D/tree/master/hcli

[4] http://alps.io

Installation

hcli_core requires a supported version of Python and pip.

You’ll need a WSGI-compliant application server to run hcli_core. For example, you can use Green Unicorn (https://gunicorn.org/), together with an HCLI client such as Huckle (https://github.com/cometaj2/huckle). The following runs the default jsonf HCLI bundled with HCLI Core:

pip install hcli_core
pip install gunicorn
pip install huckle
gunicorn --workers=5 --threads=2 -b 127.0.0.1:8000 --chdir `hcli_core path` "hcli_core:connector()"

Usage

Open a different shell window.

Set up the huckle env eval in your .bash_profile (or other bash configuration) to avoid having to execute the eval every time you want to invoke HCLIs by name (e.g. jsonf).

Note that no CLI is actually installed by Huckle. Huckle reads the HCLI semantics exposed by the API and ends up behaving like the CLI it targets.

huckle cli install http://127.0.0.1:8000
eval $(huckle env)
jsonf help

3rd Party HCLI Installation

If you want to load a sample HCLI other than the default application, you can try one of the other sample HCLIs included with HCLI Core, for example the hg HCLI (a hypertext GPT-3.5-Turbo chatbot).

A folder path to any other 3rd party HCLI can be provided to the HCLI Connector in the same way, provided the 3rd party HCLI meets the CLI interface (cli.py) and HCLI template (template.json) requirements:

pip install hcli_core
pip install gunicorn
pip install huckle
gunicorn --workers=5 --threads=2 --chdir `hcli_core path` "hcli_core:connector(\"`hcli_core sample hg`\")"

3rd Party HCLI Usage

Open a different shell window.

Set up the huckle env eval in your .bash_profile (or other bash configuration) to avoid having to execute the eval every time you want to invoke HCLIs by name (e.g. hg).

huckle cli install http://127.0.0.1:8000
eval $(huckle env)
hg help

Versioning

This project makes use of semantic versioning (http://semver.org) and may make use of the “devx”, “prealphax”, “alphax”, “betax”, and “rcx” extensions, where x is a number (e.g. 0.3.0-prealpha1), on GitHub. Only full major.minor.patch releases will be pushed to pip from now on.

Supports

  • HTTP/HTTPS.

  • HCLI version 1.0 server semantics for hal+json

  • Web Server Gateway Interface (WSGI) through PEP 3333 and Falcon.

  • Bundled Sample HCLIs:
    • jsonf - a simple formatter for JSON.

    • hg - an HCLI for interacting with GPT-3.5-Turbo via terminal input and output streams.

    • hfm - a file upload and download manager that works with *nix terminal shell input and output streams.

    • hptt - a rudimentary HCLI Push To Talk (PTT) channel management service.

    • hub - a rudimentary HCLI service discovery hub.

    • nw - a flexible IP Address Management (IPAM) service.

    • hc - a gcode streamer for GRBL-compliant controllers and CNC interfaces (e.g. OpenBuilds BlackBox controller v1.1g and Interface CNC Touch).

  • Support for any 3rd party HCLI code that meets the CLI interface and HCLI template requirements (i.e. see the sample HCLIs).

  • Support for large input and output streams as application/octet-stream.
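Large octet-stream payloads of this kind are generally relayed in fixed-size chunks so memory use stays flat regardless of payload size. The following is a generic sketch of that pattern, not HCLI Core's internal code:

```python
import io

CHUNK = 64 * 1024  # 64 KiB chunks keep memory use flat for large streams

def stream_copy(src, dst, chunk_size=CHUNK):
    """Copy a binary stream chunk by chunk; return the bytes copied."""
    total = 0
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)
        total += len(chunk)
    return total

# Example: copy an in-memory "large" stream without buffering it whole.
src = io.BytesIO(b"x" * 200_000)
dst = io.BytesIO()
print(stream_copy(src, dst))  # -> 200000
```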

To Do

  • Automated tests for all bundled HCLI samples.

  • A memory layer for the GPT-3.5-Turbo HCLI (hg).
    • Automatic context switching per NLP on received input stream.

    • Context blending to marry different contexts.

    • Automatic context compression to fit a more substantial memory footprint within each context window.

  • A shell mode for the GPT-3.5-Turbo HCLI (hg) to enable shell CLI execution per sought goal.

  • Separate out HCLI applications from HCLI Core to help avoid application dependencies bleeding onto HCLI Core (e.g. OpenAI, GRBL, pyserial, etc.).

  • Update GRBL controller HCLI (hc) to include support for additional commands and/or echo of hexadecimal values.

  • Update hc to include job removal and insertion.

  • Update hc to function in a multi-process environment (e.g. multiple workers in gunicorn).

  • Implement GRBL emulation tests for hc.

Bugs

  • No clean handling of control over the request and response in CLI code, which can lead to exceptions and empty responses on the client side.

  • The hfm sample HCLI fails ungracefully (server error) when copying a remote file name that doesn’t exist.
