

An HCLI Connector that can be used to expose a REST API with a built-in CLI, via hypertext command line interface (HCLI) semantics.


HCLI Core implements an HCLI Connector, a type of Service Connector, as a WSGI application, and gives developers a way to expose a service-hosted CLI as a REST API via HCLI semantics. Such an API exposes a “built-in” CLI that any HCLI client can interact with dynamically. Up-to-date, in-band, man-page-style API/CLI documentation is readily available to help understand how to interact with the API.

Most, if not all, programming languages have a way to issue shell commands. With the help of a generic HCLI client, such as Huckle [1], APIs that make use of HCLI semantics are readily consumable anywhere via the familiar command-line (CLI) mode of operation, without the need to write a custom, dedicated CLI to interact with a specific HCLI API.
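
As a minimal sketch (assuming the jsonf sample HCLI has already been installed with Huckle, as shown under Usage below), any language that can shell out can drive an HCLI; in Python, for example:

import subprocess

# Drive an installed HCLI from Python by shelling out.
# "eval $(huckle env)" puts the huckle-generated aliases (e.g. jsonf) on the PATH.
result = subprocess.run(
    ["bash", "-c", 'eval "$(huckle env)" && jsonf help'],
    capture_output=True,
    text=True,
)
print(result.stdout)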

You can find out more about HCLI on hcli.io [2].

The HCLI Internet-Draft [3] is a work in progress by the author and the current implementation leverages hal+json alongside a static form of ALPS (semantic profile) [4] to help enable widespread cross media-type support.

Help shape HCLI and its ecosystem by raising issues on GitHub!

[1] https://github.com/cometaj2/huckle

[2] http://hcli.io

[3] https://github.com/cometaj2/I-D/tree/master/hcli

[4] http://alps.io

Installation

hcli_core requires a supported version of Python and pip.

You’ll need a WSGI-compliant application server to run hcli_core, for example Green Unicorn (https://gunicorn.org/), and an HCLI client such as Huckle (https://github.com/cometaj2/huckle). The following runs the default jsonf HCLI bundled with HCLI Core.

pip install hcli_core
pip install gunicorn
pip install huckle
gunicorn --workers=5 --threads=2 -b 127.0.0.1:8000 --chdir `hcli_core path` "hcli_core:connector()"
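
Gunicorn is only one option; since the connector is a plain WSGI application, any WSGI server can host it. A minimal sketch using Python's built-in wsgiref server, assuming it is run from the directory reported by the hcli_core path command (the same reason the gunicorn example above uses --chdir):

from wsgiref.simple_server import make_server

from hcli_core import connector

# The same WSGI application gunicorn loads as "hcli_core:connector()".
# Run this from the directory reported by `hcli_core path`.
application = connector()

# Single-threaded development server; use gunicorn (above) for anything real.
with make_server("127.0.0.1", 8000, application) as server:
    server.serve_forever()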

Usage

Open a different shell window.

Set up the huckle env eval in your .bash_profile (or other bash configuration) to avoid having to execute eval every time you want to invoke HCLIs by name (e.g. jsonf).

Note that no CLI is actually installed by Huckle. Huckle reads the HCLI semantics exposed by the API and ends up behaving like the CLI it targets.

huckle cli install http://127.0.0.1:8000
eval $(huckle env)
jsonf help

3rd Party HCLI Installation

If you want to load a sample HCLI other than the default, you can try one of the other sample HCLIs included with HCLI Core, for example the hg HCLI (a hypertext GPT-3.5-Turbo chatbot).

A folder path to any other 3rd-party HCLI can be passed to the HCLI Connector in the same way, as long as the 3rd-party HCLI meets the CLI interface (cli.py) and HCLI template (template.json) requirements (see the illustrative sketch after the commands below):

pip install hcli_core
pip install gunicorn
pip install huckle
gunicorn --workers=5 --threads=2 --chdir `hcli_core path` "hcli_core:connector(\"`hcli_core sample hg`\")"
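
As a rough illustration of the shape such a folder takes, a hypothetical cli.py might look like the following. The class and method names here are assumptions for illustration only; the bundled samples (under the hcli_core path directory), together with their template.json files, are the authoritative reference:

# cli.py - hypothetical sketch only; mirror a bundled sample (e.g. jsonf or hg)
# for the actual interface and the accompanying template.json.
class CLI:

    # commands: the resolved HCLI command line; inputstream: the request body stream, if any.
    def __init__(self, commands, inputstream):
        self.commands = commands
        self.inputstream = inputstream

    # Returns the output to stream back to the HCLI client (None for no output).
    def execute(self):
        if self.commands and self.commands[-1] == "hello":
            return "hello from a 3rd party HCLI\n"
        return None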

3rd Party HCLI Usage

Open a different shell window.

Set up the huckle env eval in your .bash_profile (or other bash configuration) to avoid having to execute eval every time you want to invoke HCLIs by name (e.g. hg).

huckle cli install http://127.0.0.1:8000
eval $(huckle env)
hg help

Versioning

This project makes use of semantic versioning (http://semver.org) and may make use of the “devx”, “prealphax”, “alphax”, “betax”, and “rcx” extensions where x is a number (e.g. 0.3.0-prealpha1) on GitHub. Only full major.minor.patch releases will be pushed to pip from now on.

Supports

  • HTTP/HTTPS.

  • HCLI version 1.0 server semantics for hal+json

  • Web Server Gateway Interface (WSGI) through PEP 3333 and Falcon.

  • Bundled Sample HCLIs:
    • jsonf - a simple formatter for JSON.

    • hg - an HCLI for interacting with GPT-3.5-Turbo via terminal input and output streams.

    • hfm - a file upload and download manager that works with *nix terminal shell input and output streams.

    • hptt - a rudimentary HCLI Push To Talk (PTT) channel management service.

    • hub - a rudimentary HCLI service discovery hub.

    • nw - a flexible IP Address Management (IPAM) service.

    • hc - a gcode streamer for GRBL-compliant controllers and a CNC interface (e.g. OpenBuilds BlackBox controller v1.1g and Interface CNC Touch).

  • Support for any 3rd-party HCLI code that meets the CLI interface and HCLI template requirements (see the sample HCLIs).

  • Support for large input and output streams as application/octet-stream.

To Do

  • Automated tests for all bundled HCLI samples.

  • A memory layer for the GPT-3.5-Turbo HCLI (hg).
    • Automatic context switching per NLP on received input stream.

    • Context blending to marry different contexts.

    • Automatic context compression to yield a more substantial memory footprint per context window.

  • A shell mode for the GPT-3.5-Turbo HCLI (hg) to enable shell CLI execution per sought goal.

  • Separate out HCLI applications from HCLI Core to help avoid application dependencies bleeding onto HCLI Core (e.g. OpenAI, GRBL, pyserial, etc.).

  • Update GRBL controller HCLI (hc) to include support for additional commands and/or echo of hexadecimal values.

  • Update hc to include job removal and insertion.

  • Update hc to function in a multi-process environment (e.g. multiple workers in gunicorn).

  • Implement GRBL emulation tests for hc.

Bugs

  • No good handling of request and response control in CLI code, which can lead to exceptions and an empty response client side.

  • The hfm sample HCLI fails disgracefully when copying a remote file name that doesn’t exist (server error).
