An HCLI connector that can be used to expose any CLI expressed through hypertext command line interface (HCLI) semantics.

Project description

An HCLI Connector that can be used to expose a REST API with a built-in CLI, via hypertext command line interface (HCLI) semantics.


HCLI Core implements an HCLI Connector, a type of Service Connector, as a WSGI application, and gives developers a way to expose a service-hosted CLI as a REST API via HCLI semantics. Such an API exposes a “built-in” CLI that any HCLI client can interact with dynamically. Up-to-date, in-band, man-page-style API/CLI documentation is readily available to help users understand how to interact with the API.
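Concretely, the connector is an ordinary WSGI callable. The sketch below serves it with Python’s built-in wsgiref server for quick local experimentation; it assumes, as the “hcli_core:connector()” gunicorn target in the Installation section suggests, that hcli_core.connector() returns a standard WSGI application, and it is not meant as a production setup.

from wsgiref.simple_server import make_server

import hcli_core

# Assumption: hcli_core.connector() returns a plain WSGI callable, as the
# "hcli_core:connector()" gunicorn target in the Installation section
# suggests. For anything beyond local experimentation, use gunicorn or
# another WSGI server. If resources fail to load, run this from the
# directory reported by `hcli_core path`, mirroring gunicorn's --chdir.
app = hcli_core.connector()

with make_server("127.0.0.1", 8000, app) as httpd:
    print("Serving the default sample HCLI on http://127.0.0.1:8000")
    httpd.serve_forever()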

Most, if not all, programming languages have a way to issue shell commands. With the help of a generic HCLI client such as Huckle [1], APIs that make use of HCLI semantics are readily consumable anywhere via the familiar command-line (CLI) mode of operation, without the need to write a custom, dedicated CLI to interact with a specific HCLI API.
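As a small illustration of that point, once an HCLI has been installed with Huckle, any program that can shell out can drive it like any other CLI. The sketch below assumes the jsonf sample HCLI from the walkthrough further down has been installed with Huckle and is available on PATH under its own name.

import subprocess

# Assumption: the jsonf sample HCLI from the walkthrough below has been
# installed with huckle and is reachable on PATH by name. Any other HCLI
# command line is issued the same way.
result = subprocess.run(
    ["jsonf", "help"],
    capture_output=True,
    text=True,
    check=True,
)
print(result.stdout)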

You can find out more about HCLI on hcli.io [2].

The HCLI Internet-Draft [3] is a work in progress by the author, and the current implementation leverages hal+json alongside a static form of ALPS (semantic profile) [4] to help enable widespread cross-media-type support.

Help shape HCLI and its ecosystem by raising issues on GitHub!

[1] https://github.com/cometaj2/huckle

[2] http://hcli.io

[3] https://github.com/cometaj2/I-D/tree/master/hcli

[4] http://alps.io

Installation

hcli_core requires a supported version of Python and pip.

You’ll need a WSGI-compliant application server to run hcli_core. For example, you can use Green Unicorn (https://gunicorn.org/):

pip install gunicorn

Install hcli_core via pip. You can launch gunicorn from anywhere by having it change into the directory returned by “hcli_core path” (as in the commands below). You can also look at the hcli_core help file.

You can curl your new service to understand what is being exposed. The HCLI root URL to use with an HCLI client is given by the cli link relation.
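If you would rather inspect the root document programmatically, a small Python sketch along these lines can fetch it and pull out the cli link relation. The _links/cli layout assumed here follows common hal+json conventions; check the actual curl output for the exact shape your instance serves.

import json
from urllib.request import urlopen

# Assumption: the root document is hal+json with a "cli" entry under
# "_links", per HAL conventions; verify against the actual response
# (curl http://127.0.0.1:8000).
with urlopen("http://127.0.0.1:8000") as response:
    root = json.load(response)

cli = root.get("_links", {}).get("cli")
print(cli)  # the href in here is the HCLI root URL to hand to an HCLI client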

Install an HCLI client, for example Huckle (https://github.com/cometaj2/huckle), and access the sample CLI (e.g. jsonf or hg) exposed by HCLI Core. You may need to restart your terminal to be able to use the sample CLI by name directly (e.g. jsonf or hg); otherwise, you can try sourcing ~/.bash_profile or ~/.bashrc.

Note that no CLI is actually installed by Huckle. Huckle reads the HCLI semantics dynamically and ends up behaving like the CLI it targets:

pip install hcli_core

hcli_core help

gunicorn --workers=5 --threads=2 -b 127.0.0.1:8000 --chdir `hcli_core path` "hcli_core:connector()"

curl http://127.0.0.1:8000

pip install huckle

huckle help

huckle cli install http://127.0.0.1:8000

jsonf help

If you want to load a sample HCLI other than the default sample application, you can try one of the other bundled samples, for example the hg HCLI (a hypertext GPT-3.5 chatbot).

A folder path to any other third-party HCLI can be provided in the same way, as long as it meets the CLI interface (cli.py) and HCLI template (template.json) requirements (an illustrative skeleton follows the commands below):

gunicorn --workers=5 --threads=2 --chdir `hcli_core path` "hcli_core:connector(\"`hcli_core sample hg`\")"

huckle cli install http://127.0.0.1:8000

hg help
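Purely as an illustration of the general shape of such a third-party cli.py, the skeleton below is a sketch, not the authoritative interface: the class name, constructor signature, and execute() contract are assumptions, and a bundled sample (e.g. the one under “hcli_core sample hg”) together with its template.json should be treated as the real reference.

import io
import json

# Illustrative skeleton only: the class name, constructor signature, and
# execute() contract are assumptions, not the authoritative cli.py
# interface. Use a bundled sample (e.g. `hcli_core sample hg`) as the
# actual reference for both cli.py and template.json.
class CLI:

    def __init__(self, commands, inputstream=None):
        self.commands = commands        # the resolved HCLI command line
        self.inputstream = inputstream  # request body stream, if any

    def execute(self):
        # Return an output stream for the client (or None for no output).
        if self.inputstream is not None:
            return io.BytesIO(self.inputstream.read())  # placeholder: echo the input
        return io.BytesIO(json.dumps({"commands": self.commands}).encode())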

Versioning

This project makes use of semantic versioning (http://semver.org) and may make use of the “devx”, “prealphax”, “alphax”, “betax”, and “rcx” extensions, where x is a number (e.g. 0.3.0-prealpha1), on GitHub. Only full major.minor.patch releases will be pushed to pip from now on.

Supports

  • HTTP/HTTPS.

  • HCLI version 1.0 server semantics for:

    • hal+json

  • Streaming (application/octet-stream).

  • The Web Server Gateway Interface (WSGI) through PEP 3333 and Falcon; HCLI Core is deployable on any WSGI-compliant web server (e.g. gunicorn).

  • Exposing, via HCLI template, any HCLI as a usable client-side shell CLI.

  • Bundled Sample HCLIs:

    • jsonf - a simple formatter for JSON.

    • hg - an HCLI for interacting with GPT-3.5-Turbo via terminal input and output streams.

    • hfm - a file upload and download manager that works with *nix terminal shell input and output streams.

    • hptt - a rudimentary HCLI Push To Talk (PTT) channel management service.

    • hub - a rudimentary HCLI service discovery hub.

    • nw - a flexible IP Address Management (IPAM) service.

  • Use of any third-party HCLI code that meets the CLI interface and HCLI template requirements (see the sample HCLIs).

  • Large input and output streams as application/octet-stream.

To Do

  • Automated tests for all bundled HCLI sample CLIs

  • A memory layer for the GPT-3.5-Turbo HCLI sample CLI

    • Automatic context switching per NLP on received input stream.

    • Context blending to marry different contexts.

    • Automatic context compression to fit a more substantial memory within each context window.

  • Separate HCLI applications from HCLI Core to help avoid application dependencies (e.g. OpenAI) bleeding into HCLI Core.

Bugs

  • The CLI code has no good way to control the request and response handling, which can lead to exceptions and an empty response on the client side.

  • The hfm sample HCLI fails disgracefully (with a server error) when asked to copy a remote file that doesn’t exist.


