
Command line interface for interacting with the Humio API using the humioapi library


Do things with the Humio API from the command line

This project requires Python>=3.6.1

This is a companion CLI to the unofficial humioapi library. It lets you use most of its features easily from the command line. If you're looking for the official CLI, it can be found here: humiolib.

Installation

python3 -m pip install humiocli
# or even better
pipx install humiocli

Main features

  • Streaming searches with several output formats
  • Subsearches (pipe output from one search into a new search)
  • Defaults configured through ENV variables (precedence: shell options > shell environment > config-file)
  • Splunk-like chainable relative time modifiers
  • Switch easily from browser to CLI by passing the search URL to urlsearch
  • Ingest data to Humio (but you should use Filebeat for serious things)
  • List repositories
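The option-precedence rule above can be sketched in plain shell. This is an illustration only, with hypothetical variable names; hc resolves its real options internally:

```shell
#!/bin/sh
# Illustration of "shell options > shell environment > config-file":
# the first non-empty value, left to right, wins.
config_token="token-from-config-file"   # lowest priority (config-file)
HUMIO_TOKEN="token-from-environment"    # middle priority (environment)
cli_token=""                            # highest priority (e.g. --token), unset here

token="${cli_token:-${HUMIO_TOKEN:-$config_token}}"
echo "$token"   # -> token-from-environment
```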

First time setup

Start the guided setup wizard to configure your environment

hc wizard

This will help you create an environment file with a default Humio URL and token, so you don't have to explicitly provide them as options later.

All options may also be provided as environment variables in the format HUMIO_<OPTION>=<VALUE>. If a .env file exists at ~/.config/humio/.env, it is automatically sourced on execution without overwriting the existing environment.
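A minimal ~/.config/humio/.env might look like the sketch below. The values are placeholders, and the exact option names follow the HUMIO_<OPTION> pattern; `hc wizard` writes the correct ones for you:

```shell
# ~/.config/humio/.env -- sourced automatically if present
HUMIO_BASE_URL=https://humio.example.com
HUMIO_TOKEN=your-personal-api-token
```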

Examples

Execute a search in all repos starting with reponame and output @rawstrings

hc search --repo 'reponame*' '#type=accesslog statuscode>=400'

Execute a search using results with fields from another search ("subsearch")

Step 1: Set the output format to or-fields

hc search --repo=auth 'username | select([session_id, app_name])' --outformat=or-fields | jq '.'

This produces a JSON structure with prepared search strings built from all field-value combinations. The special field SUBSEARCH combines the search strings for all fields.

Example output:

{
  "session_id": "\"session_id\"=\"5CF4A111\" or \"session_id\"=\"14C8BCEA\"",
  "app_name": "\"app_name\"=\"frontend\"",
  "SUBSEARCH": "(\"session_id\"=\"5CF4A111\" or \"session_id\"=\"14C8BCEA\") and (\"app_name\"=\"frontend\")"
}
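If you only need one of the prepared strings, you can extract it before piping onward. A small sketch using Python's stdlib on a sample of the output above (running `jq -r '.SUBSEARCH'` on the real output does the same):

```shell
# Extract the combined SUBSEARCH string from (sample) or-fields output.
subsearch=$(printf '%s' '{"SUBSEARCH": "(\"app_name\"=\"frontend\")"}' \
  | python3 -c 'import json, sys; print(json.load(sys.stdin)["SUBSEARCH"])')
echo "$subsearch"   # -> ("app_name"="frontend")
```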

Step 2: Pipe this result to a new search and reference the desired fields:

hc search --repo=auth 'username | select([session_id, app_name])' --outformat=or-fields | hc search --repo=frontend '#type=accesslog {{session_id}}'

Output aggregated results as ND-JSON events

Simple example:

Humio's bucketing currently creates partial buckets at both ends, depending on the search period. You may want to provide a rounded start and stop to ensure only whole buckets are returned.

hc search --repo 'sandbox' --start=-60m@m --stop=@m "#type=accesslog | timechart(span=1m, series=statuscode)"

Or with a longer multiline search

hc search --repo 'sandbox' --start -60m@m --stop=@m  "$(cat << EOF
#type=accesslog
| case {
    statuscode<400 | status_ok := 1 ;
    statuscode=4*  | status_client_error := "client_error" ;
    statuscode=5*  | status_server_error := "server_error" ;
    * | status_broken := 1
}
| bucket(limit=50, function=[count(as="count"), count(field=status_ok, as="ok"), count(field=status_client_error, as="client_error"), count(field=status_server_error, as="server_error")])
| error_percentage := (((client_error + server_error) / count) * 100)
EOF
)"

Upload a parser file to the destination repository, overwriting any existing parser

hc makeparser --repo='sandbox' customjson

Ingest a single-line log file with an ingest-token associated with a parser

hc ingest customjson

Ingest a multi-line file with a user-provided record separator (markdown headers) and parser

hc ingest README.md --separator '^#' --fields '{"#repo":"sandbox", "#type":"markdown", "@host":"localhost"}'
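Conceptually, the '^#' separator starts a new record at every line matching the regex. A stand-in sketch with plain awk (not part of hc) shows the grouping:

```shell
# Two markdown headers -> two records when splitting on '^#'.
printf '# Title\nintro text\n# Section\nbody text\n' > /tmp/hc-demo.md
records=$(awk '/^#/ { n++ } END { print n }' /tmp/hc-demo.md)
echo "$records"   # -> 2
```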

Development

To install the cli and api packages in editable mode:

git clone https://github.com/gwtwod/humiocli.git
cd humiocli
poetry install

Create self-contained executables for easy distribution

This uses Shiv to create a zipapp: a single self-contained file with all Python dependencies and a shebang.

On first run, this will unpack the required modules to ~/.shiv/hc/, which causes a short delay in startup; subsequent runs should be fast. The location can be controlled with the environment variable SHIV_ROOT. You should probably clean this directory once in a while, since a new one is created every time the distributable changes.

pip install shiv
shiv -c hc -o hc humiocli -p "/usr/bin/env python3"
