
Command line interface for interacting with the Humio API using the humioapi library

Project description

Do things with the Humio API from the command line

This project requires Python>=3.6.1

This is a companion CLI to the unofficial humioapi library. It lets you use most of its features easily from the command line. If you're looking for the official CLI, it can be found here: humiolib.

Installation

python3 -m pip install humiocli
# or even better
pipx install humiocli

Main features

  • Streaming searches with several output formats
  • Subsearches (pipe output from one search into a new search)
  • Defaults configured through ENV variables (precedence: shell options > shell environment > config-file)
  • Splunk-like chainable relative time modifiers
  • Switch easily from browser to CLI by passing the search URL to urlsearch (see the sketch after this list)
  • Ingest data to Humio (but you should use Filebeat for serious things)
  • List repositories
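
For example, a search URL copied from the browser can be handed to urlsearch to recreate the search locally. The URL below is purely illustrative; use the address bar contents from your own Humio instance:

# Recreate a browser search on the command line (illustrative URL)
hc urlsearch 'https://humio.example.com/myrepo/search?query=%23type%3Daccesslog&start=24h'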

First time setup

Start the guided setup wizard to configure your environment

hc wizard

This will help you create an environment file with a default Humio URL and token, so you don't have to explicitly provide them as options later.

All options may be provided by environment variables in the format HUMIO_<OPTION>=<VALUE>. If a .env file exists at ~/.config/humio/.env, it will be sourced automatically on execution without overwriting the existing environment.
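
As an illustration, a minimal ~/.config/humio/.env might look like the following. The option names follow the HUMIO_<OPTION> pattern and the wizard writes the actual keys for you; the names and values below are placeholders, not real settings:

# Illustrative ~/.config/humio/.env -- placeholder names and values
HUMIO_BASE_URL=https://humio.example.com
HUMIO_TOKEN=your-personal-api-token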

Examples

Execute a search in all repos starting with reponame and output @rawstrings

hc search --repo 'reponame*' '#type=accesslog statuscode>=400'
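
The relative time modifiers work here too. For instance, the same search restricted to the last 24 hours (the modifier value is just an example):

# Same search, limited to the last 24 hours with a relative start time
hc search --repo 'reponame*' --start=-24h '#type=accesslog statuscode>=400'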

Execute a search using results with fields from another search ("subsearch")

Step 1: Set the output format to or-fields

hc search --repo=auth 'username | select([session_id, app_name])' --outformat=or-fields | jq '.'

This produces a JSON structure with prepared search strings built from all field-value combinations. The special field SUBSEARCH combines the search strings for all fields.

Example output:

{
  "session_id": "\"session_id\"=\"5CF4A111\" or \"session_id\"=\"14C8BCEA\"",
  "app_name": "\"app_name\"=\"frontend\"",
  "SUBSEARCH": "(\"session_id\"=\"5CF4A111\" or \"session_id\"=\"14C8BCEA\") and (\"app_name\"=\"frontend\")"
}

Step 2: Pipe this result to a new search and reference the desired fields:

hc search --repo=auth 'username | select([session_id, app_name])' --outformat=or-fields | hc search --repo=frontend '#type=accesslog {{session_id}}'
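
If you only need one of the prepared expressions rather than the whole structure, jq can pull it out of the step 1 output (shown here for the session_id field):

# Extract a single prepared expression from the or-fields output
hc search --repo=auth 'username | select([session_id, app_name])' --outformat=or-fields | jq -r '.session_id'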

Output aggregated results as ND-JSON events

Simple example:

Humio's bucketing currently creates partial buckets at both ends, depending on the search period. You may want to provide a rounded start and stop to ensure only whole buckets are returned.

hc search --repo 'sandbox*' --start=-60m@m --stop=@m "#type=accesslog | timechart(span=1m, series=statuscode)"

Or with a longer multi-line search:

hc search --repo 'sandbox*' --start -60m@m --stop=@m  "$(cat << EOF
#type=accesslog
| case {
    statuscode<=400 | status_ok := 1 ;
    statuscode=4*  | status_client_error := "client_error" ;
    statuscode=5*  | status_server_error := "server_error" ;
    * | status_broken := 1
}
| bucket(limit=50, function=[count(as="count"), count(field=status_ok, as="ok"), count(field=status_client_error, as="client_error"), count(field=status_server_error, as="server_error")])
| error_percentage := (((client_error + server_error) / count) * 100)
EOF
)"

Upload a parser file to the destination repository, overwriting any existing parser

hc makeparser --repo='sandbox*' customjson

Ingest a single-line log file with an ingest-token associated with a parser

hc ingest customjson

Ingest a multi-line file with a user-provided record separator (markdown headers) and parser

hc ingest README.md --separator '^#' --fields '{"#repo":"sandbox", "#type":"markdown", "@host":"localhost"}'

Development

To install the CLI and API packages in editable mode:

git clone https://github.com/gwtwod/humiocli.git
cd humiocli
poetry install
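
Once installed, the CLI can be run inside the Poetry-managed virtualenv in the usual way, e.g.:

poetry run hc --help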

Create self-contained executables for easy distribution

This uses Shiv to create a zipapp: a single self-contained file with all Python dependencies and a shebang.

On first run, this will unpack the required modules to ~/.shiv/hc/, which causes a short delay in startup; subsequent runs should be fast. The location can be controlled with the environment variable SHIV_ROOT. You should probably clean this directory once in a while, since a new unpack directory is created every time the distributable changes.

pip install shiv
shiv -c hc -o hc humiocli -p "/usr/bin/env python3"
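
For example, to unpack to a custom location instead of ~/.shiv (the path here is just an illustration):

# SHIV_ROOT is honored by shiv-built zipapps; /tmp/shiv-cache is an example path
SHIV_ROOT=/tmp/shiv-cache ./hc --help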



Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

humiocli-0.8.0.tar.gz (17.3 kB)

Uploaded Source

Built Distribution

humiocli-0.8.0-py3-none-any.whl (16.2 kB)

Uploaded Python 3

File details

Details for the file humiocli-0.8.0.tar.gz.

File metadata

  • Download URL: humiocli-0.8.0.tar.gz
  • Upload date:
  • Size: 17.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.4 CPython/3.6.8 Linux/3.10.0-1160.11.1.el7.x86_64

File hashes

Hashes for humiocli-0.8.0.tar.gz

  • SHA256: 6e6af4a4ed46de056c694806833df38baa509ae77d12326f58d7987421afab5a
  • MD5: b8a273e8dca8f2ab0acf673aa29c8f95
  • BLAKE2b-256: 483da964889572d243f92379611e64313bc4e126c00f7b586f59178ab82e652a


File details

Details for the file humiocli-0.8.0-py3-none-any.whl.

File metadata

  • Download URL: humiocli-0.8.0-py3-none-any.whl
  • Upload date:
  • Size: 16.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: poetry/1.1.4 CPython/3.6.8 Linux/3.10.0-1160.11.1.el7.x86_64

File hashes

Hashes for humiocli-0.8.0-py3-none-any.whl

  • SHA256: 8c107af557d4e10cf549b7d3756005280371b89b4908cd9b737cdd796b0ca681
  • MD5: c753a65a8fab2f44d5a6b8a9f88c8f1b
  • BLAKE2b-256: a2493a5338bd3b0a2ec79ddd9ecea96bcda097b297c6a150cbfb48a85e7b5806

