Kaskade
Kaskade is a text user interface (TUI) for Apache Kafka, built with Textual. It includes features like:
- Admin:
  - List topics, partitions, groups and group members.
  - Topic information like lag, replicas and record count.
  - Create, edit and delete topics.
  - Filter topics by name.
- Consumer:
  - JSON, string, integer, long, float, boolean and double deserialization.
  - Filter by key, value, header and/or partition.
  - Schema Registry support with Avro.
Installation
Install with pipx:
pipx install kaskade
pipx will install the kaskade and kskd aliases.
Upgrade with pipx:
pipx upgrade kaskade
See how to install pipx for your OS at: pipx Installation.
Running kaskade
Help:
kaskade --help
kaskade admin --help
kaskade consumer --help
Admin view:
kaskade admin -b localhost:9092
Consumer view:
kaskade consumer -b localhost:9092 -t my-topic
Running with docker:
docker run --rm -it --network my-network sauljabin/kaskade:latest admin -b my-kafka:9092
docker run --rm -it --network my-network sauljabin/kaskade:latest consumer -b my-kafka:9092 -t my-topic
Configuration examples
Multiple bootstrap servers:
kaskade admin -b broker1:9092,broker2:9092
Consume and deserialize:
kaskade consumer -b localhost:9092 -t my-topic -k json -v json
Consuming from the beginning:
kaskade consumer -b localhost:9092 -t my-topic -x auto.offset.reset=earliest
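The `-x` flag forwards librdkafka properties as `key=value` pairs and can be repeated. A minimal sketch composing several standard librdkafka consumer properties into one invocation (the topic and group names are illustrative, and whether kaskade overrides any given property is not documented here):

```shell
# Each -x pair is passed through to the underlying librdkafka consumer.
opts="-x auto.offset.reset=earliest -x group.id=kaskade-demo"
cmd="kaskade consumer -b localhost:9092 -t my-topic ${opts}"
echo "${cmd}"
```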
Simple Schema Registry connection and Avro deserialization:
kaskade consumer -b localhost:9092 -s url=http://localhost:8081 -t my-topic -k avro -v avro
More Schema Registry configurations at: SchemaRegistryClient.
Note: librdkafka clients do not currently support Avro unions in (de)serialization; more at: Limitations for librdkafka clients.
SSL encryption example:
kaskade admin -b ${BOOTSTRAP_SERVERS} -x security.protocol=SSL
For more information about SSL encryption and SSL authentication, go to the librdkafka official page: Configure librdkafka client.
Confluent Cloud admin:
kaskade admin -b ${BOOTSTRAP_SERVERS} \
-x security.protocol=SASL_SSL \
-x sasl.mechanism=PLAIN \
-x sasl.username=${CLUSTER_API_KEY} \
-x sasl.password=${CLUSTER_API_SECRET}
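The command above expects the cluster credentials in environment variables, which the shell expands before kaskade runs. A sketch of setting them up first (the values shown are placeholders, not real endpoints or keys):

```shell
# Placeholder values; replace with your own Confluent Cloud credentials.
export BOOTSTRAP_SERVERS="my-cluster.confluent.cloud:9092"
export CLUSTER_API_KEY="my-api-key"
export CLUSTER_API_SECRET="my-api-secret"

# The kaskade command then receives the already-expanded values.
echo "connecting to ${BOOTSTRAP_SERVERS}"
```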
Confluent Cloud consumer:
kaskade consumer -b ${BOOTSTRAP_SERVERS} \
-x security.protocol=SASL_SSL \
-x sasl.mechanism=PLAIN \
-x sasl.username=${CLUSTER_API_KEY} \
-x sasl.password=${CLUSTER_API_SECRET} \
-s url=${SCHEMA_REGISTRY_URL} \
-s basic.auth.user.info=${SR_API_KEY}:${SR_API_SECRET} \
-t my-topic \
-k string \
-v avro
More about Confluent Cloud configuration at: Kafka Client Quick Start for Confluent Cloud.
Development
For development instructions see DEVELOPMENT.md.