Streams Explorer

Explore Data Pipelines in Apache Kafka.

Features

  • Visualization of streaming applications, topics, and connectors
  • Monitoring of all or individual pipelines across multiple namespaces
  • Inspection of Avro schemas from the Schema Registry
  • Integration with streams-bootstrap and faust-bootstrap for deploying Kafka Streams applications
  • Real-time metrics from Prometheus (consumer lag & read rate, replicas, topic size, messages in & out per second, connector tasks)
  • Links to external services for logging and analysis, such as Kibana, Grafana, AKHQ, and Elasticsearch
  • Customizable through Python plugins

Overview

Visit our introduction blog post for a complete overview and a demo of Streams Explorer.

Installation

Docker Compose

  1. Forward the ports to Prometheus. (Kafka Connect, Schema Registry, and other integrations are optional)
  2. Start the container
docker-compose up

Once the container has started, visit http://localhost:3000

Deploying to Kubernetes cluster

  1. Add the Helm chart repository
helm repo add streams-explorer https://raw.githubusercontent.com/bakdata/streams-explorer/master/helm-chart/
  2. Install the chart
helm upgrade --install streams-explorer streams-explorer/streams-explorer --values helm-chart/values.yaml

Standalone

Backend

  1. Install dependencies
pip install -r requirements.txt
  2. Forward the ports to Prometheus. (Kafka Connect, Schema Registry, and other integrations are optional)
  3. Configure the backend in settings.yaml.
  4. Start the backend server
uvicorn main:app

Frontend

  1. Install dependencies
npm ci
  2. Start the frontend server
npm start

Visit http://localhost:3000

Configuration

Depending on your type of installation, set the configuration for the backend server in settings.yaml.

All configuration options can be written as environment variables using underscore notation and the prefix SE, e.g. SE_K8S__deployment__cluster=false.
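The underscore notation maps double underscores to nesting levels in the YAML configuration. As an illustrative sketch (the parsing helper below is an assumption for demonstration, not Streams Explorer's actual implementation), `SE_K8S__deployment__cluster=false` could be translated into nested settings like this:

```python
# Hypothetical sketch of how SE_-prefixed environment variables with
# '__' nesting map onto nested configuration keys. The helper itself is
# illustrative; only the naming convention comes from the docs above.
PREFIX = "SE_"

def env_to_settings(environ: dict) -> dict:
    """Convert SE_-prefixed variables with '__' nesting into a nested dict."""
    settings: dict = {}
    for name, raw in environ.items():
        if not name.startswith(PREFIX):
            continue
        keys = name[len(PREFIX):].split("__")
        node = settings
        for key in keys[:-1]:
            node = node.setdefault(key.lower(), {})
        # Interpret common boolean spellings; leave everything else as a string.
        value = {"true": True, "false": False}.get(raw.lower(), raw)
        node[keys[-1].lower()] = value
    return settings

print(env_to_settings({"SE_K8S__deployment__cluster": "false"}))
# → {'k8s': {'deployment': {'cluster': False}}}
```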

The following configuration options are available:

General

  • graph_update_every Update the graph every X seconds (integer, required, default: 300)
  • graph_layout_arguments Arguments passed to graphviz layout (string, required, default: -Grankdir=LR -Gnodesep=0.8 -Gpad=10)

Kafka Connect

  • kafkaconnect.url URL of Kafka Connect server (string, default: None)
  • kafkaconnect.displayed_information Configuration options of Kafka connectors displayed in the frontend (list of dict)

Kubernetes

  • k8s.deployment.cluster Whether streams-explorer is deployed to Kubernetes cluster (bool, required, default: false)
  • k8s.deployment.context Name of cluster (string, optional if running in cluster, default: kubernetes-cluster)
  • k8s.deployment.namespaces Kubernetes namespaces (list of string, required, default: ['kubernetes-namespace'])
  • k8s.containers.ignore Name of containers that should be ignored/hidden (list of string, default: ['prometheus-jmx-exporter'])
  • k8s.displayed_information Details of pod that should be displayed (list of dict, default: [{'name': 'Labels', 'key': 'metadata.labels'}])
  • k8s.labels Labels used to set attributes of nodes (list of string, required, default: ['pipeline'])
  • k8s.pipeline.label Attribute of nodes the pipeline name should be extracted from (string, required, default: pipeline)
  • k8s.consumer_group_annotation Annotation the consumer group name should be extracted from (string, required, default: consumerGroup)
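The `key` entries in `k8s.displayed_information` are dotted paths into the pod object, e.g. the default `metadata.labels`. A minimal sketch of how such a path could be resolved against a pod represented as a nested dict (the helper and the sample pod are assumptions for illustration, not project code):

```python
# Resolve a dotted "key" path such as 'metadata.labels' against a
# nested dict. Sample pod data is hypothetical.
from functools import reduce

def resolve(obj: dict, dotted_key: str):
    """Walk a nested dict following a dotted path; return None if absent."""
    return reduce(lambda node, key: (node or {}).get(key), dotted_key.split("."), obj)

pod = {"metadata": {"labels": {"pipeline": "atm-fraud-detection"}}}
print(resolve(pod, "metadata.labels"))
# → {'pipeline': 'atm-fraud-detection'}
```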

Schema Registry

  • schemaregistry.url URL of Schema Registry (string, default: None)

Prometheus

  • prometheus.url URL of Prometheus (string, required, default: http://localhost:9090)
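The backend queries the configured `prometheus.url` for metrics. As a rough sketch of what such a request looks like, the snippet below builds an instant-query URL against Prometheus' HTTP API; the metric name `kafka_consumergroup_lag` and the consumer group are example assumptions, not values Streams Explorer necessarily uses:

```python
# Build an instant-query URL for Prometheus' /api/v1/query endpoint.
# Metric and group names are hypothetical examples.
from urllib.parse import urlencode

PROMETHEUS_URL = "http://localhost:9090"  # prometheus.url default

def instant_query_url(promql: str) -> str:
    """Return the URL for a Prometheus instant query."""
    return f"{PROMETHEUS_URL}/api/v1/query?{urlencode({'query': promql})}"

url = instant_query_url('sum(kafka_consumergroup_lag{consumergroup="demo-group"})')
print(url)
```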

Additional exporters are required to collect Kafka metrics for Prometheus.

AKHQ

  • akhq.url URL of AKHQ (string, default: http://localhost:8080)
  • akhq.cluster Name of cluster (string, default: kubernetes-cluster)

Grafana

  • grafana.url URL of Grafana (string, default: http://localhost:3000)
  • grafana.dashboards.topics Path to the topics dashboard (string); sample dashboards for topics and consumer groups are included in the ./grafana subfolder
  • grafana.dashboards.consumergroups Path to consumer groups dashboard (string)

Kibana

  • kibanalogs.url URL of Kibana logs (string, default: http://localhost:5601)

Elasticsearch

For the Kafka Connect Elasticsearch connector:

  • esindex.url URL of Elasticsearch index (string, default: http://localhost:5601/app/kibana#/dev_tools/console)

Plugins

  • plugins.path Path to folder containing plugins relative to backend (string, required, default: ./plugins)
  • plugins.extractors.default Whether to load default extractors (bool, required, default: true)

Demo pipeline

ATM Fraud detection with streams-bootstrap

Plugin customization

You can create your own linker, metric provider, and extractors in Python by implementing the LinkingService, MetricProvider, or Extractor classes, customizing Streams Explorer to your specific setup and services. As an example we provide the DefaultLinker as LinkingService. The default MetricProvider supports Prometheus. A set of default Extractor plugins is included as well.
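A minimal sketch of what a custom extractor plugin might look like. The `Extractor` base class below is a stand-in for the interface shipped with streams_explorer; the method name, the config format, and the connector/topic names are assumptions chosen for illustration:

```python
# Hypothetical extractor plugin sketch. The base class is a stand-in for
# streams_explorer's Extractor; method and data names are illustrative.
from abc import ABC, abstractmethod

class Extractor(ABC):
    """Stand-in for the Extractor interface mentioned above."""

    @abstractmethod
    def on_connector_config_parsed(self, config: dict, connector_name: str):
        ...

class SinkTopicExtractor(Extractor):
    """Record which topics each sink connector consumes from."""

    def __init__(self):
        self.topics: dict[str, list[str]] = {}

    def on_connector_config_parsed(self, config: dict, connector_name: str):
        topics = config.get("topics", "")
        self.topics[connector_name] = [t for t in topics.split(",") if t]

extractor = SinkTopicExtractor()
extractor.on_connector_config_parsed(
    {"topics": "atm-fraud-output"}, "es-sink-connector"
)
print(extractor.topics)
# → {'es-sink-connector': ['atm-fraud-output']}
```

A plugin like this would be dropped into the folder configured via `plugins.path` so the backend can discover it.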
