
Streams Explorer

Explore Data Pipelines in Apache Kafka.


Features

  • Visualization of streaming applications and topics
  • Monitor all or individual pipelines from multiple namespaces
  • Inspection of Avro schema from schema registry
  • Integration with streams-bootstrap and faust-bootstrap for deploying Kafka Streams applications
  • Real-time metrics from Prometheus (consumer lag, topic size, messages in/out per second)
  • Linking to external services for logging and analysis, such as Kibana, Grafana, AKHQ, Elasticsearch
  • Customizable through Python plugins

Installation

Docker Compose

  1. Forward the ports to Kafka Connect, Schema Registry, and Prometheus (other integrations are optional).
  2. Start the container
docker-compose up

Once the containers are up, visit http://localhost:3000

Deploying to Kubernetes cluster

  1. Add the Helm chart repository
helm repo add streams-explorer https://raw.githubusercontent.com/bakdata/streams-explorer/master/helm-chart/
  2. Install the chart
helm upgrade --install streams-explorer streams-explorer/streams-explorer --values helm-chart/values.yaml

Standalone

Backend

  1. Install dependencies
pip install -r requirements.txt
  2. Forward the ports to Kafka Connect, Schema Registry, and Prometheus (other integrations are optional).
  3. Configure the backend in settings.yaml.
  4. Start the backend server
uvicorn main:app

Frontend

  1. Install dependencies
npm install
  2. Start the frontend server
npm start

Visit http://localhost:3000

Configuration

Depending on your type of installation, set the configuration for the backend server in settings.yaml.

All configuration options can be written as environment variables using underscore notation and the prefix SE, e.g. SE_K8S__deployment__cluster=false.
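To illustrate the naming convention, a dotted settings key maps to its environment-variable form as follows (this helper is hypothetical and not part of streams-explorer; the casing is inferred from the SE_K8S__deployment__cluster example above):

```python
def settings_key_to_env_var(key: str) -> str:
    """Map a dotted settings key to its environment-variable form:
    prefix SE, top-level section uppercased, double underscores
    between nesting levels (matching SE_K8S__deployment__cluster)."""
    first, *rest = key.split(".")
    return "SE_" + "__".join([first.upper()] + rest)

print(settings_key_to_env_var("k8s.deployment.cluster"))
# SE_K8S__deployment__cluster
```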

The following configuration options are available:

Kafka Connect

  • kafkaconnect.url URL of Kafka Connect server (string, required, default: http://localhost:8083)
  • kafkaconnect.displayed_information Configuration options of Kafka connectors displayed in the frontend (list of dict, required, default: [{'name': 'Transformer', 'key': 'transforms.changeTopic.regex'}])

Kubernetes

  • k8s.deployment.cluster Whether streams-explorer is deployed to Kubernetes cluster (bool, required, default: false)
  • k8s.deployment.context Name of cluster (string, optional if running in cluster, default: kubernetes-cluster)
  • k8s.deployment.namespaces Kubernetes namespaces (list of string, required, default: ['kubernetes-namespace'])
  • k8s.containers.ignore Name of containers that should be ignored/hidden (list of string, default: ['prometheus-jmx-exporter'])
  • k8s.displayed_information Details of pod that should be displayed (list of dict, default: [{'name': 'Labels', 'key': 'metadata.labels'}])
  • k8s.labels Labels used to set attributes of nodes (list of string, required, default: ['pipeline'])
  • k8s.independent_graph.label Attribute of nodes the pipeline name should be extracted from (string, required, default: pipeline)

Schema Registry

  • schemaregistry.url URL of Schema Registry (string, required, default: http://localhost:8081)

Prometheus

  • prometheus.url URL of Prometheus (string, required, default: http://localhost:9090)
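Streams Explorer pulls its metrics from this Prometheus instance. To check a metric yourself, Prometheus exposes an instant-query endpoint at /api/v1/query; here is a small sketch for building such a query URL (the consumer-lag metric name is an assumption that depends on which exporter you run):

```python
from urllib.parse import urlencode

def prometheus_query_url(base_url: str, promql: str) -> str:
    """Build an instant-query URL for the Prometheus HTTP API."""
    return f"{base_url.rstrip('/')}/api/v1/query?{urlencode({'query': promql})}"

# Consumer lag per group; 'kafka_consumergroup_group_lag' is a
# hypothetical metric name -- use whatever your exporter publishes.
url = prometheus_query_url(
    "http://localhost:9090",
    "sum(kafka_consumergroup_group_lag) by (group)",
)
```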

AKHQ

  • akhq.url URL of AKHQ (string, default: http://localhost:8080)
  • akhq.cluster Name of cluster (string, default: kubernetes-cluster)

Grafana

  • grafana.url URL of Grafana (string, default: http://localhost:3000)
  • grafana.dashboard Path to dashboard (string)

Kibana

  • kibanalogs.url URL of Kibana logs (string, default: http://localhost:5601)

Elasticsearch

For use with the Kafka Connect Elasticsearch connector:

  • esindex.url URL of Elasticsearch index (string, default: http://localhost:5601/app/kibana#/dev_tools/console)

Plugins

  • plugins.path Path to folder containing plugins relative to backend (string, required, default: ./plugins)
  • plugins.extractors.default Whether to load default extractors (bool, required, default: true)
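Putting the options above together, a minimal settings.yaml might look like this (a sketch with the documented defaults, nesting inferred from the dotted option names; trim it to the integrations you actually use):

```yaml
kafkaconnect:
  url: http://localhost:8083
schemaregistry:
  url: http://localhost:8081
prometheus:
  url: http://localhost:9090
k8s:
  deployment:
    cluster: false
    context: kubernetes-cluster
    namespaces:
      - kubernetes-namespace
  labels:
    - pipeline
plugins:
  path: ./plugins
  extractors:
    default: true
```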

Demo pipeline

ATM Fraud detection with streams-bootstrap

Plugin customization

It is possible to create your own linkers and extractors in Python by implementing the LinkingService or Extractor classes, allowing you to customize Streams Explorer for your specific setup and services. As examples, we provide the DefaultLinker and ElasticsearchSink classes, which are used by default.
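As a sketch of the idea (the class and method shown here are hypothetical; the real Extractor interface and its hook names are defined by streams-explorer), a custom extractor might pull topic names out of a connector config:

```python
# Hypothetical sketch of a custom extractor; the actual Extractor
# base class and its hooks come from streams-explorer itself.
class TopicExtractor:
    def extract(self, connector_config: dict) -> list:
        """Return the topics a connector reads from, based on its
        'topics' setting (comma-separated, per Kafka Connect)."""
        topics = connector_config.get("topics", "")
        return [t.strip() for t in topics.split(",") if t.strip()]

extractor = TopicExtractor()
print(extractor.extract({"topics": "orders, payments"}))
# ['orders', 'payments']
```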

Download files


Source distribution: streams-explorer-1.0.0.tar.gz (21.9 kB)

Built distribution: streams_explorer-1.0.0-py3-none-any.whl (25.1 kB)
