Streams Explorer
Explore Data Pipelines in Apache Kafka.
Features
- Visualization of streaming applications and topics
- Monitor all or individual pipelines from multiple namespaces
- Inspection of Avro schema from schema registry
- Integration with streams-bootstrap and faust-bootstrap for deploying Kafka Streams applications
- Real-time metrics from Prometheus (consumer lag, topic size, messages in/out per second)
- Linking to external services for logging and analysis, such as Kibana, Grafana, AKHQ, Elasticsearch
- Customizable through Python plugins
Installation
Docker Compose
- Forward the ports to Kafka Connect, Schema Registry, and Prometheus (other integrations are optional)
- Start the container
docker-compose up
Once the container has started, visit http://localhost:3000
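When Kafka Connect, Schema Registry, and Prometheus run inside a Kubernetes cluster, the forwards can be set up with kubectl. This is a sketch under assumed service names (adjust them to your cluster; the ports match the defaults listed under Configuration):

```shell
# Service names are assumptions -- adjust to your cluster.
kubectl port-forward service/kafka-connect 8083:8083 &
kubectl port-forward service/schema-registry 8081:8081 &
kubectl port-forward service/prometheus 9090:9090 &
```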
Deploying to Kubernetes cluster
- Add the Helm chart repository
helm repo add streams-explorer https://raw.githubusercontent.com/bakdata/streams-explorer/master/helm-chart/
- Install
helm upgrade --install --values helm-chart/values.yaml streams-explorer streams-explorer/streams-explorer
Standalone
Backend
- Install dependencies
pip install -r requirements.txt
- Forward the ports to Kafka Connect, Schema Registry, and Prometheus (other integrations are optional)
- Configure the backend in settings.yaml.
- Start the backend server
uvicorn main:app
Frontend
- Install dependencies
npm install
- Start the frontend server
npm start
Visit http://localhost:3000
Configuration
Depending on your type of installation, set the configuration for the backend server in the corresponding file:
- Docker Compose: docker-compose.yaml
- Kubernetes: helm-chart/values.yaml
- Standalone: backend/settings.yaml
All configuration options can be written as environment variables using underscore notation and the prefix SE, e.g. SE_K8S__deployment__cluster=false.
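To illustrate the underscore notation, the following sketch shows how such a variable name maps to its nested settings key. The helper `env_to_settings_key` is hypothetical, written only for this example; it is not part of Streams Explorer.

```python
def env_to_settings_key(var: str, prefix: str = "SE") -> str:
    """Map an SE_* environment variable to its dotted settings key.

    Hypothetical helper for illustration: strips the prefix, splits on
    the double underscore, and lowercases each path segment.
    """
    name = var.split("=", 1)[0]  # tolerate "NAME=value" input
    if not name.startswith(prefix + "_"):
        raise ValueError(f"expected prefix {prefix}_")
    parts = name[len(prefix) + 1:].split("__")
    return ".".join(part.lower() for part in parts)

print(env_to_settings_key("SE_K8S__deployment__cluster=false"))
# k8s.deployment.cluster
```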
The following configuration options are available:
Kafka Connect
kafkaconnect.url
URL of Kafka Connect server (string, required, default: http://localhost:8083)
kafkaconnect.displayed_information
Configuration options of Kafka connectors displayed in the frontend (list of dict, required, default: [{'name': 'Transformer', 'key': 'transforms.changeTopic.regex'}])
Kubernetes
k8s.deployment.cluster
Whether streams-explorer is deployed to a Kubernetes cluster (bool, required, default: false)
k8s.deployment.context
Name of the cluster (string, optional if running in cluster, default: kubernetes-cluster)
k8s.deployment.namespaces
Kubernetes namespaces (list of string, required, default: ['kubernetes-namespace'])
k8s.containers.ignore
Names of containers that should be ignored/hidden (list of string, default: ['prometheus-jmx-exporter'])
k8s.displayed_information
Details of the pod that should be displayed (list of dict, default: [{'name': 'Labels', 'key': 'metadata.labels'}])
k8s.labels
Labels used to set attributes of nodes (list of string, required, default: ['pipeline'])
k8s.independent_graph.label
Attribute of nodes the pipeline name should be extracted from (string, required, default: pipeline)
k8s.consumer_group_annotation
Annotation the consumer group name should be extracted from (string, required, default: consumerGroup)
Schema Registry
schemaregistry.url
URL of Schema Registry (string, required, default: http://localhost:8081)
Prometheus
prometheus.url
URL of Prometheus (string, required, default: http://localhost:9090)
AKHQ
akhq.url
URL of AKHQ (string, default: http://localhost:8080)
akhq.cluster
Name of cluster (string, default: kubernetes-cluster)
Grafana
grafana.url
URL of Grafana (string, default: http://localhost:3000)
grafana.dashboard
Path to dashboard (string)
Kibana
kibanalogs.url
URL of Kibana logs (string, default: http://localhost:5601)
Elasticsearch
(for the Kafka Connect Elasticsearch connector)
esindex.url
URL of Elasticsearch index (string, default: http://localhost:5601/app/kibana#/dev_tools/console)
Plugins
plugins.path
Path to the folder containing plugins, relative to the backend (string, required, default: ./plugins)
plugins.extractors.default
Whether to load default extractors (bool, required, default: true)
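Put together, a minimal settings.yaml for a standalone installation might look like the following sketch. The values shown are the documented defaults; adjust them to your environment:

```yaml
kafkaconnect:
  url: "http://localhost:8083"
k8s:
  deployment:
    cluster: false
    context: "kubernetes-cluster"
    namespaces: ["kubernetes-namespace"]
schemaregistry:
  url: "http://localhost:8081"
prometheus:
  url: "http://localhost:9090"
plugins:
  path: "./plugins"
  extractors:
    default: true
```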
Demo pipeline
ATM Fraud detection with streams-bootstrap
Plugin customization
It is possible to create your own linker and extractors in Python by implementing the LinkingService or Extractor classes. This way you can customize Streams Explorer for your specific setup and services. As an example we provide the DefaultLinker and ElasticsearchSink classes, which are used by default.
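A custom extractor might look roughly like this standalone sketch. The hook name `on_connector_config_parsing` and the config key `my.source.topic` are assumptions made for illustration; a real plugin would subclass Extractor from the backend, so check that class for the actual interface.

```python
from dataclasses import dataclass, field

# Standalone sketch of a custom extractor. A real plugin would subclass
# streams_explorer's Extractor; the hook name used here is an assumption,
# not the verified API.

@dataclass
class MyTopicExtractor:
    sources: list = field(default_factory=list)

    def on_connector_config_parsing(self, config: dict, connector_name: str):
        # Record any custom source topic declared in a connector config.
        topic = config.get("my.source.topic")
        if topic:
            self.sources.append((connector_name, topic))

extractor = MyTopicExtractor()
extractor.on_connector_config_parsing(
    {"my.source.topic": "payments"}, "payment-connector"
)
print(extractor.sources)  # [('payment-connector', 'payments')]
```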