Kafka Connect Python
A client for the Confluent Platform Kafka Connect REST API.
The Kafka Connect REST API allows you to manage connectors that move data between Apache Kafka and other systems.
The kafka-connect (or kc) command-line tool provides commands for getting information about the Kafka Connect cluster and its connectors, creating new connectors, updating existing connectors, deleting connectors, and more.
This project aims to support all features of the Kafka Connect REST API.
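Under the hood these are plain HTTP calls against a Connect worker: listing connectors, for example, is a GET to /connectors. A minimal illustration with the requests library, assuming a worker on the default localhost:8083:
import requests

# List active connectors straight from the REST API (GET /connectors)
response = requests.get("http://localhost:8083/connectors")
print(response.json())  # a JSON array of connector names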
Install
pip install kafka-connect-py
Command Line Usage
Get the version and other details of the Kafka Connect cluster.
kc info
Get a list of active connectors.
kc list [--expand=status|info]
Get the details of a single connector.
kc get <connector>
Get the status of a connector.
kc status <connector>
Get the config of a connector.
kc config <connector>
Create a new connector.
kc create --config-file <config-file>
# or with inline JSON data
kc create --config-data <config-data>
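For example, creating a JDBC source connector like the one in the Python section below might look like this with inline JSON (the connector class and settings are illustrative, and the payload is assumed to follow the REST API's top-level name/config shape):
kc create --config-data '{"name": "my-connector", "config": {"connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector", "tasks.max": "1", "topic.prefix": "my-connector-"}}'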
Update the configuration for an existing connector.
kc update <connector> --config-file <config-file>
# or with inline JSON data
kc update <connector> --config-data <config-data>
Restart a connector.
kc restart <connector> [--include-tasks] [--only-failed]
Pause a connector.
kc pause <connector>
Resume a connector.
kc resume <connector>
Delete a connector.
kc delete <connector>
Python
# Import the class
from kafka_connect import KafkaConnect
# Instantiate the client
client = KafkaConnect(endpoint="http://localhost:8083")
# Get the version and other details of the Kafka Connect cluster
cluster = client.get_info()
print(cluster)
# Get a list of active connectors
connectors = client.get_connectors()
print(connectors)
# Create a new connector
config = {
    "name": "my-connector",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "tasks.max": "1",
        "connection.url": "jdbc:postgresql://localhost:5432/mydatabase",
        "connection.user": "myuser",
        "connection.password": "mypassword",
        "table.whitelist": "mytable",
        "mode": "timestamp+incrementing",
        "timestamp.column.name": "modified_at",
        "validate.non.null": "false",
        "incrementing.column.name": "id",
        "topic.prefix": "my-connector-",
    },
}
response = client.create_connector(config)
print(response)
# Update an existing connector
new_config = {
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "tasks.max": "1",
        "connection.url": "jdbc:postgresql://localhost:5432/mydatabase",
        "connection.user": "myuser",
        "connection.password": "mypassword",
        "table.whitelist": "mytable",
        "mode": "timestamp+incrementing",
        "timestamp.column.name": "modified_at",
        "validate.non.null": "false",
        "incrementing.column.name": "id",
        "topic.prefix": "my-connector-",
    },
}
response = client.update_connector("my-connector", new_config)
print(response)
# Restart a connector
response = client.restart_connector("my-connector")
print(response)
# Delete a connector
response = client.delete_connector("my-connector")
print(response)
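The calls above can also be combined. As a small sketch, reusing the client created above, the loop below restarts every registered connector; it assumes get_connectors() returns the parsed list of connector names, as in the REST API's GET /connectors response.
# Restart every registered connector (assumes get_connectors() returns a list of names)
for name in client.get_connectors():
    response = client.restart_connector(name)
    print(name, response)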
Tests
python3 -m unittest tests/test_kafka_connect.py -v