Produce the messages from your Signal account to Kafka.

Project description

signal-kafka-producer a.k.a. signalation

Motivation | Installation | Implementation Details

Python package that produces messages from your Signal account to Kafka by querying the dockerized Signal Messenger REST API.

Motivation - Why should I use it?

After starting the signal-kafka-producer, all your Signal messages (sent and received) are produced onto a Kafka topic. This gives you two main advantages:

  1. Messages do not get lost, see Keep (Ephemeral) Messages in Message Queue
  2. Use Kafka Connectors or consumers for your use case, see Kafka Consumers/Connectors

Keep (Ephemeral) Messages in Message Queue

We've all been there: you are searching for some information and cannot find the corresponding message. Maybe you've switched your phone or phone number, or some of your contacts simply love Signal's ephemeral/self-destructing messages. In any case, signal-kafka-producer comes to the rescue: every message is stored in a Kafka topic, and YOU are its maintainer.

Note: Messages will eventually be deleted in Kafka as well, see Retention Policy. However, you can either set an infinite retention time or use a Kafka DB Sink Connector to store your messages in a database.
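For reference, infinite retention can be set per topic with Kafka's own tooling. This is a config fragment, not part of signalation; the topic name signal-messages below is an assumption, so replace it with the topic your producer actually writes to:

```shell
# Set infinite retention (retention.ms=-1) on the topic holding your
# Signal messages; the topic name here is a placeholder.
kafka-configs.sh --bootstrap-server localhost:9092 \
  --alter --entity-type topics --entity-name signal-messages \
  --add-config retention.ms=-1
```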

Kafka Consumers/Connectors

Having your Signal messages in a Kafka topic comes with all the usual Kafka benefits:

  • Real-time processing: Depending on your use-case, you can write consumers that can act on the messages in real-time, e.g.,
    • A service that sends answers from ChatGPT whenever a message starts with ChatGPT please help:
    • A service that forwards messages to Note to Self whenever a self-destructing message is received.
  • Flexibility: Kafka Topics can be integrated using other tools such as Kafka Connectors, e.g., you could use a Kafka DB Sink Connector to store your messages in a database.
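As a sketch of the real-time processing idea, a consumer could route each message by inspecting its text. The trigger prefix is the one suggested above; the function and return values are illustrative, not part of signalation:

```python
# Hypothetical routing logic for a real-time consumer: decide what to do
# with each Signal message based on its text and self-destruct flag.
TRIGGER = "ChatGPT please help:"

def route_message(text: str, self_destructing: bool = False) -> str:
    """Return which (hypothetical) service should handle a message."""
    if text.startswith(TRIGGER):
        return "chatgpt"        # forward the question to a ChatGPT service
    if self_destructing:
        return "note_to_self"   # archive ephemeral messages before they vanish
    return "ignore"             # no special handling

print(route_message("ChatGPT please help: what is Kafka?"))  # chatgpt
print(route_message("See you tomorrow!", self_destructing=True))  # note_to_self
```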

Installation - How can I use it?

To run the signal-kafka-producer, you need access to a running instance of Kafka and Signal. If you do not have that, go to Complete Installation including Dockerized Services; otherwise you can directly use Pip Installation.

Pip Installation

pip install signalation

The producer can then be executed via

signal-kafka-producer --env_file_path .env

where the .env file should have the following content

ATTACHMENT_FOLDER_PATH=<folder path in which attachments shall be stored>
# Signal configuration
SIGNAL__REGISTERED_NUMBER=<your phone number>
SIGNAL__IP_ADRESS=<ip address of signal rest api>
SIGNAL__PORT=<port of signal rest api>
SIGNAL__TIMEOUT_IN_S=<signal request timeout in seconds>
SIGNAL__RECEIVE_IN_S=<signal request interval in seconds>
# Kafka configuration
KAFKA__SERVER__PORT=<kafka bootstrap server port>
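The double underscore in the variable names acts as a section separator (e.g., SIGNAL__PORT belongs to the signal section). A minimal sketch of how such a file could be parsed into nested settings; signalation's actual config loader is not shown here and may work differently:

```python
# Parse .env-style lines into nested sections, splitting keys on "__".
# This mimics the SIGNAL__*/KAFKA__* naming above; illustrative only.
def parse_env(lines):
    config = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        section = config
        parts = key.lower().split("__")
        for part in parts[:-1]:
            section = section.setdefault(part, {})
        section[parts[-1]] = value
    return config

env = ["# Signal configuration", "SIGNAL__PORT=8080", "KAFKA__SERVER__PORT=9092"]
print(parse_env(env))
# {'signal': {'port': '8080'}, 'kafka': {'server': {'port': '9092'}}}
```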

Complete Installation including Dockerized Services

  1. Clone Repository, Install Python Package and Create Configuration
git clone git@github.com:borea17/signal-kafka-producer.git
cd signal-kafka-producer
pip install .

To run the dockerized services (Signal Messenger and Kafka) as well as the producer service, you need to create a .env file with the following content:

ATTACHMENT_FOLDER_PATH=./attachments
# Signal configuration
SIGNAL__REGISTERED_NUMBER=+49000000
SIGNAL__IP_ADRESS=127.0.0.1
SIGNAL__PORT=8080
SIGNAL__TIMEOUT_IN_S=90
SIGNAL__RECEIVE_IN_S=1
# Kafka configuration
KAFKA__UI__PORT=8081
KAFKA__SERVER__PORT=9092
KAFKA__ZOOKEEPER__PORT=2181

Note: You'll need to replace SIGNAL__REGISTERED_NUMBER with your phone number. Of course, you are free to adjust ports and timeouts / waiting times to your needs.

  2. Run Dockerized Services

To run the dockerized Signal Messenger and a dockerized Kafka (using your previously defined variables), simply run

docker-compose -f tests/dockerized_services/signal/docker-compose.yml --env-file .env up -d
docker-compose -f tests/dockerized_services/kafka/docker-compose.yml --env-file .env up -d

Note: Adjust paths accordingly.

  3. Start Producer via CLI
signal-kafka-producer --env_file_path .env

Note: You'll need to register your phone number with the dockerized Signal Messenger. Simply follow the instructions in the terminal.

You should see your produced messages in the Kafka UI at http://localhost:8081/ (using KAFKA__UI__PORT from the .env file).

Implementation Details - How does it work?

signal-kafka-producer calls src/signalation/services/producer.py which has the following logic:

  1. It polls the Signal server and retrieves new Signal messages with their metadata.
  2. It produces the messages to a Kafka topic.
  3. If a message has an attachment, it downloads it and stores the file locally. Additionally, corresponding metadata of the attachment is produced to a separate Kafka topic.
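The three steps above can be sketched as a loop. The real implementations live in src/signalation/services/producer.py and talk to the Signal REST API and a Kafka producer; here they are replaced with in-memory stubs so only the control flow is shown:

```python
import time

# Stub standing in for the Signal REST API poll (step 1).
def receive_messages():
    return [{"text": "hi", "attachments": ["photo.jpg"]}]

def run_message_loop_iteration(message_topic, attachment_topic, sleep_in_s=0):
    """One iteration of the (stubbed) producer loop."""
    messages = receive_messages()                  # 1. poll Signal server
    for msg in messages:
        message_topic.append(msg)                  # 2. produce message
        for attachment in msg["attachments"]:      # 3. produce attachment metadata
            attachment_topic.append({"file": attachment})
    time.sleep(sleep_in_s)                         # wait before the next poll
    return messages

msg_topic, att_topic = [], []
run_message_loop_iteration(msg_topic, att_topic)
print(len(msg_topic), len(att_topic))  # 1 1
```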

Here is a more detailed description given by ChatGPT:

The run function is the main entry point of the code. It retrieves configuration settings from an environment file and initializes a Kafka producer object. Then, it repeatedly calls the run_message_loop_iteration function, which retrieves Signal messages and their attachments, produces them to the relevant Kafka topics, and sleeps for a specified duration before it starts the next iteration.

The receive_messages function sends a GET request to the Signal server to retrieve new messages for a given registered number. It returns a list of Signal messages, each represented by a SignalMessage object. The receive_attachments function sends a GET request to the Signal server to retrieve attachments associated with a given list of messages. It returns a list of AttachmentFile objects, each representing a Signal attachment file.

The produce_messages function takes a list of SignalMessage objects and a Kafka producer object, and produces them to the message topic.

Finally, the EnhancedEncoder class is a custom JSON encoder that converts Python objects into JSON strings.
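The actual EnhancedEncoder is not reproduced on this page. A minimal sketch of such an encoder, assuming it mainly needs to handle types like datetime that the json module cannot serialize by default:

```python
import json
from datetime import datetime, timezone

# A custom JSON encoder in the spirit of EnhancedEncoder: serialize
# datetimes as ISO strings instead of raising TypeError. The real class
# in signalation may cover more types.
class EnhancedEncoderSketch(json.JSONEncoder):
    def default(self, obj):
        if isinstance(obj, datetime):
            return obj.isoformat()
        return super().default(obj)  # still raise for unknown types

payload = {"text": "hi", "timestamp": datetime(2023, 1, 1, tzinfo=timezone.utc)}
print(json.dumps(payload, cls=EnhancedEncoderSketch))
# {"text": "hi", "timestamp": "2023-01-01T00:00:00+00:00"}
```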

Project details


Download files

Download the file for your platform.

Source Distribution

signalation-1.0.0.tar.gz (13.9 kB)

Uploaded Source

Built Distribution

signalation-1.0.0-py3-none-any.whl (12.9 kB)

Uploaded Python 3

File details

Details for the file signalation-1.0.0.tar.gz.

File metadata

  • Download URL: signalation-1.0.0.tar.gz
  • Upload date:
  • Size: 13.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for signalation-1.0.0.tar.gz
Algorithm Hash digest
SHA256 a8c1c0c54e5e302e25df74af6fd0c1f64b117574908b3a3df8709b75cd0c3bbf
MD5 1927932a1900dab8501202e37f48cf61
BLAKE2b-256 68f3743eadb217735fc562b16b0d69015d2938884f64ac672ec9e6dbe92dc483

See more details on using hashes here.

File details

Details for the file signalation-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: signalation-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 12.9 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.2 CPython/3.9.16

File hashes

Hashes for signalation-1.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 0467f9d0d4a00c3d2d1b1a009f08859e8ed3e2f3a357e5ee263fcfe068733d64
MD5 569191a5141061a40990e09952b6283f
BLAKE2b-256 9bf76adadaa41d718783dfbea237b9c5088b36cf9fc425761e94737d5c7ab114
