
Boilerplate to quickly set up a Django Rest Framework microservice for T3

Project description


Install

Setup Virtualenv (optional)

python -m venv .venv
source .venv/bin/activate

# There is a bug in pip 9.x; go ahead and upgrade to make sure you're on pip 10.x
pip install --upgrade pip

Install

# Install from pypi
pip install t3-core

# Install in the `src` dir of your python environment
pip install -e git+ssh://git@gitlab.t-3.com:t3-core/t3-core-python.git

# Choose where the clone lives
git clone git@gitlab.t-3.com:t3-core/t3-core-python.git
pip install -e ./t3-core-python

Testing & Linting

Test & Coverage Report

pytest

Lint

pylama

T3 Events

t3-core includes a sub-module providing an event queue system that connects microservices asynchronously, with built-in fault tolerance. It uses RabbitMQ as the messaging bus. The event queue system consists of two main parts: the consumer and the publisher.

A Consumer, as the name suggests, consumes messages: it invokes a callback upon receiving a message published by a Publisher. Separately running processes, commonly known as workers, can consume messages, while any process, including a web process, can publish them. If a process is unable to finish consuming a message, the message is requeued and sent to a different consumer, or stored until a consumer is available again.

T3 Events is configured through two environment variables: T3_EVENTS and T3_EVENTS_AMQP_URL. The first is a boolean (true or false) controlling whether events are used; the second is the RabbitMQ connection string. These variables can be set on any system, and as long as different systems share the same values, they are part of the same message queue system. There are two main types of consumers and publishers, detailed below:
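For illustration, both variables can be set from Python before starting a consumer or a publisher. The URL below is a placeholder, and events_enabled is a hypothetical helper (not part of t3-core) shown only to demonstrate how the boolean flag is interpreted; the library reads the variables itself:

```python
import os

# Every service joining the same event bus must agree on these two values.
# The AMQP URL below is a placeholder; substitute your RabbitMQ connection string.
os.environ['T3_EVENTS'] = 'true'
os.environ['T3_EVENTS_AMQP_URL'] = 'amqp://guest:guest@localhost:5672/'

def events_enabled():
    """Hypothetical helper: interpret the T3_EVENTS flag as a boolean."""
    return os.environ.get('T3_EVENTS', 'false').lower() == 'true'

print(events_enabled())
```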

Task

Task consumers/publishers are dedicated to consuming or publishing Tasks, which by definition are consumed one at a time: for a given task, the first available consumer picks up the task and processes it, then the next task in the queue goes to the next available consumer, and so on. Below is an example of a sample task.

Consumer:

from t3.events.consumers import TaskConsumer

def message_callback(payload):
    print(f'message callback task consumer: payload: {payload}')

# Start consumer
test = TaskConsumer()
test.set_task_name('test_task')
test.set_callback(message_callback)
test.run()

Publisher:

from t3.events.publishers import TaskPublisher

# Use publisher
test = TaskPublisher()
test.set_task_name('test_task')
test.set_message('test message, could be in json too')
test.run()

In the above example, the task name is test_task, which is the same for the consumer and the publisher; this is what connects them. Since these are a TaskConsumer and a TaskPublisher, if you run more than one Consumer and publish several times, the messages are distributed among the running Consumers in round-robin fashion, one at a time: once one message is processed, the system moves on to the next.

Topic

Topic consumers/publishers are dedicated to consuming or publishing Topics, which by definition are broadcast to all subscribed Consumers: when a topic is published, every consumer signed up for that topic receives the message. Below is an example of a sample topic.

Consumer:

from t3.events.consumers import TopicConsumer

def message_callback(payload):
    print(f'message callback topic consumer: payload: {payload}')

# Start consumer
test = TopicConsumer()
test.set_topic_name('test_topic')
test.set_callback(message_callback)
test.run()

Publisher:

from t3.events.publishers import TopicPublisher
import json

# Use publisher
test = TopicPublisher()
test.set_topic_name('test_topic')
test.set_message(json.dumps({'json': 'object'}))
test.run()

In the above example, the topic name is test_topic, which is the same for the consumer and the publisher; this is what connects them. Since these are a TopicConsumer and a TopicPublisher, if you run more than one Consumer and publish, the message is processed by all running Consumers at once.

Running T3 Events

T3 Events is composed of Consumers and Publishers.

Consumers must be run in a separate process, as they are independent of anything else going on in the system. For local development, you can run python name_of_consumer_file.py; in production environments, use nohup python name_of_consumer_file.py for fault-tolerance purposes.
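A minimal consumer entry point might look like the sketch below. It assumes the TaskConsumer API shown in the examples above; the filename, task name, and the T3_EVENTS guard are illustrative choices, not requirements of t3-core:

```python
# worker.py -- minimal consumer entry point (a sketch; assumes the
# TaskConsumer API shown above; task name and guard are illustrative)
import os

def main():
    # Refuse to start when events are disabled, instead of hanging silently.
    if os.environ.get('T3_EVENTS', 'false').lower() != 'true':
        print('T3_EVENTS is not enabled; nothing to consume')
        return False

    # Imported lazily so the file can be inspected without t3-core installed.
    from t3.events.consumers import TaskConsumer

    def handle(payload):
        print(f'processing: {payload}')

    consumer = TaskConsumer()
    consumer.set_task_name('test_task')
    consumer.set_callback(handle)
    consumer.run()  # blocks here, consuming messages as they arrive
    return True

if __name__ == '__main__':
    main()
```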

Publishers, on the other hand, can run as part of an existing process, as their main job is to publish an event to the message queue system. To use a Publisher, run the lines below # Use publisher from the examples above, provided the two required environment variables are present.
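If you want publishing to degrade gracefully when events are turned off, one option is a thin wrapper like the sketch below. publish_task is a hypothetical helper, not part of t3-core; it assumes the TaskPublisher API shown in the examples above:

```python
import os

def publish_task(task_name, message):
    """Publish a task event, but only when T3_EVENTS is enabled.

    Hypothetical wrapper (not part of t3-core); assumes the
    TaskPublisher API from the examples above.
    """
    if os.environ.get('T3_EVENTS', 'false').lower() != 'true':
        return False  # events disabled; skip publishing

    # Imported lazily so callers pay no cost when events are off.
    from t3.events.publishers import TaskPublisher
    pub = TaskPublisher()
    pub.set_task_name(task_name)
    pub.set_message(message)
    pub.run()
    return True
```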

