
Several tools for zmq services

Project description

Python library with several tools for zmq services. With this library, a server with multiple local or distributed workers can be created very easily.

Features:

  • local and / or distributed workers and servers

  • secure authentication with private / public keys

  • automatic worker startup

  • dynamic addition and removal of workers

  • easy worker configuration for new tasks

Usage

Imports:

import service_tools

Create Certificates:

Options:

  • --path=<path_to_certificates>: path where the certificates are generated

  • --users=<user>: optional; specify for which users a certificate is generated; available options are: all, server, client; default is client

  • --overwrite=<overwrite>: optional; if True, overwrite an existing directory; default is False

  • create server certificate:

python3 generate_certificates.py --path=<path_to_certificates> --users=server --overwrite=<overwrite>

  • create user certificate:

python3 generate_certificates.py --path=<path_to_certificates> --users=user --overwrite=<overwrite>

  • create server and user certificate:

python3 generate_certificates.py --path=<path_to_certificates> --users=all --overwrite=<overwrite>
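The generated key pairs are standard pyzmq CURVE certificates. As a rough sketch of what certificate generation involves (the directory layout and the 'server' certificate name are assumptions based on the example configs below, not the script's guaranteed behavior), certificates can also be created directly with pyzmq:

import os
import shutil

import zmq.auth

# assumed layout: <path_to_certificates>/public_keys and <path_to_certificates>/private_keys
base_dir = 'resources'
public_dir = os.path.join(base_dir, 'public_keys')
secret_dir = os.path.join(base_dir, 'private_keys')
for d in (public_dir, secret_dir):
    os.makedirs(d, exist_ok=True)

# create a CURVE key pair named 'server'; returns the paths of the two generated files
public_file, secret_file = zmq.auth.create_certificates(base_dir, 'server')

# sort the generated files into the public / private key directories
shutil.move(public_file, os.path.join(public_dir, os.path.basename(public_file)))
shutil.move(secret_file, os.path.join(secret_dir, os.path.basename(secret_file)))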

Server

A server handles all client requests. The server distributes the requests to connected workers, which process them. Several workers can be connected to the server so that requests are processed in parallel across all workers.

Server config

A server is created from a config file. Config files are YAML files.

from service_tools import Server

server_config_path = r'resources\server_config.yml'

# create a new server object
server = Server(config_path=server_config_path)

# start the server
server.start()

The config file contains the following information:

id

: uuid of the server

name

: name of the server

ip

: IP address of the server; examples: 'localhost', '127.0.0.1'

port

: port of the server; examples: '8006', '8007'. If no port is specified, a free port between 6001 and 6050 is automatically chosen and written to the config file.

backend_port

: backend port of the server; examples: '9006', '9007'. If no port is specified, a free port between 9001 and 9050 is automatically chosen and written to the config file.

public_keys_dir

: path to public keys

secret_keys_dir

: path to secret keys

num_workers

: number of workers

auto_start

: bool; if True, workers start automatically when the server is started

worker_config_paths

: list of paths to the worker config files. If only one worker config path is defined but multiple workers are configured, this worker config is copied and a new id for each worker is generated and written to the config

worker_script_path

: path to script which is executed to start a worker; see also: Python script for automatic worker start

log_dir

: directory where logs are created

logging_mode

: logging mode: 'DEBUG', 'INFO', 'WARN', 'ERROR'; see the Python logging module

Example for server config:

!ServerConfig
_id: 6b6f2689-0a1c-44da-be28-eff6bc92f723
_name: test server
_secure: true
_public_keys_dir: resources\public_keys
_secret_keys_dir: resources\private_keys
_ip: localhost
_port: 8006
_backend_port: 9003
_num_workers: 4
_auto_start: true
_worker_config_paths:
- worker1_config.yml
_worker_script_path: test_start_worker.py
_log_dir: resources\logging_dir
_logging_mode: INFO
config_path: test_config.yml
_Server__update_file: false
_Server__on_init: false

The server config file can also be generated by a method:

from service_tools.server import ServerConfig, Server

new_server_config = ServerConfig.create_new(
    config_path=r'resources\config.yml',
    name='test server',
    ip='localhost',
    port=8006,
    public_keys_dir=r'resources\public_keys',
    secret_keys_dir=r'resources\private_keys',
    log_dir=r'resources\logging_dir',
    logging_mode='INFO',
    num_workers=4,
    auto_start=False,
    worker_config_paths=[r'resources\worker1_config.yml'],
    worker_script_path=r'resources\test_start_worker.py')

# start the server with the newly created server config
server = Server(config_path=new_server_config.config_path)
server.start()

Worker

Workers, like servers, are created from a config file:

from service_tools import Worker

worker_config_path = r'resources\worker_config.yml'

# create a new worker
worker = Worker(config_path=worker_config_path)

# start the worker
worker.start()

The config file contains the following information:

id

: uuid of the worker

name

: name of the worker

ip

: IP address of the worker; examples: 'localhost', '127.0.0.1'

port

: port of the worker; examples: '9006', '9007'

public_keys_dir

: path to public keys; optional

secret_keys_dir

: path to secret keys; optional

python_path

: path to the python executable with which the worker should be started; optional

log_dir

: directory where logs are created

logging_mode

: logging mode: 'DEBUG', 'INFO', 'WARN', 'ERROR'; see the Python logging module

Example for worker config:

!WorkerConfig
_id: c49b82f2-5eee-495d-92fa-f28a5bdcc7fa
_name: test worker
_public_keys_dir:
_secret_keys_dir:
_ip: localhost
_port: 9049
_python_path: python
_log_dir: F:\OneDrive\PythonProjects\service_tools\tests\test_outputs\logging_dir
_logging_mode: INFO

The worker config file can also be generated by a method:

from service_tools.worker import WorkerConfig

new_worker_config = WorkerConfig.create_new(
    config_path=r'resources\worker1_config.ini',
    name='test worker',
    ip=None,
    port=None,
    public_keys_dir=None,
    secret_keys_dir=None,
    log_dir=r'resources\logging_dir',
    logging_mode='INFO')

# start the worker with the newly created worker config
worker = Worker(config_path=new_worker_config.config_path)
worker.start()

Worker functionality

Workers have no functionality by default. To add functionality, create a class that inherits from the Worker class. There are many ways functionality can be added.

One way is to create a Message class:

class Message(object):

    def __init__(self, *args, **kwargs):

        self.method = kwargs.get('method', None)
        self.args = kwargs.get('args', list())
        self.kwargs = kwargs.get('kwargs', dict())

The client then sends a message with the method specified as a string. The worker receives the message and selects the method to execute via:

method = getattr(self, message.method)

This method is then executed with the args and kwargs in the message and the return value is returned to the client:

return method(*message.args, **message.kwargs)
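
Putting these pieces together, the dispatch logic described above looks roughly like this (a sketch only; the real Worker class performs this lookup internally in its receive loop):

def dispatch(worker, message):
    # resolve the method named in the message on the worker instance
    method = getattr(worker, message.method, None)
    if method is None:
        raise AttributeError(f'worker has no method {message.method!r}')
    # call it with the positional and keyword arguments carried by the message
    return method(*message.args, **message.kwargs)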

Example: create a worker with the functionality ‘return_non_sense’ which returns ‘non sense’

Worker:

from service_tools import Worker

class ExtendedWorker(Worker):

    def __init__(self, *args, **kwargs):

        Worker.__init__(self, *args, **kwargs)

    def return_non_sense(self, *args, **kwargs):

        return 'non sense'

Client:

import zmq
import zmq.auth

class Message(object):

    def __init__(self, *args, **kwargs):

        self.method = kwargs.get('method', None)
        self.args = kwargs.get('args', list())
        self.kwargs = kwargs.get('kwargs', dict())

ctx = zmq.Context.instance()
client = ctx.socket(zmq.REQ)

client_public, client_secret = zmq.auth.load_certificate(
    r'resources\private_keys\5520471c-66c1-4605-80dc-a9ff84d959da.key_secret')
server_public, _ = zmq.auth.load_certificate(
    r'resources\public_keys\server.key')
client.curve_secretkey = client_secret
client.curve_publickey = client_public
client.curve_serverkey = server_public

client.connect('tcp://localhost:8006')

message = Message(method='return_non_sense')
client.send_pyobj(message)
return_value = client.recv_pyobj()

Python script for automatic worker start

Here is a script which starts a worker when executed. The path to the config is given by the --config_file argument.

from configparser import ConfigParser
import argparse
import os
import uuid

# Import a worker. In real case this would be a custom worker with additional functionality.
from src.service_tools.worker import Worker
# Import of message is necessary!
from src.service_tools.message import Message

if __name__ == '__main__':

    parser = argparse.ArgumentParser()

    parser.add_argument('--config_file', required=True, help="worker config file", type=str)
    args = parser.parse_args()
    config_file = args.config_file

    print(f'reading config file: {config_file}')
    if not os.path.isfile(config_file):
        raise FileExistsError(f'{config_file} does not exist')
    config = ConfigParser()
    config.read(config_file)

    try:
        name = config.get('main', 'name', fallback=None)
    except Exception as e:
        name = ''

    try:
        id = uuid.UUID(config.get('main', 'id', fallback=None))
    except Exception as e:
        id = ''

    try:
        ip = config.get('main', 'ip', fallback=None)
    except Exception as e:
        print(f'ip in {config_file} does not exist. Assuming localhost...')
        ip = 'localhost'

    print(f'starting worker: \n     name: {name} \n     id: {id} \n     ip: {ip}')
    new_worker = Worker(config_path=config_file)
    new_worker.start()

Requirements

Python 3.7+.

Windows Support

Summary: On Windows, use py instead of python3 for many of the examples in this documentation.

This package fully supports Windows, along with Linux and macOS, but Python is typically installed differently on Windows. Windows users typically access Python through the py launcher rather than a python3 link in their PATH. Within a virtual environment, all platforms operate the same and use a python link to access the Python version used in that virtual environment.

Dependencies

Dependencies are defined in:

  • requirements.in

  • requirements.txt

  • dev-requirements.in

  • dev-requirements.txt

Virtual Environments

It is best practice during development to create an isolated Python virtual environment using the venv standard library module. This will keep dependent Python packages from interfering with other Python projects on your system.

On *Nix:

$ python3 -m venv venv
$ source venv/bin/activate

On Windows cmd:

> py -m venv venv
> venv\Scripts\activate.bat

Once activated, it is good practice to update core packaging tools (pip, setuptools, and wheel) to the latest versions.

(venv) $ python -m pip install --upgrade pip setuptools wheel

Packaging

This project is designed as a Python package, meaning that it can be bundled up and redistributed as a single compressed file.

Packaging is configured by:

  • pyproject.toml

  • setup.py

  • MANIFEST.in

To package the project as both a source distribution and a wheel:

(venv) $ python setup.py sdist bdist_wheel

This will generate, for example, dist/service_tools-0.32.tar.gz and dist/service_tools-0.32-py3-none-any.whl.

Read more about the advantages of wheels to understand why generating wheel distributions is important.

Upload Distributions to PyPI

Source and wheel redistributable packages can be uploaded to PyPI or installed directly from the filesystem using pip.

To upload to PyPI:

(venv) $ python -m pip install twine
(venv) $ twine upload dist/*

Testing

Automated testing is performed using tox. tox will automatically create virtual environments based on tox.ini for unit testing, PEP8 style guide checking, and documentation generation.

# Run all environments.
#   To only run a single environment, specify it like: -e pep8
# Note: tox must be installed in the active virtual environment before running this.
(venv) $ tox

Unit Testing

Unit testing is performed with pytest. pytest has become the de facto Python unit testing framework. Some key advantages over the built-in unittest module are:

  1. Significantly less boilerplate needed for tests.

  2. PEP8-compliant names (e.g. pytest.raises() instead of self.assertRaises()); see the sketch after this list.

  3. Vibrant ecosystem of plugins.
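
For illustration, a minimal pytest-style test using pytest.raises might look like the following. The factorial function and InvalidFactorialError exception are borrowed from the docstring example further down this page and are assumptions about a hypothetical fact package, not part of service_tools:

import pytest

# hypothetical imports for illustration only
from fact.lib import InvalidFactorialError, factorial


def test_negative_input_raises():
    # pytest.raises replaces unittest's self.assertRaises
    with pytest.raises(InvalidFactorialError):
        factorial(-1)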

pytest will automatically discover and run tests by recursively searching folders for .py files prefixed with test and, within them, any functions prefixed with test.

The tests folder is created as a Python package (i.e. there is an __init__.py file within it) because this helps pytest uniquely namespace the test files. Without this, two test files cannot be named the same, even if they are in different sub-directories.

Code coverage is provided by the pytest-cov plugin.

When running a unit test tox environment (e.g. tox, tox -e py37, etc.), a data file (e.g. .coverage.py37) containing the coverage data is generated. This file is not readable on its own, but when the coverage tox environment is run (e.g. tox or tox -e coverage), coverage from all unit test environments is combined into a single data file and an HTML report is generated in the htmlcov folder showing each source file and which lines were executed during unit testing. Open htmlcov/index.html in a web browser to view the report. Code coverage reports help identify areas of the project that are currently not tested.

Code coverage is configured in pyproject.toml.

To pass arguments to pytest through tox:

(venv) $ tox -e py37 -- -k invalid_factorial

Code Style Checking

PEP8 is the universally accepted style guide for Python code. PEP8 code compliance is verified using flake8. flake8 is configured in the [flake8] section of tox.ini. Extra flake8 plugins are also included:

  • pep8-naming: Ensure functions, classes, and variables are named with correct casing.

Automated Code Formatting

Code is automatically formatted using black. Imports are automatically sorted and grouped using isort.

These tools are configured by:

  • pyproject.toml

To automatically format code, run:

(venv) $ tox -e fmt

To verify code has been formatted, such as in a CI job:

(venv) $ tox -e fmt-check

Generated Documentation

Documentation that includes the README.rst and the Python project modules is automatically generated using a Sphinx tox environment. Sphinx is a documentation generation tool that is the de facto tool for Python documentation. Sphinx uses the RST markup language.

This project uses the napoleon plugin for Sphinx, which renders Google-style docstrings. Google-style docstrings provide a good mix of easy-to-read docstrings in code as well as nicely-rendered output.

"""Computes the factorial through a recursive algorithm.

Args:
    n: A positive input value.

Raises:
    InvalidFactorialError: If n is less than 0.

Returns:
    Computed factorial.
"""

The Sphinx project is configured in docs/conf.py.

Build the docs using the docs tox environment (e.g. tox or tox -e docs). Once built, open docs/_build/index.html in a web browser.

Generate a New Sphinx Project

To generate the Sphinx project shown in this project:

# Note: Sphinx must be installed in the active virtual environment before running this.
(venv) $ mkdir docs
(venv) $ cd docs
(venv) $ sphinx-quickstart --no-makefile --no-batchfile --extensions sphinx.ext.napoleon
# When prompted, select all defaults.

Modify conf.py appropriately:

# Add the project's Python package to the path so that autodoc can find it.
import os
import sys
sys.path.insert(0, os.path.abspath('../src'))

...

html_theme_options = {
    # Override the default alabaster line wrap, which wraps tightly at 940px.
    'page_width': 'auto',
}

Modify index.rst appropriately:

.. include:: ../README.rst

apidoc/modules.rst

Project Structure

Traditionally, Python projects place the source for their packages in the root of the project structure, like:

fact
├── fact
│   ├── __init__.py
│   ├── cli.py
│   └── lib.py
├── tests
│   ├── __init__.py
│   └── test_fact.py
├── tox.ini
└── setup.py

However, this structure is known to have bad interactions with pytest and tox, two standard tools for maintaining Python projects. The fundamental issue is that tox creates an isolated virtual environment for testing. By installing the distribution into the virtual environment, tox ensures that the tests pass even after the distribution has been packaged and installed, thereby catching any errors in packaging and installation scripts, which are common. Having the Python packages in the project root subverts this isolation for two reasons:

  1. Calling python in the project root (for example, python -m pytest tests/) causes Python to add the current working directory (the project root) to sys.path, which Python uses to find modules. Because the source package fact is in the project root, it shadows the fact package installed in the tox environment.

  2. Calling pytest directly anywhere that it can find the tests will also add the project root to sys.path if the tests folder is a Python package (that is, it contains an __init__.py file). pytest adds all folders containing packages to sys.path because it imports the tests like regular Python modules.

In order to properly test the project, the source packages must not be on the Python path. There are three possible ways to achieve this:

  1. Remove the __init__.py file from tests and run pytest directly as a tox command.

  2. Remove the __init__.py file from tests and change the working directory of python -m pytest to tests.

  3. Move the source packages to a dedicated src folder.

The dedicated src directory is the solution recommended by pytest when using tox, and the solution this blueprint promotes, because it is the least brittle even though it deviates from the traditional Python project structure. It results in a directory structure like:

fact
├── src
│   └── fact
│       ├── __init__.py
│       ├── cli.py
│       └── lib.py
├── tests
│   ├── __init__.py
│   └── test_fact.py
├── tox.ini
└── setup.py

Type Hinting

Type hinting allows developers to include optional static typing information in Python source code. This allows static analyzers such as PyCharm, mypy, or pytype to check that functions are used with the correct types before runtime.

For PyCharm in particular, the IDE is able to provide much richer auto-completion, refactoring, and type checking while the user types, resulting in increased productivity and correctness.

This project uses the type hinting syntax introduced in Python 3:

def factorial(n: int) -> int:
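
For illustration, a fully type-hinted implementation consistent with the docstring example shown earlier could look like this (the recursive body and the InvalidFactorialError class are assumptions about the hypothetical fact package, not actual source from this project):

class InvalidFactorialError(ValueError):
    """Raised when factorial() receives a negative input."""


def factorial(n: int) -> int:
    """Computes the factorial through a recursive algorithm."""
    if n < 0:
        raise InvalidFactorialError(f'n must be >= 0, got {n}')
    if n in (0, 1):
        return 1
    return n * factorial(n - 1)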

Type checking is performed by mypy via tox -e mypy. mypy is configured in setup.cfg.

Licensing

Licensing for the project is defined in:

  • LICENSE.txt

  • setup.py

This project uses a common permissive license, the MIT license.

You may also want to list the licenses of all of the packages that your Python project depends on. To automatically list the licenses for all dependencies in requirements.txt (and their transitive dependencies) using pip-licenses:

(venv) $ tox -e licenses
...
 Name        Version  License
 colorama    0.4.3    BSD License
 exitstatus  1.3.0    MIT License

PyCharm Configuration

To configure PyCharm 2018.3 and newer to align to the code style used in this project:

  • Settings | Search “Hard wrap at”

    • Editor | Code Style | General | Hard wrap at: 99

  • Settings | Search “Optimize Imports”

    • Editor | Code Style | Python | Imports

      • ☑ Sort import statements

        • ☑ Sort imported names in “from” imports

        • ☐ Sort plain and “from” imports separately within a group

        • ☐ Sort case-insensitively

      • Structure of “from” imports

        • ◎ Leave as is

        • ◉ Join imports with the same source

        • ◎ Always split imports

  • Settings | Search “Docstrings”

    • Tools | Python Integrated Tools | Docstrings | Docstring Format: Google

  • Settings | Search “Force parentheses”

    • Editor | Code Style | Python | Wrapping and Braces | “From” Import Statements

      • ☑ Force parentheses if multiline

Integrate Code Formatters

To integrate automatic code formatters into PyCharm, reference the following instructions:

  • black integration

    • The File Watchers method (step 3) is recommended. This will run black on every save.

  • isort integration

    • The File Watchers method (option 1) is recommended. This will run isort on every save.
