Communicate with Python (or C++) objects across a LAN using something like JSON-RPC
Tuber Server and Client
Tuber is a C++ server and Python client for exposing an instrumentation control plane across a network.
On a client, you can write Python code like this:
>>> some_resource.increment([1, 2, 3, 4, 5])
[2, 3, 4, 5, 6]
…and end up with a remote method call on a networked resource written in Python or (more usually) C++. The C++ implementation might look like this:
class SomeResource {
public:
    std::vector<int> increment(std::vector<int> x) {
        std::ranges::for_each(x, [](int &n) { n++; });
        return x;
    }
};
On the client side, Python needs to know where to find the server. On the server side, the C++ code must be registered with pybind11 (just as any other pybind11 code) and the tuber server. Other than that, however, there is no ceremony and no boilerplate.
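For example, the client side might look like the following sketch. The entry point name (resolve) and its argument order are assumptions for illustration only, not a documented API reference:

# Hypothetical sketch: the resolve() helper and its argument order are
# assumptions, not the documented client API.
import tuber

# Ask a known host for its exported "SomeResource" object and get back a proxy.
some_resource = tuber.resolve("SomeResource", "tuberd.local")

# Remote methods are then called like ordinary Python methods.
print(some_resource.increment([1, 2, 3, 4, 5]))   # -> [2, 3, 4, 5, 6]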
Its main features and design principles are:
Pythonic call styles, including *args, **kwargs, and DocStrings.
JSON and CBOR support for efficient and friendly serialization of return values.
“Less-is-more” approach to code. For example, Tuber uses pybind11 and C++ as a shim between C and Python, because the combination gives us the shortest and most expressive way to produce the results we want. It pairs excellently with orjson (as a JSON interface) or cbor2 (as a CBOR interface), which efficiently serialize (for example) NumPy arrays created in C++ across the network.
Schema-less RPC using standard-ish protocols (HTTP 1.1, JSON, CBOR, and something like JSON-RPC). Avoiding a schema allows server and client code to be independently and seamlessly up- and downgraded, with differences between exposed APIs only visible at the sites where RPC calls are made.
A mature, quick-ish, third-party, low-overhead, low-prerequisite embedded webserver. Tuber uses libhttpserver, which is in turn a C++ wrapper around the well-established libmicrohttpd. We use the thread-per-connection configuration because a single keep-alive connection with a single client is the expected “hot path”; C10K-style server architectures wouldn’t be better.
High performance when communicating with RPC endpoints, using:
HTTP/1.1 keep-alive connections, which avoid the overhead of opening a new connection for every request (see the Wikipedia article on HTTP persistent connections for details).
A call API that (optionally) allows multiple RPC calls to be combined and dispatched together (sketched after this list).
Client-side caches for remote properties (server-side constants).
Python 3.x’s aiohttp/asyncio libraries to asynchronously dispatch across multiple endpoints (e.g. multiple boards in a crate, each of which is an independent Tuber endpoint); this is also shown in the sketch after this list.
A friendly interactive experience using Jupyter/IPython-style REPL environments. Tuber servers export metadata that can be used to provide DocStrings and tab-completion for RPC resources.
The ability to serve a web-based UI using static JavaScript, CSS, and HTML.
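To make the batched-call and asynchronous-dispatch points above concrete, a client session might look like the sketch below. The resolve() helper and the tuber_context() batching construct are assumed names used for illustration only; the real client API may differ.

# Illustrative sketch only: resolve() and tuber_context() are assumed names.
import asyncio
import tuber

async def sweep(hostnames):
    """Send the same batched request to several independent Tuber endpoints."""

    async def one_endpoint(hostname):
        board = await tuber.resolve("SomeResource", hostname)
        # Hypothetical batching context: calls made inside the block are
        # queued and dispatched together in a single HTTP request.
        async with board.tuber_context() as ctx:
            ctx.increment([1, 2, 3])
            ctx.increment([4, 5, 6])
            results = await ctx()
        return results

    # Each endpoint keeps its own keep-alive connection; dispatch concurrently.
    return await asyncio.gather(*(one_endpoint(h) for h in hostnames))

results = asyncio.run(sweep(["crate-slot1.local", "crate-slot2.local"]))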
Anti-goals of this Tuber server include the following:
No authentication/encryption is used. For now, network security is strictly out-of-scope. (Yes, it feels naïve to write this in 2022.)
The additional complexity of the HTTP/2 and HTTP/3 protocols is not justified. HTTP/1.1 keep-alive obviates much of the performance gain promised by connection multiplexing.
The performance gains possible using a binary RPC protocol do not justify the loss of a human-readable, browser-accessible JSON protocol.
The use of newer, better languages than C++ (server side) or Python (client side). The instruments Tuber targets are likely to be a polyglot stew, and I am mindful that every additional language or runtime reduces the project’s accessibility to newcomers. Perhaps pybind11 will be eclipsed by something in Rust one day - for now, the ability to make casual cross-language calls is essential to keeping Tuber small. (Exception: the orjson JSON library is a wonderful complement to tuber and I recommend using them together!)
Although the Tuber server hosts an embedded Python interpreter and can expose embedded resources coded in ordinary Python, it is intended to expose C/C++ code. The Python interpreter provides a convenient, Pythonic approach to attribute and method lookup and dispatch without the overhead of a fully interpreted embedded runtime.
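As a sketch of that embedded-Python path, the server might be pointed at an ordinary Python module like the one below. The module-level registry mapping and the way tuberd discovers it are assumptions for illustration; the Wrapper/increment names mirror the Benchmarking section later in this description.

# Illustrative registry module: the "registry" export convention shown here is
# an assumption, not documented tuberd behaviour.

class Wrapper:
    """A plain-Python resource served alongside pybind11-wrapped C++ objects."""

    def increment(self, x):
        """Add 1 to every element of x."""
        return [n + 1 for n in x]

# Hypothetical export point: exported object name -> instance.
registry = {"Wrapper": Wrapper()}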
Tuber is licensed using the 3-clause BSD license (BSD-3-Clause). This software is intended to be useful, and its licensing is intended to be pragmatic. If licensing is a stumbling block for you, please contact me at gsmecher@threespeedlogic.com.
Installation
Pre-built wheels for Linux and macOS operating systems are available on PyPI for CPython 3.8+:
pip install tuberd
Building from source requires the libmicrohttpd and libhttpserver dependencies. To simplify the build process, the wheels/install_deps.sh script can be used to build all the dependencies locally and compile against them. In this instance, cmake should be able to discover the appropriate paths for all dependencies. Use the BUILD_DEPS cmake argument to trigger this build with pip:
CMAKE_ARGS="-DBUILD_DEPS=yes" pip install tuberd
If you prefer to build the dependencies manually, you may need to add the directory containing the FindLibHttpServer.cmake file to the CMAKE_MODULE_PATH option so that cmake can find the libhttpserver library, for example:
CMAKE_ARGS="-DCMAKE_MODULE_PATH=/usr/local/share/cmake/Modules" pip install tuberd
Optional dependencies may be installed to enable alternative encoding schemes (cbor, orjson), with or without numpy support, or the asyncio-enabled client interface:
pip install tuberd[async,cbor,numpy,orjson]
To run the test suite, install the development dependencies:
pip install tuberd[dev]
Client Installation
The tuberd package above includes both the server and client components. If you need only the Python components for the client interface, pre-built wheels of the client code are available on PyPI for Python 3:
pip install tuber-client
To include the dependencies for the asyncio-enabled interface and/or cbor encoding with or without numpy support:
pip install tuber-client[async,cbor,numpy]
Benchmarking
With concurrency 1 and keep-alive enabled, a 1M request benchmark can be generated as follows:
$ sudo apt-get install apache2-utils
$ echo '{ "object":"Wrapper", "method":"increment", "args":[[
1,2,3,4,5,6,7,8,9,10,
1,2,3,4,5,6,7,8,9,10,
1,2,3,4,5,6,7,8,9,10,
1,2,3,4,5,6,7,8,9,10,
1,2,3,4,5,6,7,8,9,10,
1,2,3,4,5,6,7,8,9,10,
1,2,3,4,5,6,7,8,9,10,
1,2,3,4,5,6,7,8,9,10,
1,2,3,4,5,6,7,8,9,10,
1,2,3,4,5,6,7,8,9,10 ]]}' > benchmark.json
$ for n in `seq 100`
do
ab -q -k -n 10000 -c 1 -p benchmark.json -T application/json http://localhost:8080/tuber
done | awk '
BEGIN { delete A }
/Time taken/ { A[length(A)+1] = $5; }
END { printf("x = [ "); for(i in A) printf(A[i] ", "); print "];" }'
These results are formatted suitably for the following Python snippet:
import matplotlib.pyplot as plt

# "x" is the list printed by the awk script above: the time taken (in seconds)
# for each batch of 10,000 requests.
plt.hist(x, label="Time per 10,000 requests (s)")
plt.legend()
plt.grid(True)
plt.savefig('histogram.png')