
Python package for efficient probabilistic polynomial approximation of arbitrary functions.


chiku

Efficient Probabilistic Polynomial Function Approximation Python Library.

Installation

To install, run: pip install chiku

Approximation Libraries

Complex (non-linear) functions like the Sigmoid ( $\sigma(x)$ ) and the Hyperbolic Tangent ( $\tanh{x}$ ) can be computed with Fully Homomorphic Encryption (FHE) in the encrypted domain using piecewise-linear functions (e.g., the linear approximation $\sigma(x) \approx 0.5 + 0.25x$ follows from the first two terms of its Taylor series, $\frac{1}{2} + \frac{1}{4}x$ ) or polynomial approximations such as Taylor, Padé, Chebyshev, Remez, and Fourier series. These deterministic approaches always yield the same polynomial for the same function. In contrast, we propose to use an Artificial Neural Network ( $ANN$ ) to derive the approximating polynomial probabilistically, where the coefficients depend on the initial weights and the convergence of the $ANN$ model. Our scheme is publicly available here as an open-source Python package.
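To see why such a linear approximation is only accurate near the origin, the sketch below (plain NumPy, not part of the chiku API; the interval $[-2, 2]$ is an arbitrary choice of ours) measures the error of $0.5 + 0.25x$ against the true Sigmoid:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# First-order Taylor approximation of sigmoid around 0: sigma(x) ~ 0.5 + 0.25x
def sigmoid_linear(x):
    return 0.5 + 0.25 * x

xs = np.linspace(-2.0, 2.0, 1001)
max_err = np.max(np.abs(sigmoid(xs) - sigmoid_linear(xs)))
print(f"max |sigma(x) - (0.5 + 0.25x)| on [-2, 2]: {max_err:.4f}")
```

The error grows toward the ends of the interval, which is why higher-order polynomial approximations are attractive for FHE workloads over wider input ranges.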

Library   Taylor   Fourier   Padé   Chebyshev   Remez   ANN
numpy     –        –         –      ✓           –       –
scipy     ✓        –         ✓      –           –       –
mpmath    ✓        ✓         ✓      ✓           –       –
chiku     ✓        ✓         ✓      ✓           ✓       ✓

The table above compares our library with other popular Python packages for numerical analysis. While the $mpmath$ library provides Taylor, Padé, Fourier, and Chebyshev approximations, a user has to adapt the target functions to the $mpmath$ datatypes (e.g., $mpf$ for real floats and $mpc$ for complex values). In contrast, our library requires no such modifications and can approximate arbitrary functions. Additionally, we provide the Remez approximation alongside the methods supported by $mpmath$.
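For comparison, the following sketch shows the $mpmath$ workflow: Taylor coefficients via mpmath.taylor (ascending order) and a Chebyshev fit via mpmath.chebyfit (descending order, ready for mpmath.polyval). The function passed in must operate on mpmath types, which is the adaptation burden noted above; here we sidestep it by using mpmath's own sin. The degree and interval are our own choices.

```python
from mpmath import mp, taylor, chebyfit, polyval, sin, pi

mp.dps = 15  # working precision: 15 significant decimal digits

# Degree-3 Taylor coefficients of sin at 0, ascending order: [0, 1, 0, -1/6]
t = taylor(sin, 0, 3)

# Degree-3 Chebyshev fit of sin on [-pi, pi]: 4 coefficients, descending order
c = chebyfit(sin, [-pi, pi], 4)

print([float(a) for a in t])
print(float(polyval(c, 1.0)), float(sin(1.0)))
```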

ANN Approximation

While $ANN$ s are known for their universal function-approximation properties, they are usually treated as a black box and used only to compute output values. We instead propose a basic 3-layer perceptron, consisting of an input layer, a hidden layer, and an output layer, where both the hidden and output layers have linear activations, to generate the coefficients of an approximating polynomial of a given order. In this architecture, the input layer is dynamic, with the input nodes corresponding to the desired polynomial degrees. While a variable number of hidden layers is possible, we fix it to a single layer with a single node to minimize computation.

[Figure: Polynomial approximation using an ANN]

We show the coefficient calculation for a third-order polynomial ( $d=3$ ) approximating a univariate function $f(x) = y$ with input $x$, actual output $y$, and predicted output $y_{out}$.

The inputs to the network are the monomials

$\{x^1, x^2, \ldots, x^d\} = \{x, x^2, x^3\}$

with corresponding input-layer weights $\{w_1, w_2, w_3\}$, and the single hidden node has bias $b_h$. Thus the output of the hidden layer is

$y_h = w_1 x + w_2 x^2 + w_3 x^3 + b_h$

The predicted output is calculated by

$y_{out} = w_{out} \cdot y_h + b_{out} = w_1 w_{out} x + w_2 w_{out} x^2 + w_3 w_{out} x^3 + (b_h w_{out} + b_{out})$

where the products $\{w_1 w_{out}, w_2 w_{out}, w_3 w_{out}\}$ are the coefficients of the order-3 approximating polynomial and $b_h w_{out} + b_{out}$ is its constant term.
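The derivation above can be sketched in plain NumPy by training such a 3-layer linear perceptron with gradient descent. This is an illustrative reimplementation under our own choices (target $\sin(x)$ on $[-\pi, \pi]$, learning rate, iteration count, random seed), not the library's actual training code; as noted above, the resulting coefficients depend on the random initialization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: samples of the target function sin(x) on [-pi, pi]
x = rng.uniform(-np.pi, np.pi, size=2000)
y = np.sin(x)

# Input nodes are the monomials x, x^2, x^3 (degree d = 3)
X = np.stack([x, x**2, x**3], axis=1)

# 3-layer perceptron, all activations linear: inputs -> 1 hidden node -> output
w = rng.normal(0.0, 0.1, size=3)   # input-layer weights w_1..w_3
b_h = 0.0                          # hidden-node bias
w_out = rng.normal(0.0, 0.1)       # output weight
b_out = 0.0                        # output bias

lr = 1e-3
for _ in range(30000):
    y_h = X @ w + b_h                 # hidden-layer output
    y_pred = w_out * y_h + b_out      # predicted output y_out
    g = 2.0 * (y_pred - y) / len(x)   # d(MSE)/d(y_pred)
    # Gradients through the two linear layers
    grad_w_out = g @ y_h
    grad_b_out = g.sum()
    grad_w = w_out * (X.T @ g)
    grad_b_h = w_out * g.sum()
    w_out -= lr * grad_w_out
    b_out -= lr * grad_b_out
    w -= lr * grad_w
    b_h -= lr * grad_b_h

coeffs = w * w_out               # products w_i * w_out: coefficients of x, x^2, x^3
const = b_h * w_out + b_out      # constant term
print("sin(x) ~", coeffs[2], "x^3 +", coeffs[1], "x^2 +", coeffs[0], "x +", const)
```

Because every activation is linear, the trained products $w_i w_{out}$ collapse into polynomial coefficients exactly as in the derivation; different seeds give slightly different polynomials of similar quality.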

Our $ANN$-based approximation approach can generate polynomials with user-specified degrees. E.g., a user can generate a complete third-order polynomial for $\sin(x)$ on the interval $[-\pi,\pi]$, which yields $-0.0931199x^3 - 0.001205849x^2 + 0.85615075x + 0.0009873845$. Alternatively, the user may optimize this polynomial by eliminating the $x^2$ term to reduce costly multiplications in FHE, which yields $-0.09340597x^3 + 0.8596622x + 0.0005142888$.
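The effect of dropping the $x^2$ term can be approximately reproduced with an ordinary least-squares fit restricted to the degrees $\{1, 3\}$. This is a deterministic baseline of our own, not the library's ANN procedure, but it lands close to the ANN-derived coefficients quoted above:

```python
import numpy as np

# Fit sin(x) on [-pi, pi] using only degrees {1, 3}, i.e. no x^2 term
x = np.linspace(-np.pi, np.pi, 4001)
A = np.stack([x, x**3], axis=1)
(c1, c3), *_ = np.linalg.lstsq(A, np.sin(x), rcond=None)
print(f"sin(x) ~ {c1:.7f}x + {c3:.7f}x^3")
```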
