
vtmm (vectorized transfer matrix method)

vtmm is a vectorized implementation of the transfer matrix method for computing the optical reflection and transmission of multilayer planar stacks. This package is in beta.

The vtmm package supports some of the same functionality as the tmm Python package developed by Steven Byrnes. However, in vtmm all operations are vectorized over angles / wavevectors as well as frequencies. Due to the small size of the matrices involved in the transfer matrix method (2 x 2), such vectorization results in significant performance gains, especially for large structures and many frequencies / wavevectors.

In some cases we have observed approximately two orders of magnitude difference in execution time between the two implementations (see below). The much lower execution time in vtmm may be useful for applications which require many evaluations of the reflection and transmission coefficients, such as in fitting or optimization.
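To illustrate where the gains come from: with the non-vectorized tmm package, sweeping both angle and wavelength requires one call per combination inside a Python loop, whereas vtmm evaluates the full grid in a single call. A rough sketch of the looped baseline (assuming tmm's coh_tmm(pol, n_list, d_list, th_0, lam_vac) interface; the structure and sweep ranges are arbitrary):

import numpy as np
from tmm import coh_tmm

n_list = [1.0, 3.5, 1.0]         # Refractive indices (outer media are semi-infinite)
d_list = [np.inf, 2e-6, np.inf]  # Thicknesses; the outer media use np.inf
wavelengths = np.linspace(1.4e-6, 2.0e-6, 50)
angles = np.deg2rad(np.linspace(0.0, 60.0, 50))

# One coh_tmm call per (angle, wavelength) pair -> 2500 Python-level evaluations
R = np.zeros((len(angles), len(wavelengths)))
for i, th in enumerate(angles):
    for j, lam in enumerate(wavelengths):
        R[i, j] = coh_tmm('s', n_list, d_list, th, lam)['R']

In vtmm, the equivalent sweep is a single tmm_rt call over the corresponding omega and kx tensors, as shown in the example below.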

Gradients

Currently vtmm uses TensorFlow as its backend. This means that gradients of scalar loss / objective functions of the transmission and reflection can be taken for free. A NumPy backend may be added later for users who do not need gradients and/or do not want TensorFlow as a dependency.
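For example, a minimal sketch of how such a gradient might be computed with tf.GradientTape (assuming tmm_rt accepts tf.Variable inputs for the layer parameters; the objective below is purely illustrative):

import numpy as np
import tensorflow as tf
from vtmm import tmm_rt

omega = tf.linspace(150e12, 220e12, 100) * 2 * np.pi  # Angular frequencies
kx = tf.constant([0.0])                               # Normal incidence only
n = tf.Variable([1.0, 3.5, 1.0])                      # Trainable refractive indices
d = tf.Variable([2e-6])                               # Trainable layer thicknesses

with tf.GradientTape() as tape:
    t, r = tmm_rt('s', omega, kx, n, d)
    loss = tf.reduce_mean(tf.abs(r) ** 2)  # Illustrative scalar objective: mean reflectance

grads = tape.gradient(loss, [n, d])  # Gradients of the objective w.r.t. n and d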

Example

The entry point to vtmm is the function tmm_rt(pol, omega, kx, n, d). See the example below for a basic illustration of how to use the package.

import numpy as np
import tensorflow as tf
from vtmm import tmm_rt

pol = 's'
n = tf.constant([1.0, 3.5, 1.0])  # Refractive indices (first and last entries are the semi-infinite outer media)
d = tf.constant([2e-6])  # Thicknesses of the interior layers
kx = tf.linspace(0.0, 2*np.pi*220e12/299792458, 1000)  # Parallel wavevectors
omega = tf.linspace(150e12, 220e12, 1000) * 2 * np.pi  # Angular frequencies

# t and r will be 2D tensors of shape [ num kx, num omega ]
t, r = tmm_rt(pol, omega, kx, n, d)
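If incidence angles are more natural than parallel wavevectors, kx can be derived from the angle in the incidence medium. A minimal sketch for an angle sweep at a single frequency (assuming the first entry of n is the incidence medium; n_in and omega0 below are illustrative):

import numpy as np
import tensorflow as tf
from vtmm import tmm_rt

c = 299792458.0
n = tf.constant([1.0, 3.5, 1.0])  # Layer refractive indices
d = tf.constant([2e-6])           # Layer thicknesses

omega0 = 2 * np.pi * 200e12                      # Single angular frequency
n_in = 1.0                                       # Index of the incidence medium (first entry of n)
theta = np.deg2rad(np.linspace(0.0, 80.0, 500))  # Incidence angles
kx = tf.constant(n_in * omega0 / c * np.sin(theta), dtype=tf.float32)  # kx = n_in * (omega / c) * sin(theta)

# t and r will have shape [ num kx, 1 ]
t, r = tmm_rt('s', tf.constant([omega0]), kx, n, d)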

Benchmarks

See tests/test_benchmark.py for a comparison between vtmm and the non-vectorized tmm package. The benchmarks shown below are for len(omega) == len(kx) == 50 and 75 timeit evaluations.

python -W ignore ./tests/test_benchmark.py
Single omega / kx benchmark
vtmm: 0.2432 s
tmm:  0.0401 s

Large stack benchmark
vtmm: 0.7811 s
tmm:  79.8765 s

Medium stack benchmark
vtmm: 0.4607 s
tmm:  52.2255 s

Small stack benchmark
vtmm: 0.3367 s
tmm:  41.0926 s
