
Project description

JaxSSO

A framework for structural shape optimization based on automatic differentiation (AD) and the adjoint method, enabled by JAX.

Developed by Gaoyuan Wu @ Princeton.

We have a preprint under review that describes this framework in detail. If you find the project interesting and helpful, please share it with others and cite us using:

@misc{wu2022jaxsso,
  doi = {10.48550/ARXIV.2211.15409},
  url = {https://arxiv.org/abs/2211.15409},
  author = {Wu, Gaoyuan},
  title = {A framework for structural shape optimization based on automatic differentiation, the adjoint method and accelerated linear algebra},
  publisher = {arXiv},
  year = {2022},
}

Features

  • Automatic differentiation (AD): an easy and accurate way to evaluate gradients. AD avoids deriving derivatives by hand and the truncation errors of numerical differentiation (see the sketch after this list).
  • Accelerated linear algebra (XLA) and just-in-time (JIT) compilation: these JAX features speed up gradient evaluation.
  • Hardware acceleration: runs on GPUs and TPUs for faster execution.
  • Form finding based on finite element analysis (FEA) and optimization theory.
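
As a quick illustration of the AD feature, here is a minimal sketch in plain JAX, using a made-up one-variable objective: the gradient from jax.grad is exact to machine precision, while a central finite difference carries truncation error.

```python
import jax

# Toy objective: compliance of a single spring whose stiffness k = x**3
# depends on one design variable x; under a unit load, u = 1/k.
def compliance(x):
    k = x ** 3
    u = 1.0 / k
    return 0.5 * u          # C = 0.5 * f * u with f = 1

grad_ad = jax.grad(compliance)(2.0)   # derivative via AD, no truncation error

h = 1e-5                              # central finite difference for comparison
grad_fd = (compliance(2.0 + h) - compliance(2.0 - h)) / (2.0 * h)

print(grad_ad, grad_fd)               # the FD value differs by its truncation error
```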

Here is an implementation of JaxSSO that form-finds a structure inspired by Mannheim Multihalle using simple gradient descent (first photo credit to Daniel Lukac).

Background: shape optimization

We consider minimizing the strain energy by changing the shape of the structure, which is equivalent to maximizing its stiffness and reducing bending. With no additional constraints, the problem reads:

$$\text{minimize} \quad C(\mathbf{x}) = \frac{1}{2}\int\sigma\epsilon \,\mathrm{d}V = \frac{1}{2}\mathbf{f}^\mathrm{T}\mathbf{u}(\mathbf{x})$$

$$\text{subject to:} \quad \mathbf{K}(\mathbf{x})\mathbf{u}(\mathbf{x}) = \mathbf{f}$$

where $C$ is the compliance, equal to the work done by the external load; $\mathbf{x} \in \mathbb{R}^{n_d}$ is a vector of $n_d$ design variables that determine the shape of the structure; $\sigma$, $\epsilon$ and $V$ are the stress, strain and volume, respectively; $\mathbf{f} \in \mathbb{R}^{6n}$ and $\mathbf{u}(\mathbf{x}) \in \mathbb{R}^{6n}$ are the generalized load vector and nodal displacements for $n$ structural nodes (six degrees of freedom per node); and $\mathbf{K}(\mathbf{x}) \in \mathbb{R}^{6n\times6n}$ is the stiffness matrix. The constraint is the governing equation of finite element analysis (FEA).
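
As a concrete (made-up) instance of the objective above, the sketch below builds a small symmetric positive-definite stiffness matrix and load vector by hand, solves the governing equation $\mathbf{K}\mathbf{u}=\mathbf{f}$, and evaluates $C=\frac{1}{2}\mathbf{f}^\mathrm{T}\mathbf{u}$:

```python
import jax.numpy as jnp

# Hypothetical 3-DOF system: K and f are made up, standing in for the
# stiffness matrix and generalized load vector of an assembled FEA model.
K = jnp.array([[ 4.0, -1.0,  0.0],
               [-1.0,  4.0, -1.0],
               [ 0.0, -1.0,  4.0]])
f = jnp.array([0.0, 1.0, 0.0])

u = jnp.linalg.solve(K, f)   # governing equation: K u = f
C = 0.5 * f @ u              # compliance = work done by the external load
print(C)
```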

To implement gradient-based optimization, one needs to calculate $\nabla C$. Applying the adjoint method, each entry of $\nabla C$ is: $$\frac{\partial C}{\partial x_i}=-\frac{1}{2}\mathbf{u}^\mathrm{T}\frac{\partial \mathbf{K}}{\partial x_i}\mathbf{u}$$ The adjoint method: i) reduces the computational complexity and ii) decouples the FEA from the derivative calculation of the stiffness matrix $\mathbf{K}$. To get $\nabla C$ (illustrated in the sketch after these steps):

  1. Conduct FEA to get $\mathbf{u}$.
  2. Conduct sensitivity analysis to get $\frac{\partial \mathbf{K}}{\partial x_i}$.
  3. Contract the two: $\frac{\partial C}{\partial x_i}=-\frac{1}{2}\mathbf{u}^\mathrm{T}\frac{\partial \mathbf{K}}{\partial x_i}\mathbf{u}$.
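
These steps can be sketched in a few lines of JAX. The parametric stiffness K_of_x below is hypothetical, a stand-in for the real $\mathbf{K}(\mathbf{x})$ assembled from nodal coordinates; jax.jacfwd supplies the sensitivities $\partial\mathbf{K}/\partial x_i$, and the adjoint formula contracts them with $\mathbf{u}$:

```python
import jax
import jax.numpy as jnp

# Hypothetical parametric stiffness: a 2-DOF spring assembly whose
# stiffness depends on the design variables x (a stand-in for K(x)).
def K_of_x(x):
    return jnp.array([[x[0] + x[1], -x[1]],
                      [-x[1],        x[1]]])

f = jnp.array([0.0, 1.0])
x = jnp.array([2.0, 1.0])

# Step 1: FEA to get u
u = jnp.linalg.solve(K_of_x(x), f)

# Step 2: sensitivity analysis, dK/dx_i for every design variable
dK = jax.jacfwd(K_of_x)(x)                # shape (2, 2, n_d)

# Step 3: adjoint formula, dC/dx_i = -1/2 u^T (dK/dx_i) u
grad_C = -0.5 * jnp.einsum('i,ijk,j->k', u, dK, u)

# Cross-check against differentiating C(x) = 1/2 f^T u(x) end to end
C = lambda x: 0.5 * f @ jnp.linalg.solve(K_of_x(x), f)
print(grad_C, jax.grad(C)(x))             # the two gradients should agree
```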

Usage

Installation

Install it with pip: pip install JaxSSO

Dependencies

JaxSSO is written in Python and requires:

  • numpy >= 1.22.0
  • JAX: "JAX is Autograd and XLA, brought together for high-performance machine learning research." Refer to the JAX documentation for installation instructions.
  • Nlopt: a library for nonlinear optimization with a Python interface, which JaxSSO uses. Refer to the Nlopt documentation for installation, or install it with pip install nlopt (see nlopt-python).
  • scipy

Quickstart

The project provides interactive Google Colab examples for a quick start.
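
While the notebooks demonstrate the real API, the sketch below illustrates the overall form-finding loop in plain JAX only; the diagonal stiffness and all names here are made up for illustration and are not JaxSSO's actual interface:

```python
import jax
import jax.numpy as jnp

# Toy form-finding loop: nodal heights z are the design variables and we
# run plain gradient descent on the compliance C(z) = 0.5 * f^T u(z).
def compliance(z, f):
    K = jnp.diag(1.0 + z ** 2)          # hypothetical SPD stiffness K(z)
    u = jnp.linalg.solve(K, f)          # FEA: K u = f
    return 0.5 * f @ u

f = jnp.ones(5)                         # unit loads on 5 nodes
z = 0.1 * jnp.ones(5)                   # initial, nearly flat shape
grad_C = jax.jit(jax.grad(compliance))  # AD gradient, JIT-compiled

step = 0.1
for _ in range(100):
    z = z - step * grad_C(z, f)         # gradient-descent shape update

print(compliance(z, f))                 # compliance drops as the shape stiffens
```

Note that, as in the formulation above, no additional constraints are imposed here; the loop simply descends on compliance from the initial shape.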

Download files

Download the file for your platform.

Source Distribution

JaxSSO-0.0.6.tar.gz (12.6 kB)

Uploaded Source

Built Distribution

JaxSSO-0.0.6-py3-none-any.whl (15.0 kB)

Uploaded Python 3

File details

Details for the file JaxSSO-0.0.6.tar.gz.

File metadata

  • Download URL: JaxSSO-0.0.6.tar.gz
  • Upload date:
  • Size: 12.6 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.8

File hashes

Hashes for JaxSSO-0.0.6.tar.gz

Algorithm    Hash digest
SHA256       5f59c83e0eede09448107cc86c82e5d427c2e1fab79fd7f8b8af075632149150
MD5          254e19ba658c4f9f1124bf01b758ea9d
BLAKE2b-256  c759ed59c7180e208b0e6a355554067be4284b6b78cb8674a8d64efea23296ca


File details

Details for the file JaxSSO-0.0.6-py3-none-any.whl.

File metadata

  • Download URL: JaxSSO-0.0.6-py3-none-any.whl
  • Upload date:
  • Size: 15.0 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.8.8

File hashes

Hashes for JaxSSO-0.0.6-py3-none-any.whl

Algorithm    Hash digest
SHA256       305c51120f29eb2402aaf2f48f9879e1de010767b09de5fd257c4ae5d9db26d1
MD5          91f00816207442c8820ce04e136ec1d8
BLAKE2b-256  ea1aeda2cb3e50f6e4999aff6fa3b7dfb981e16166a0ff38dd9a5256c9309e95

