One-way coupled Lagrangian Particle Tracking algorithms.
lptlib (Lagrangian Particle Tracking Library)
Previously project-arrakis
Python-based particle tracking algorithms for CFD data.
A highly parallelized set of Lagrangian Particle Tracking (LPT) algorithms, written in Python, for post-processing steady and unsteady CFD data. An application programming interface (API) is provided for uncertainty quantification of optical velocimetry data.
Installation
```shell
pip install lptlib
```
Python >= 3.10 is required. Core dependencies include numpy, scipy, matplotlib, pandas, seaborn, tqdm, mpi4py, and scikit-learn.
Overview
lptlib provides building blocks to:
- Read Plot3D grid/flow data (`GridIO`, `FlowIO`)
- Locate points in structured curvilinear grids (`Search`)
- Interpolate flow variables at arbitrary locations (`Interpolation`)
- Integrate particle paths and streamlines with multiple schemes (`Integration`, `Streamlines`)
- Run stochastic, parallel particle simulations (`StochasticModel`, `Particle`, `SpawnLocations`)
- Compute derived variables such as velocity, temperature, pressure, Mach number, and viscosity (`Variables`)
- Post-process LPT outputs to Eulerian fields and Plot3D files (`DataIO`)
Quickstart
Read Plot3D grid and flow
```python
from lptlib.io.plot3dio import GridIO, FlowIO

grid = GridIO('path/to/grid.sp.x')
flow = FlowIO('path/to/sol-0000010.q')
grid.read_grid()
flow.read_flow()
grid.compute_metrics()
```
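Plot3D is a simple binary format: block count, block dimensions, then coordinate arrays in Fortran order. The sketch below writes and reads one common single-block layout (single precision, no Fortran record markers) with plain numpy; layouts vary in practice, so treat this as an illustration of the format rather than lptlib's exact reader.

```python
import numpy as np

def write_plot3d_grid(path, x, y, z):
    """Write a single-block Plot3D grid (no Fortran record markers)."""
    ni, nj, nk = x.shape
    with open(path, 'wb') as f:
        np.array([1], dtype='i4').tofile(f)            # number of blocks
        np.array([ni, nj, nk], dtype='i4').tofile(f)   # block dimensions
        for arr in (x, y, z):
            arr.astype('f4').T.tofile(f)               # coordinates, Fortran order

def read_plot3d_grid(path):
    """Read the grid back; returns (x, y, z) arrays of shape (ni, nj, nk)."""
    with open(path, 'rb') as f:
        np.fromfile(f, dtype='i4', count=1)            # number of blocks (assumed 1)
        ni, nj, nk = np.fromfile(f, dtype='i4', count=3)
        coords = []
        for _ in range(3):
            a = np.fromfile(f, dtype='f4', count=ni * nj * nk)
            coords.append(a.reshape((nk, nj, ni)).T)   # undo Fortran ordering
    return coords

x, y, z = np.meshgrid(np.linspace(0, 1, 5),
                      np.linspace(0, 1, 4),
                      np.linspace(0, 1, 3), indexing='ij')
write_plot3d_grid('demo_grid.x', x, y, z)
rx, ry, rz = read_plot3d_grid('demo_grid.x')
print(np.allclose(rx, x), np.allclose(rz, z))  # True True
```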
Interpolate and integrate a streamline
```python
import numpy as np
from lptlib.streamlines.search import Search
from lptlib.streamlines.interpolation import Interpolation
from lptlib.streamlines.integration import Integration

point = np.array([0.1, 0.05, 0.0])
idx = Search(grid, point)
idx.compute(method='p-space')
interp = Interpolation(flow, idx)
interp.compute(method='p-space')
intg = Integration(interp)
new_point, u = intg.compute(method='pRK4', time_step=1e-3)
```
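The `pRK4` scheme is, per the name, a fourth-order Runge–Kutta step in physical space. Conceptually, one RK4 advection step through a velocity field `u(x)` looks like the sketch below; the solid-body-rotation field and step size are illustrative, not part of lptlib's API.

```python
import numpy as np

def rk4_step(u, x, dt):
    """One classical RK4 step for dx/dt = u(x)."""
    k1 = u(x)
    k2 = u(x + 0.5 * dt * k1)
    k3 = u(x + 0.5 * dt * k2)
    k4 = u(x + dt * k3)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Illustrative velocity field: solid-body rotation about z, so streamlines are circles.
u = lambda x: np.array([-x[1], x[0], 0.0])

x = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    x = rk4_step(u, x, dt=1e-3)

# After integrating to t = 1, the point has rotated 1 radian about the z-axis.
print(np.allclose(x[:2], [np.cos(1.0), np.sin(1.0)], atol=1e-6))  # True
```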
One-shot streamline extraction
```python
from lptlib.streamlines.streamlines import Streamlines

sl = Streamlines('path/to/grid.sp.x', 'path/to/sol-0000010.q', [0.1, 0.05, 0.0])
sl.compute(method='p-space')
coords = sl.streamline  # list of points
```
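Since the streamline comes back as a list of points, converting it to a numpy array makes downstream analysis straightforward. A small sketch (using a synthetic list of points in place of solver output) computing cumulative arc length:

```python
import numpy as np

# Synthetic stand-in for sl.streamline: a straight line of 3-D points.
streamline = [[0.1 * i, 0.05, 0.0] for i in range(11)]

coords = np.asarray(streamline)                       # shape (n_points, 3)
seg = np.linalg.norm(np.diff(coords, axis=0), axis=1) # per-segment lengths
arc_length = np.concatenate([[0.0], np.cumsum(seg)])  # cumulative arc length

print(coords.shape, np.isclose(arc_length[-1], 1.0))  # (11, 3) True
```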
Stochastic parallel run (oblique shock example)
The repository includes a fully working example in `main.py` that generates an oblique shock test case and launches an adaptive particle-tracking simulation in parallel:

```shell
python main.py
```
Key objects used in the example:
- `ObliqueShock`, `ObliqueShockData` to synthesize grid/flow for a controlled shock case
- `Particle`, `SpawnLocations` to define the particle size distribution and seed locations
- `StochasticModel` to run many particles in parallel with adaptive time stepping
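An oblique-shock test case of this kind follows the standard θ–β–M relation of gas dynamics, which links free-stream Mach number M, flow deflection angle θ, and shock angle β. A sketch of solving it for the weak-shock branch with scipy (illustrative physics only, not `ObliqueShock`'s API):

```python
import numpy as np
from scipy.optimize import brentq

def theta_from_beta(beta, M, gamma=1.4):
    """Deflection angle theta for shock angle beta (theta-beta-M relation)."""
    num = M**2 * np.sin(beta)**2 - 1.0
    den = M**2 * (gamma + np.cos(2.0 * beta)) + 2.0
    return np.arctan(2.0 / np.tan(beta) * num / den)

def weak_shock_angle(theta, M, gamma=1.4):
    """Weak-branch shock angle beta for a given deflection theta (radians)."""
    mach_angle = np.arcsin(1.0 / M)
    # Bracket between the Mach angle and (for this M) the max-deflection angle.
    return brentq(lambda b: theta_from_beta(b, M, gamma) - theta,
                  mach_angle + 1e-9, np.radians(64.0))

M1 = 2.0
theta = np.radians(10.0)
beta = weak_shock_angle(theta, M1)
print(round(np.degrees(beta), 1))  # ~39.3 deg, the textbook value for M=2, theta=10 deg
```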
DataIO pipeline (Lagrangian → Eulerian)
`DataIO` reads scattered particle tracks (one `.npy` file per particle), interpolates flow to those points, removes outliers, then interpolates both flow and particle fields onto a structured mesh and writes Plot3D outputs for visualization and downstream tools.
Essential steps:
- Scatter interpolation of flow to particle locations (MPI-parallel)
- Outlier removal and caching of intermediate `.npy` files under `dataio/`
- Grid interpolation to a user-defined mesh
- Export to Plot3D: `mgrd_to_p3d.x`, `mgrd_to_p3d_fluid.q`, `mgrd_to_p3d_particle.q`
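The scattered-to-structured step is conceptually a scattered-data interpolation problem. A generic sketch with `scipy.interpolate.griddata` (an illustration of the idea, not `DataIO`'s internals):

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Scattered "particle" locations and a flow quantity sampled there.
pts = rng.uniform(0.0, 1.0, size=(500, 2))
vals = np.sin(2 * np.pi * pts[:, 0]) * np.cos(2 * np.pi * pts[:, 1])

# User-defined structured mesh (interior points, to stay inside the convex hull).
xi, yi = np.meshgrid(np.linspace(0.1, 0.9, 20), np.linspace(0.1, 0.9, 20))
field = griddata(pts, vals, (xi, yi), method='linear')

# Compare against the exact field the samples were drawn from.
exact = np.sin(2 * np.pi * xi) * np.cos(2 * np.pi * yi)
print(field.shape, np.nanmax(np.abs(field - exact)) < 0.2)  # (20, 20) True
```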
See `test/test_dataio.py` for a minimal, runnable example.
Core API
- `lptlib.io.plot3dio.GridIO`: `read_grid(data_type='f4')`, `compute_metrics()`, `mgrd_to_p3d(...)`
- `lptlib.io.plot3dio.FlowIO`: `read_flow(data_type='f4')`, `read_unsteady_flow(...)`, `mgrd_to_p3d(...)`, `read_formatted_txt(...)`
- `lptlib.streamlines.Search`: `compute(method=...)`, `p2c(ppoint)`, `c2p(cpoint)`
- `lptlib.streamlines.Interpolation`: `compute(method=...)` with options `p-space`, `c-space`, `rbf-*`, `rgi-*`, `simple_oblique_shock`
- `lptlib.streamlines.Integration`: `compute(method=..., time_step=...)` with `pRK2/4`, `cRK2/4`, and unsteady variants; `compute_ppath(...)` for particle dynamics with drag models (`stokes`, `loth`, etc.)
- `lptlib.streamlines.Streamlines`: high-level orchestrator; `compute(method=...)` exposes `streamline`, `fvelocity`, `svelocity`, `time`
- `lptlib.streamlines.StochasticModel`: parallel execution over many particles via `multi_process()`, `multi_thread()`, `mpi_run()`, `serial()`
- `lptlib.function.Variables`: `compute_velocity()`, `compute_temperature()`, `compute_pressure()`, `compute_mach()`, `compute_viscosity()`
- `lptlib.io.DataIO`: `compute()` runs the end-to-end Lagrangian→Eulerian conversion and Plot3D export
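For a one-way coupled tracer, drag models such as `stokes` govern how fast a particle relaxes toward the local fluid velocity. In the Stokes regime the equation of motion is dv/dt = (u_f − v)/τ_p with response time τ_p = ρ_p d_p² / (18 μ); the sketch below uses illustrative property values, not lptlib defaults.

```python
import numpy as np

def stokes_response_time(rho_p, d_p, mu):
    """Particle response time tau_p = rho_p * d_p**2 / (18 * mu) for Stokes drag."""
    return rho_p * d_p**2 / (18.0 * mu)

# Illustrative values: a 1-micron TiO2-like tracer in air.
rho_p = 4.0e3   # particle density, kg/m^3
d_p = 1.0e-6    # particle diameter, m
mu = 1.8e-5     # dynamic viscosity of air, Pa*s
tau_p = stokes_response_time(rho_p, d_p, mu)

# For a constant fluid velocity u_f, dv/dt = (u_f - v)/tau_p has the exact
# solution v(t) = u_f + (v0 - u_f) * exp(-t/tau_p): exponential relaxation.
u_f, v0 = 100.0, 0.0
t = 5.0 * tau_p
v = u_f + (v0 - u_f) * np.exp(-t / tau_p)

print(abs(v - u_f) / u_f < 0.01)  # True: within 1% of the fluid after 5 tau_p
```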
Testing
Run the test suite from the repo root:
```shell
pytest -q
```
Tests cover search, interpolation (steady/unsteady), integration, DataIO, streamlines, plotting, and MPI helpers.
License
Distributed under MIT AND (Apache-2.0 OR BSD-2-Clause). See LICENSE.
File details
Details for the file lptlib-0.0.5a10.tar.gz.
File metadata
- Download URL: lptlib-0.0.5a10.tar.gz
- Upload date:
- Size: 60.8 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | ef7b93f5ded571857760108c45f12ea7e2872bbeaf6c3e0052f7bdf332a333a3 |
| MD5 | 174ac68cfc7e1d24ab492f610d551cc5 |
| BLAKE2b-256 | 721e1c46917d80041e321602e0273a68fe645c7ba6e34eca8264fe6bb33b11eb |
File details
Details for the file lptlib-0.0.5a10-py3-none-any.whl.
File metadata
- Download URL: lptlib-0.0.5a10-py3-none-any.whl
- Upload date:
- Size: 55.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/6.1.0 CPython/3.10.12
File hashes
| Algorithm | Hash digest |
|---|---|
| SHA256 | cd10710b265f0e8607b2682141318f27eb0d9a9ca865cb4ec47c06da112f5fe9 |
| MD5 | 9deabcabb809ba35dc47f26893524801 |
| BLAKE2b-256 | 8100cd9b633d763ad6c776be179fe52d508e6d97466e631a54ea145a796f6d28 |