
WiTwin Radar - Differentiable Radar Simulator

A GPU-accelerated, differentiable FMCW radar simulator for generating synthetic radar data from 3D scenes. It combines Mitsuba ray tracing with custom CUDA kernels for scene simulation, signal generation, and downstream radar processing.

This module is derived from RF-Genesis.

Get Started

Python 3.10+ and an NVIDIA GPU are required. This package depends on the base witwin package.

pip install witwin[radar]

Quick Start

import numpy as np
import torch

from witwin.radar import Radar, RadarConfig
from witwin.radar.sigproc import process_pc, process_rd

# FMCW radar configuration.
config = {
    "num_tx": 3,
    "num_rx": 4,
    "fc": 77e9,
    "slope": 60.012,
    "adc_samples": 256,
    "adc_start_time": 6,
    "sample_rate": 4400,
    "idle_time": 7,
    "ramp_end_time": 65,
    "chirp_per_frame": 128,
    "frame_per_second": 10,
    "num_doppler_bins": 128,
    "num_range_bins": 256,
    "num_angle_bins": 64,
    "power": 15,
    "tx_loc": [[0, 0, 0], [4, 0, 0], [2, 1, 0]],
    "rx_loc": [[-6, 0, 0], [-5, 0, 0], [-4, 0, 0], [-3, 0, 0]],
}

# Use the recommended GPU backend.
radar = Radar(
    RadarConfig.from_dict(config),
    backend="dirichlet",
    device="cuda",
    position=(0.0, 0.0, 0.0),
    target=(0.0, 0.0, -5.0),
    fov=60.0,
)

point = np.array([[0.0, 0.0, -3.0]], dtype=np.float32)
velocity = np.array([[0.0, 0.0, 0.01]], dtype=np.float32)


def interp(t):
    # Return target intensity and position at time t.
    positions = torch.tensor(point + velocity * t, dtype=torch.float32, device=radar.device)
    intensities = torch.ones((positions.shape[0],), dtype=torch.float32, device=radar.device)
    return intensities, positions


# Simulate one frame, then extract point cloud and RD map.
frame = radar.mimo(interp, t0=0)
pc = process_pc(radar, frame)
rd, _, ranges, vels = process_rd(radar, frame)
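The chirp settings above determine the simulated range and velocity resolution. As a quick sanity check, here is a minimal sketch computing the derived quantities, assuming TI-style units for the config (slope in MHz/µs, idle_time/ramp_end_time in µs, sample_rate in ksps) — the unit conventions are an assumption, not something the package documents here:

```python
# Derived FMCW quantities from the Quick Start config.
# Unit assumption (TI-style): slope in MHz/us, times in us, sample_rate in ksps.
c = 3e8                        # speed of light, m/s

slope = 60.012e6 * 1e6         # 60.012 MHz/us -> Hz/s
sample_rate = 4400 * 1e3       # 4400 ksps -> Hz
adc_samples = 256
num_tx = 3
chirp_time = (7 + 65) * 1e-6   # idle_time + ramp_end_time, s

# Bandwidth actually swept while the ADC samples.
bandwidth = slope * adc_samples / sample_rate
range_resolution = c / (2 * bandwidth)          # ~0.043 m

# TDM-MIMO: the Doppler repetition period spans all TX chirps.
wavelength = c / 77e9
v_max = wavelength / (4 * num_tx * chirp_time)  # ~4.5 m/s

print(f"range resolution: {range_resolution * 100:.1f} cm")
print(f"max unambiguous velocity: {v_max:.2f} m/s")
```

If the printed range resolution (about 4.3 cm) matches the bin spacing of the `ranges` array returned by process_rd, the unit assumption is consistent.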

Scene API

Use Radar(..., position=..., target=..., fov=...) to define the radar pose, and the Scene.add_* methods to assemble the scene.

from witwin.core import Material, Structure
from witwin.radar import Radar, RadarConfig, RotationMotion, Scene

# Reuses the FMCW `config` dict from the Quick Start.
radar = Radar(
    RadarConfig.from_dict(config),
    backend="dirichlet",
    device="cuda",
    position=(0.0, 0.0, 0.0),
    target=(0.0, 0.0, -1.0),
    fov=60.0,
)
scene = Scene(device="cpu")

# car_body_mesh, wheel_vertices, and wheel_faces are user-provided geometry.
scene.add_structure(
    Structure(
        name="car_body",
        geometry=car_body_mesh,
        material=Material(eps_r=3.0),
    )
)
scene.add_mesh(name="wheel_fl", vertices=wheel_vertices, faces=wheel_faces, dynamic=True)
scene.add_structure_motion(
    "wheel_fl",
    rotation=RotationMotion(
        axis=(0.0, 1.0, 0.0),
        angular_velocity=32.0,
        origin=(0.0, 0.0, 0.0),
        space="local",
    ),
)

frame = radar.simulate(
    scene,
    sampling="triangle",
    motion_sampling="per_chirp",
)
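With motion_sampling="per_chirp", dynamic structures are re-posed at every chirp rather than once per frame, which is what lets fast rotation like the 32 rad/s wheel produce micro-Doppler. A sketch of the timing involved, assuming TI-style microsecond units and that one Doppler sample spans all TX chirps in TDM order (an assumption about the simulator's internals):

```python
import numpy as np

num_tx = 3
chirps_per_frame = 128
chirp_time = (7 + 65) * 1e-6  # idle_time + ramp_end_time, s

# Chirp-start timestamps for one frame under TDM-MIMO.
t = np.arange(chirps_per_frame) * num_tx * chirp_time

frame_duration = chirps_per_frame * num_tx * chirp_time  # ~27.6 ms
wheel_rotation = 32.0 * frame_duration                   # ~0.88 rad per frame

print(f"frame duration: {frame_duration * 1e3:.2f} ms")
print(f"wheel rotation per frame: {wheel_rotation:.2f} rad")
```

Per-chirp sampling matters exactly because that ~0.88 rad of rotation within a single frame would otherwise be collapsed into one static pose.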

Available mutating scene methods:

  • Scene.add_structure(...)
  • Scene.add_mesh(...)
  • Scene.add_smpl(...)
  • Scene.add_structure_motion(...)
  • Scene.update_structure(...)
  • Scene.set_structure_motion(...)
  • Scene.clear_structure_motion(...)

Features

  • Recommended backend: dirichlet
  • Ray tracing through Mitsuba with differentiable scene support
  • Shared-core geometry and structure primitives
  • SMPL body support through Scene.add_smpl(...)
  • Optional per-structure rigid motion with parent inheritance
  • Multi-radar orchestration through Radar.simulate_group(...)
  • Torch-native DSP pipeline for range/Doppler processing and point-cloud extraction
  • Optional antenna pattern, polarization, noise-model, and receiver-chain configuration
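The DSP pipeline mentioned above follows the standard FMCW chain: a range FFT over ADC samples, then a Doppler FFT over chirps. A self-contained NumPy sketch of that chain on a synthetic single-target beat signal (this mirrors the textbook processing, not witwin's actual process_rd implementation, and assumes the TI-style units discussed earlier):

```python
import numpy as np

c = 3e8
slope = 60.012e12       # Hz/s, assuming the config's slope is in MHz/us
fs = 4.4e6              # Hz, assuming sample_rate is in ksps
n_adc, n_chirps = 256, 128
target_range = 3.0      # m, matching the Quick Start point target

# A static target produces a constant beat frequency in every chirp.
f_beat = 2 * slope * target_range / c
t = np.arange(n_adc) / fs
frame = np.tile(np.exp(2j * np.pi * f_beat * t), (n_chirps, 1))

rd = np.fft.fft(frame, axis=1)                        # range FFT
rd = np.fft.fftshift(np.fft.fft(rd, axis=0), axes=0)  # Doppler FFT

range_bin = int(np.argmax(np.abs(rd).sum(axis=0)))
range_resolution = c * fs / (2 * slope * n_adc)
print(f"detected range: {range_bin * range_resolution:.2f} m")
```

A static target lands in the zero-Doppler row (the center after fftshift), and the detected range of about 3 m matches the synthetic target.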

Running Tests

cd radar
pytest tests/
pytest tests/ --gpu

Examples

Run the maintained Python examples from the radar/ root:

python -m examples.single_point
python -m examples.mesh_scene
python -m examples.humanbody
python -m examples.music_imaging
python -m examples.amass_pointcloud
python -m examples.gen_amass_video
python -m examples.rgbd_range_doppler --input path/to/depths.npy

  • amass_pointcloud and gen_amass_video additionally require AMASS BMLmovi data under data/BMLmovi_full/BMLmovi/.
  • The rendering examples require mitsuba and CUDA; the SMPL examples also require models/smpl_models/.
  • rgbd_range_doppler reads .npy/.npz depth or point-cloud sequences and can read Azure Kinect .mkv files when pykinect_azure is installed. By default it treats the depth-camera view as the radar view.

Installation

Python 3.10+ and an NVIDIA GPU are required.

pip install witwin[radar]

Core dependencies include torch, numpy, slangtorch, tqdm, matplotlib, and scipy. Optional rendering dependencies are mitsuba and drjit.

Citation

If this module or the original RF-Genesis work it derives from is relevant to your research, please cite:

@inproceedings{chen2023rfgenesis,
  author = {Chen, Xingyu and Zhang, Xinyu},
  title = {RF Genesis: Zero-Shot Generalization of mmWave Sensing through Simulation-Based Data Synthesis and Generative Diffusion Models},
  booktitle = {ACM Conference on Embedded Networked Sensor Systems (SenSys '23)},
  year = {2023},
  pages = {1-14},
  address = {Istanbul, Turkiye},
  publisher = {ACM, New York, NY, USA},
  url = {https://doi.org/10.1145/3625687.3625798},
  doi = {10.1145/3625687.3625798}
}

License

MIT

Developer

Xingyu Chen


Download files

Download the file for your platform.

Source Distribution

witwin_radar-0.0.2.tar.gz (1.1 MB)


Built Distribution


witwin_radar-0.0.2-py3-none-any.whl (56.4 kB)


File details

Details for the file witwin_radar-0.0.2.tar.gz.

File metadata

  • Download URL: witwin_radar-0.0.2.tar.gz
  • Size: 1.1 MB
  • Tags: Source
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for witwin_radar-0.0.2.tar.gz:

  • SHA256: 69fb0fd825eadbf0f3e119b094f8b96172fcafae756ccc7e6def2630163a622b
  • MD5: 500aa45bcfd22a8af6922bf727a4835d
  • BLAKE2b-256: 7d96d5227f1c2227f33be34c48ad1abd1c3c1c2b2ba02c7741cdb88ca7c2c65f


Provenance

The following attestation bundles were made for witwin_radar-0.0.2.tar.gz:

Publisher: publish-witwin-radar.yml on witwin-ai/witwin-radar

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.

File details

Details for the file witwin_radar-0.0.2-py3-none-any.whl.

File metadata

  • Download URL: witwin_radar-0.0.2-py3-none-any.whl
  • Size: 56.4 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? Yes
  • Uploaded via: twine/6.1.0 CPython/3.13.12

File hashes

Hashes for witwin_radar-0.0.2-py3-none-any.whl:

  • SHA256: db71b12fff2e49250c693f798dba6209faf951c7855f7e261828d38b3559d715
  • MD5: de0e893caf056872d53b4b69aa8b586f
  • BLAKE2b-256: c864e5ef64e611baf73a3a4ef1899d3aa1ef4aeb2f3284e5f68af1e16033ac90


Provenance

The following attestation bundles were made for witwin_radar-0.0.2-py3-none-any.whl:

Publisher: publish-witwin-radar.yml on witwin-ai/witwin-radar

Attestations: Values shown here reflect the state when the release was signed and may no longer be current.
