Modular and flexible jaw motion analysis framework (motion capture, calibration, registration, and analysis)


JawTrackingSystem (JTS): A customizable, low-cost, optical jaw tracking system

A modular and extensible Python package for analyzing jaw motion using motion capture data. Designed for research and clinical applications, it provides a flexible pipeline for calibration, coordinate transformations, registration, smoothing, visualization, and export of jaw kinematics. The models for the hardware components are provided as STL files and inside a FreeCAD project file.


Features

  • Customizable Hardware: 3D-printable, low-cost components for jaw tracking
  • Flexible Analysis Pipeline: Calibration, relative motion, coordinate transformation, smoothing, export
  • Motion Capture Support: Abstract base classes for Qualisys (extensible to other systems)
  • Real-time & Offline: Supports both offline analysis and real-time streaming (in development)
  • HDF5 Analysis Tools:
    • Split recordings by sub-experiments with automatic frame offset handling
    • Plot derivatives (velocity, acceleration) alongside trajectories
    • Compare raw vs smoothed data with comprehensive visualization
  • Easy Configuration: JSON-based configuration system
  • Comprehensive Testing: Test suite for core functionality (24 tests)
  • Well Documented: Complete API reference and examples

Hardware

The hardware components are designed to be low-cost and customizable. The models are provided as STL files and in a FreeCAD project file, all located in the models directory.

The mouthpiece, teeth attachment, headpiece, and digitizing pointer are designed to be 3D-printed. Since a sufficiently sharp point is difficult to 3D-print, the digitizing pointer instead uses a dart point, which screws onto a 2BA thread attached to the pointer's tip. For the reflective markers, you can use reflective fibers or reflective tape. The headpiece is attached and fastened to the head with hook-and-loop tape (see Components).

Components

  • Mouthpiece
  • Teeth attachment
  • Headpiece
  • Digitizing pointer
  • 2BA thread
  • Dart point
  • Reflective fiber
  • Temporary dental glue
  • Hook-and-loop tape

Installation

Install the package using pip:

python -m pip install jaw-tracking-system

Or install directly from GitHub:

python -m pip install git+https://github.com/paulotto/jaw_tracking_system.git

Or just clone the repository, copy the jts directory to your project, and install the dependencies:

git clone https://github.com/paulotto/jaw_tracking_system.git 
cd jaw_tracking_system
cp -r jts your_project_directory/
python -m pip install -r requirements.txt

Optional Dependencies

For interactive Plotly visualizations and real-time Qualisys streaming:

python -m pip install plotly==6.0.1 qtm_rt

Quick Start

  1. Prepare a configuration JSON file (see README for examples).
  2. Run the analysis pipeline:

python -m jts.core path/to/config.json

  3. Results (trajectories, plots, exports) will be saved to the output directory specified in your config.

Configuration

All analysis parameters are specified in a JSON config file. Key sections include:

  • data_source: Type (e.g., "qualisys"), filename, and system-specific parameters
  • analysis: Calibration, experiment intervals, smoothing, coordinate transforms
  • output: Output directory, file formats, export options
  • visualization: Plotting options

See config.json for a template.
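As an illustrative sketch, a minimal configuration might look like the following. The top-level keys mirror the sections above, but the nested key names and values here are assumptions for illustration; consult config.json and config/README.md for the authoritative schema.

```json
{
  "data_source": {
    "type": "qualisys",
    "filename": "data/recording.mat"
  },
  "analysis": {
    "calibration": {
      "mandibular": {
        "rigid_bodies": ["MP", "CT"],
        "points": [
          {"name": "mand_point_1", "frame_interval": [2100, 2600]}
        ]
      }
    },
    "experiment": {"frame_interval": [10000, 50000]}
  },
  "output": {"directory": "output/"},
  "visualization": {"show_plots": true}
}
```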

Setup and Usage

This section describes the complete workflow from hardware assembly to running the analysis pipeline.

1. Hardware Assembly

3D Printing the Components

Print the following components from the STL files in the models/ directory:

| Component | STL File | Recommended Material | Notes |
|---|---|---|---|
| Mouthpiece (MTA) | JTS_Mouth_Marker.stl | Standard PLA/PETG | Attaches to teeth, must be biocompatible |
| Teeth Attachment | JTS_Teeth_Attachment.stl | Biocompatible PETG, IBT Resin | Single-use adapter for dental glue |
| Headpiece (CRA) | JTS_Head_Marker.stl | Biocompatible PETG, IBT Resin | Worn on forehead |
| Digitizing Pointer (DP) | JTS_Calibration_Tool.stl | Standard PLA/PETG | Uses 2BA thread + dart point for sharp tip |

Since 3D printing a sufficiently sharp point is difficult, the digitizing pointer uses a dart point screwed onto a 2BA thread that attaches to the pointer tip. This provides the precision needed for anatomical landmark digitization.

Printing recommendations:

  • Use standard print settings for most components and adjust as needed
  • For the teeth attachment and headpiece, use medical-grade biocompatible materials
  • The FreeCAD project file JTS_Models.FCStd allows customization

Assembly

  1. Digitizing Pointer (DP): Screw the 2BA thread into the pointer tip, then attach the dart point for precise landmark digitization.
  2. Mouthpiece (MTA): Connect the teeth attachment to the mouthpiece. Apply reflective markers (fibers or tape) to the designated marker positions.
  3. Headpiece (CRA): Attach reflective markers to the headpiece. Prepare hook-and-loop tape strips for securing to the forehead.

2. Experimental Setup

Motion Capture System Configuration

The system requires an optical motion capture (OMoCap) system. Our validation used:

  • System: Qualisys Oqus with 3 cameras
  • Sampling rate: 200 Hz
  • Post-calibration accuracy: ~0.6 mm average

Configure your OMoCap system to track the following rigid bodies:

  • MP - Mouthpiece/Mandibular tracking array
  • HP - Headpiece/Cranial reference array
  • CT - Calibration tool/Digitizing pointer

Participant Preparation

  1. Attach the MTA: Apply temporary dental glue to the teeth attachment and press it firmly onto the lower incisors. Ensure a secure connection with no relative movement.

    Advice: Dry the teeth surface with a clean cloth or air blower before applying glue for better adhesion.

  2. Secure the CRA: Position the headpiece on the forehead and fasten with hook-and-loop tape. It should remain stable during head movements.
  3. Verify tracking: Confirm all rigid bodies are visible and tracked by the OMoCap system.

3. Calibration Procedure

The system requires digitizing six anatomical landmarks on the teeth to define local coordinate systems:

Landmark Positions

| Landmark | Location | Purpose |
|---|---|---|
| mand_point_1 | Lower left canine/premolar | Defines mandibular coordinate system |
| mand_point_2 | Lower right canine/premolar | Defines mandibular coordinate system |
| mand_point_3 | Lower central incisor | Defines mandibular coordinate system |
| max_point_1 | Upper left canine/premolar | Defines maxillary coordinate system |
| max_point_2 | Upper right canine/premolar | Defines maxillary coordinate system |
| max_point_3 | Upper central incisor | Defines maxillary coordinate system |
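To illustrate how three digitized points can define a local coordinate system, the sketch below constructs a right-handed orthonormal frame as a homogeneous 4x4 transform. The specific convention (origin at the midpoint of the first two points, x-axis between them, z-axis normal to the landmark plane) is an example and not necessarily the one the package uses internally.

```python
import numpy as np

def frame_from_points(p1, p2, p3):
    """Build a right-handed orthonormal frame from three landmark points.

    Illustrative convention (may differ from the JTS implementation):
    origin at the midpoint of p1 and p2, x-axis from p1 to p2,
    z-axis normal to the plane spanned by the three landmarks.
    Returns a (4, 4) homogeneous transform mapping local to world.
    """
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    origin = 0.5 * (p1 + p2)
    x = p2 - p1
    x = x / np.linalg.norm(x)
    z = np.cross(x, p3 - origin)
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)  # already unit length since x and z are orthonormal
    T = np.eye(4)
    T[:3, :3] = np.column_stack((x, y, z))  # rotation: local -> world
    T[:3, 3] = origin
    return T
```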

Digitization Process

  1. Start recording in your OMoCap software
  2. For each landmark:
    • Place the DP tip precisely on the anatomical point
    • Hold steady for ~2-5 seconds
    • Note the frame interval for the configuration file (or record video for later review)
  3. Record the frame intervals for each landmark in your config file under analysis.calibration.mandibular.points and analysis.calibration.maxillary.points

Example configuration (see config/README.md for details):

"mandibular": {
  "rigid_bodies": ["MP", "CT"],
  "points": [
    {"name": "mand_point_1", "frame_interval": [2100, 2600]},
    {"name": "mand_point_2", "frame_interval": [5100, 5600]},
    {"name": "mand_point_3", "frame_interval": [7500, 8000]}
  ]
}

4. Motion Recording

After calibration, record the jaw movements of interest:

  1. Define experiment interval in the config file under analysis.experiment.frame_interval
  2. Instruct participant to perform desired movements:
    • Opening and closing
    • Protrusion and retrusion (forward/backward)
    • Lateral movements (left/right)
    • Cyclic/chewing motions
  3. Stop recording and export data (.mat format for Qualisys)

5. Running the Analysis

As a Script

python -m jts.core path/to/config.json

Optional flags:

  • --verbose for detailed logging
  • --plot to show plots interactively

As a Library

from jts.core import JawMotionAnalysis, ConfigManager

config = ConfigManager.load_config('path/to/config.json')
analysis = JawMotionAnalysis(config)
results = analysis.run_analysis()

Output

The pipeline produces:

  • HDF5 files with transformation matrices and derivatives (velocity, acceleration)
  • Visualizations of 3D trajectories
  • Filtered kinematic data using bidirectional Savitzky-Golay filtering
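The smoothing and derivative steps can be sketched as follows. This is a simplified, standalone illustration using SciPy's Savitzky-Golay filter and a finite-difference gradient; the window length and polynomial order are example values, and the package's actual bidirectional filtering scheme and parameters are defined in its configuration.

```python
import numpy as np
from scipy.signal import savgol_filter

def smooth_and_differentiate(positions, fs, window=21, polyorder=3):
    """Smooth an (N, 3) position trajectory and estimate velocity.

    positions: sampled translations, one row per frame.
    fs: sampling rate in Hz (e.g. 200 Hz, as in the validation setup).
    Returns (smoothed_positions, velocity) with the same shape.
    """
    smoothed = savgol_filter(positions, window, polyorder, axis=0)
    # Central finite differences on the smoothed signal
    velocity = np.gradient(smoothed, 1.0 / fs, axis=0)
    return smoothed, velocity
```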

Real-time Streaming (Experimental)

The system supports real-time data streaming via the streaming configuration, but this mode has not been fully validated. For robust use, the offline workflow is recommended. See config/README.md for streaming configuration options.

Extending the Framework

  • Add new motion capture system support by subclassing MotionCaptureData.
  • Implement new calibration or analysis routines by extending JawMotionAnalysis.
  • Add new visualization or export utilities in helper.py.
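The first extension point follows the usual abstract-base-class pattern. The sketch below uses a simplified stand-in base class, since the real MotionCaptureData interface (its abstract methods and signatures) may differ; the CSV adapter and its method names are hypothetical.

```python
from abc import ABC, abstractmethod

import numpy as np

class MotionCaptureData(ABC):
    """Simplified stand-in for the jts base class (illustrative only)."""

    @abstractmethod
    def load(self, filename: str) -> None:
        """Read a recording from disk."""

    @abstractmethod
    def get_rigid_body_poses(self, name: str) -> np.ndarray:
        """Return (N, 4, 4) homogeneous transforms for a rigid body."""

class CsvCaptureData(MotionCaptureData):
    """Hypothetical adapter for a capture system that exports CSV."""

    def __init__(self) -> None:
        self.poses: dict[str, np.ndarray] = {}

    def load(self, filename: str) -> None:
        # Real parsing logic would go here; stubbed for the sketch.
        self.poses["MP"] = np.tile(np.eye(4), (10, 1, 1))

    def get_rigid_body_poses(self, name: str) -> np.ndarray:
        return self.poses[name]
```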

Documentation

Examples - Working With the Processed Data

The examples/ directory contains scripts demonstrating key features. For comprehensive documentation, see HDF5 Analysis Guide and Quick Start.

1. Analyze HDF5 Files

Inspect, load, and visualize saved trajectory data:

python examples/hdf5_analysis_example.py output/jaw_motion.h5

2. Split by Sub-Experiments

Extract specific motion types from recordings:

python examples/split_hdf5_example.py jaw_motion.h5 config/config.json

This automatically:

  • Detects frame offset from config (frame_interval)
  • Splits file into sub-experiments (e.g., chewing, opening/closing)
  • Recalculates derivatives for each segment

3. Working with HDF5 Files Programmatically

import jts.helper as hlp
import matplotlib.pyplot as plt

# Inspect file structure
info = hlp.inspect_hdf5('jaw_motion.h5', verbose=True)

# Load transformation data
data = hlp.load_hdf5_transformations('jaw_motion.h5')
transforms = data['T_model_origin_mand_landmark_t']['transformations']  # (N, 4, 4)
derivatives = data['T_model_origin_mand_landmark_t']['derivatives']

# Access derivatives with convenient aliases
trans_vel = derivatives['translational_velocity']      # m/s
ang_vel = derivatives['angular_velocity']              # rad/s

# Visualize trajectory in 3D
hlp.visualize_hdf5_trajectory('jaw_motion.h5', frame_step=50)

# Compare trajectories (translations, rotations, derivatives)
hlp.compare_hdf5_trajectories('jaw_motion.h5', component='translations')
hlp.compare_hdf5_trajectories('jaw_motion.h5', component='translational_velocity')

# Split by sub-experiments
output_files = hlp.split_hdf5_by_sub_experiments(
    'jaw_motion.h5',
    config_file='config/config.json',  # Auto-detects frame_offset
    output_dir='sub_experiments/'
)

plt.show()

Available HDF5 Functions

| Function | Description |
|---|---|
| inspect_hdf5() | Inspect file structure and metadata |
| load_hdf5_transformations() | Load trajectory data with derivatives |
| visualize_hdf5_trajectory() | Create 3D trajectory visualizations |
| compare_hdf5_trajectories() | Compare trajectories (translations, rotations, derivatives) |
| split_hdf5_by_sub_experiments() | Split files by frame intervals with auto frame offset |

📖 For complete API reference and advanced usage, see HDF5 Analysis Documentation

Directory Structure

jaw_tracking_system/
├── jts/                              # Core package
│   ├── __init__.py
│   ├── calibration_controllers.py   # Calibration point collection
│   ├── core.py                      # Main analysis pipeline
│   ├── helper.py                    # Utility functions and HDF5 tools
│   ├── plotly_visualization.py      # Interactive 3D visualization
│   ├── precision_analysis.py        # Precision and accuracy analysis
│   ├── qualisys_streaming.py        # Real-time Qualisys streaming
│   ├── qualisys.py                  # Qualisys data interface
│   └── streaming.py                 # Abstract streaming base classes
├── config/
│   ├── README.md                    # Configuration guide
│   └── config.json                  # Configuration template
├── docs/
│   ├── HDF5_ANALYSIS.md             # Complete HDF5 API reference
│   └── HDF5_QUICKSTART.md           # Quick start guide for HDF5 tools
├── examples/
│   ├── hdf5_analysis_example.py     # HDF5 inspection and visualization
│   └── split_hdf5_example.py        # Split files by sub-experiments
├── models/                          # 3D-printable hardware models
│   ├── JTS_Calibration_Tool.stl     # Digitizing pointer
│   ├── JTS_Head_Marker.stl          # Headpiece with markers
│   ├── JTS_Models.FCStd             # FreeCAD project file
│   ├── JTS_Mouth_Marker.stl         # Mouthpiece with markers
│   └── JTS_Teeth_Attachment.stl     # Teeth attachment
├── tests/                           # Test suite (24 tests)
│   ├── __init__.py
│   ├── test_core.py
│   ├── test_helper.py
│   ├── test_precision_analysis.py
│   └── test_qualisys.py
├── CHANGELOG.md
├── CITATION.cff
├── LICENSE
├── MANIFEST.in
├── README.md
├── requirements.txt
└── setup.py

Testing

Run the test suite with:

pytest tests

License

This project is intended for research and educational purposes only and is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license (CC BY-NC-SA 4.0). See the LICENSE file for details.

This license allows you to use, adapt, and distribute the material for non-commercial purposes, provided the following conditions are met:

  1. Attribution: You must give appropriate credit to the original authors, provide a link to the license, and indicate if changes were made.
  2. Non-Commercial: You may not use the material for commercial purposes (e.g., selling or profiting from it, directly or indirectly).
  3. ShareAlike: If you create derivative works (e.g., modify or adapt the material), you must distribute them under the same CC BY-NC-SA 4.0 license.
  4. No Additional Restrictions: You may not impose additional legal or technological restrictions that prevent others from exercising the rights granted by the license.

Citation

If you use this package in your research, please cite:

@InProceedings{mueller2025jts,
  title     = {An Optical Measurement System for Open-Source Tracking of Jaw Motions},
  author    = {Müller, Paul-Otto and Suppelt, Sven and Kupnik, Mario and {von Stryk}, Oskar},
  booktitle = {2025 IEEE Sensors, Vancouver, Canada},
  year      = {2025},
  publisher = {IEEE},
  doi       = {10.1109/SENSORS59705.2025.11330651}
}

For more information, see the project website.
