
Aidge Export: TensorRT

The aim of this module is to provide an export to the TensorRT SDK via the Aidge framework.

Quick start

System Requirements

  • python >= 3.10
  • aidge_core

pip install aidge-export-tensorrt

To compile the export, you need one of the following:

  • Docker installed (recommended). The Docker image includes TensorRT, so no additional installation is required.
  • TensorRT 8.6 or 10.10 installed on your host machine (required only if compiling directly on the host without Docker).
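Before building, it can help to verify the prerequisites programmatically. The sketch below is a minimal sanity check (not part of the Aidge API) using only the Python standard library; it only reports whether the host-side TensorRT bindings are importable, which matters when compiling without Docker:

```python
import importlib.util
import sys

def check_environment():
    """Return a dict describing which prerequisites are satisfied."""
    return {
        # The package requires Python >= 3.10.
        "python_ok": sys.version_info >= (3, 10),
        # TensorRT is only needed when compiling on the host without Docker;
        # the Docker image already ships with it.
        "tensorrt_available": importlib.util.find_spec("tensorrt") is not None,
    }

for name, ok in check_environment().items():
    print(f"{name}: {'yes' if ok else 'no'}")
```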

🛠 Build from Source

Prerequisites (in addition to the ones above):

1. Python installation using setup scripts

| Environment | Command |
| --- | --- |
| Windows | `.\setup.ps1 -Modules export_cpp -Tests` |
| Unix | `./setup.sh -m export_cpp --tests` |

[!TIP] Use Get-Help setup.ps1 (Windows) or ./setup.sh -h (Unix) for full documentation.

2. Python Installation using pip

Run these commands from the aidge_export_tensorrt/ directory:

# Standard install
pip install . -v

# Install with testing dependencies
pip install .[test] -v && pytest

Editable Install (Experimental)

Use this for iterative development without reinstalling after each change.

pip install --no-build-isolation -ve . --config-settings=editable.rebuild=true -Cbuild-dir=build

Usage

The aidge_export_tensorrt module exposes its main Python API through the export() function.

import aidge_export_tensorrt
aidge_export_tensorrt.export("export_trt", "model.onnx")
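For scripted use, the call above can be wrapped in a small command-line helper. The wrapper below is a hypothetical convenience, not part of the package; only the `export(output_folder, onnx_path)` call shown above is taken from the source, and the flag names are illustrative:

```python
import argparse

def build_parser():
    # Hypothetical CLI wrapper around aidge_export_tensorrt.export();
    # the argument names below are illustrative, not part of the package.
    parser = argparse.ArgumentParser(
        description="Generate a TensorRT export from an ONNX model"
    )
    parser.add_argument("model", help="path to the ONNX model, e.g. model.onnx")
    parser.add_argument("--output", default="export_trt",
                        help="name of the generated export folder")
    return parser

def main(argv):
    args = build_parser().parse_args(argv)
    # Imported lazily so the parser can be used without the package installed.
    import aidge_export_tensorrt
    aidge_export_tensorrt.export(args.output, args.model)

# Example invocation: main(["model.onnx", "--output", "export_trt"])
```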

The export provides a Makefile with several options for using the export on your machine. You can generate either a C++ export or a Python export.
Additionally, you can compile the export and/or the Python library using Docker if your host machine lacks the necessary packages.

The available commands are summarized in the following table:

| Command | Description |
| --- | --- |
| `make` / `make help` | Display the different options available |
| `make build_cpp` | Compile the export on the host for C++ apps (generates an executable in `build/bin`) |
| `make build_lib_python` | Compile the export on the host for Python apps (generates a Python lib in `build/lib`) |
| `make build_image_docker` | Generate the Docker image of the TensorRT compiler |
| `make build_cpp_docker` | Compile the export in a container for C++ apps (generates an executable in `build/bin`) |
| `make test_cpp_docker` | Test the executable for C++ apps in a container |
| `make build_lib_python_docker` | Compile the export in a container for Python apps (generates a Python lib in `build/lib`) |
| `make test_lib_python_docker` | Test the lib for Python apps in a container |
| `make clean` | Clean up the build and bin folders |

Here's an example to compile and test the export Python library using Docker:

cd export_trt/ 
make build_lib_python_docker
make test_lib_python_docker

This will execute the test.py file within the Docker container, initializing and profiling the selected model.
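The profiling step can be approximated generically. The sketch below is illustrative only: it times an arbitrary inference callable and assumes nothing about the generated library's actual entry points, which are defined by the export itself:

```python
import time

def profile(run_inference, n_iters=100):
    """Return the average wall-clock time per call, in seconds.

    Generic timing loop, similar in spirit to what a profiling script
    might do. `run_inference` is any callable executing one inference;
    the generated TensorRT Python library's API is not assumed here.
    """
    t0 = time.perf_counter()
    for _ in range(n_iters):
        run_inference()
    return (time.perf_counter() - t0) / n_iters

# Example with a dummy workload standing in for a real inference call:
avg = profile(lambda: sum(range(1000)), n_iters=10)
print(f"average latency: {avg * 1e6:.1f} microseconds")
```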

Known issue

Export side

Issues related to the usage of the TensorRT export.

No CMAKE_CUDA_COMPILER could be found

CMake Error at CMakeLists.txt:21 (enable_language):
  No CMAKE_CUDA_COMPILER could be found.

  Tell CMake where to find the compiler by setting either the environment
  variable "CUDACXX" or the CMake cache entry CMAKE_CUDA_COMPILER to the full
  path to the compiler, or to the compiler name if it is in the PATH.

This error occurs when you try to compile your project without having nvcc in your PATH.

To fix this, add nvcc to the path:

export PATH=<NVCC_PATH>:$PATH;

Where <NVCC_PATH> is the path to the nvcc compiler. On recent NVIDIA Orin devices, nvcc is installed at /usr/local/cuda/bin.
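To confirm the fix took effect, you can check whether nvcc is now resolvable from PATH. A quick illustrative check using only the Python standard library:

```python
import shutil

def find_nvcc():
    """Return the full path to nvcc if it is on PATH, else None."""
    return shutil.which("nvcc")

path = find_nvcc()
print(f"nvcc found at {path}" if path else "nvcc is not on PATH")
```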
