
pnnx is an open standard for PyTorch model interoperability.

Project description

pnnx

Python wrapper of pnnx; only Python 3.7+ is supported at the moment.

Install from pip

pnnx is available as wheel packages for macOS, Windows, and Linux distributions. You can install it with pip:

pip install pnnx
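
A quick way to check the install is to import the package from Python. This is only a minimal sanity check that confirms the wheel loads:

import pnnx            # raises ImportError if the wheel did not install correctly
print(pnnx.__file__)   # shows where the package was installed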

Build & Install from source

Prerequisites

On Unix (Linux, macOS)

  • A compiler with C++14 support
  • CMake >= 3.4

On Windows

  • Visual Studio 2015 or higher
  • CMake >= 3.4

Build & install

  1. clone ncnn
git clone https://github.com/Tencent/ncnn.git
  2. install pytorch

install pytorch according to https://pytorch.org/ . Anaconda is strongly recommended, for example:

conda install pytorch
  3. install pnnx
cd /pathto/ncnn/tools/pnnx/python
python setup.py install

Note: If torchvision and pnnx2onnx support are needed, you can set the following environment variables before running 'python setup.py install' to enable them, e.g. on Ubuntu:

export TORCHVISION_INSTALL_DIR="/project/torchvision"
export PROTOBUF_INCLUDE_DIR="/project/protobuf/include"
export PROTOBUF_LIBRARIES="/project/protobuf/lib64/libprotobuf.a"
export PROTOBUF_PROTOC_EXECUTABLE="/project/protobuf/bin/protoc" 

To do this, you must install torchvision and protobuf first.

Tests

cd /pathto/ncnn/tools/pnnx/python
pytest tests

Usage

  1. export model to pnnx
import torch
import torchvision.models as models
import pnnx

net = models.resnet18(pretrained=True)
x = torch.rand(1, 3, 224, 224)

# You can try disabling trace checking if torch tracing raises an error
# opt_net = pnnx.export(net, "resnet18.pt", x, check_trace=False)
opt_net = pnnx.export(net, "resnet18.pt", x)
  2. convert an existing torchscript model to pnnx
import torch
import pnnx

x = torch.rand(1, 3, 224, 224)
opt_net = pnnx.convert("resnet18.pt", x)
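
As an optional sanity check, you can compare the original model's output with the optimized one. This is only a sketch and assumes the returned opt_net is callable like a regular torch.nn.Module, as the assignments in the export example above suggest:

import torch
# continue from the export example above (net, x and opt_net are defined there)
with torch.no_grad():
    ref = net(x)      # reference output of the original model
    out = opt_net(x)  # output of the optimized model returned by pnnx.export
print("max abs diff:", (ref - out).abs().max().item())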

API Reference

  1. pnnx.export

model (torch.nn.Module): model to be exported.

ptpath (str): the torchscript file name.

inputs (torch.Tensor or list of torch.Tensor): expected inputs of the model.

inputs2 (torch.Tensor or list of torch.Tensor): alternative inputs of the model. Usually, it is used together with inputs to resolve dynamic shapes (-1) in the model graph.

input_shapes (Optional, list of int or list of list of int): shapes of the model inputs, used to resolve tensor shapes in the model graph. For example, [1,3,224,224] for a model with only 1 input, [[1,3,224,224],[1,3,224,224]] for a model that has 2 inputs.

input_types (Optional, str or list of str): types of the model inputs; it should have the same length as input_shapes. For example, "f32" for a model with only 1 input, ["f32", "f32"] for a model that has 2 inputs.

typename  torch type
f32       torch.float32 or torch.float
f64       torch.float64 or torch.double
f16       torch.float16 or torch.half
u8        torch.uint8
i8        torch.int8
i16       torch.int16 or torch.short
i32       torch.int32 or torch.int
i64       torch.int64 or torch.long
c32       torch.complex32
c64       torch.complex64
c128      torch.complex128

input_shapes2 (Optional, list of int or list of list of int): shapes of the alternative model inputs; the format is identical to input_shapes. Usually, it is used together with input_shapes to resolve dynamic shapes (-1) in the model graph.

input_types2 (Optional, str or list of str): types of the alternative model inputs.
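
A rough sketch of how these parameters fit together when a model should accept more than one input resolution; the keyword names follow the parameter list above, and the second resolution is only an illustration:

import torch
import torchvision.models as models
import pnnx

net = models.resnet18(pretrained=True)

# two concrete resolutions; dimensions that differ between them
# can be resolved as dynamic (-1) in the exported graph
x = torch.rand(1, 3, 224, 224)
x2 = torch.rand(1, 3, 320, 320)

opt_net = pnnx.export(net, "resnet18.pt", x,
                      inputs2=x2,
                      input_shapes=[1, 3, 224, 224],
                      input_types="f32",
                      input_shapes2=[1, 3, 320, 320],
                      input_types2="f32")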

device (Optional, str, default="cpu"): device type for the inputs of the TorchScript model, "cpu" or "gpu".

customop (Optional, str or list of str): list of Torch extensions (dynamic libraries) for custom operators. For example, "/home/nihui/.cache/torch_extensions/fused/fused.so" or ["/home/nihui/.cache/torch_extensions/fused/fused.so",...].

moduleop (Optional, str or list of str): list of modules to keep as single large operators. For example, "models.common.Focus" or ["models.common.Focus","models.yolo.Detect"].

optlevel (Optional, int, default=2): graph optimization level

option  optimization level
0       do not apply optimization
1       do not apply optimization
2       optimization more for inference
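
As a rough illustration of customop, moduleop and optlevel used together (a sketch only; the module names and extension path are the example values from the descriptions above, not values to copy verbatim):

opt_net = pnnx.export(net, "net.pt", x,
                      optlevel=2,  # default; 0 disables graph optimization
                      moduleop=["models.common.Focus", "models.yolo.Detect"],
                      customop="/home/nihui/.cache/torch_extensions/fused/fused.so")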

pnnxparam (Optional, str, default="*.pnnx.param", * is the model name): PNNX graph definition file.

pnnxbin (Optional, str, default="*.pnnx.bin"): PNNX model weight.

pnnxpy (Optional, str, default="*_pnnx.py"): PyTorch script for inference, including model construction and weight initialization code.

pnnxonnx (Optional, str, default="*.pnnx.onnx"): PNNX model in onnx format.

ncnnparam (Optional, str, default="*.ncnn.param"): ncnn graph definition.

ncnnbin (Optional, str, default="*.ncnn.bin"): ncnn model weight.

ncnnpy (Optional, str, default="*_ncnn.py"): pyncnn script for inference.
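
For example, the generated file names can be overridden explicitly instead of relying on the * defaults above (a sketch; any of these keywords can be omitted):

opt_net = pnnx.export(net, "resnet18.pt", x,
                      pnnxparam="resnet18.pnnx.param",
                      pnnxbin="resnet18.pnnx.bin",
                      pnnxpy="resnet18_pnnx.py",
                      ncnnparam="resnet18.ncnn.param",
                      ncnnbin="resnet18.ncnn.bin",
                      ncnnpy="resnet18_ncnn.py")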

  2. pnnx.convert

ptpath (str): torchscript model to be converted.

Other parameters are consistent with pnnx.export.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distributions

No source distribution files are available for this release. See the tutorial on generating distribution archives.

Built Distributions

  • pnnx-20240715-py3-none-win_amd64.whl (16.4 MB)
    Uploaded: Python 3, Windows x86-64
  • pnnx-20240715-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (22.0 MB)
    Uploaded: Python 3, manylinux: glibc 2.17+ x86-64
  • pnnx-20240715-py3-none-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (19.9 MB)
    Uploaded: Python 3, manylinux: glibc 2.17+ ARM64
  • pnnx-20240715-py3-none-macosx_10_9_universal2.macosx_10_9_x86_64.macosx_11_0_arm64.whl (43.7 MB)
    Uploaded: Python 3, macOS 10.9+ universal2 (ARM64, x86-64), macOS 10.9+ x86-64, macOS 11.0+ ARM64

File details

Details for the file pnnx-20240715-py3-none-win_amd64.whl.

File metadata

  • Download URL: pnnx-20240715-py3-none-win_amd64.whl
  • Size: 16.4 MB
  • Tags: Python 3, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.1.0 CPython/3.12.4

File hashes

Hashes for pnnx-20240715-py3-none-win_amd64.whl

Algorithm    Hash digest
SHA256       814f4071d96ca64a6a48cdbc426d502c553e448185673636bff059a1b27f68ce
MD5          9acb875f1f35d37c34f60a5106771018
BLAKE2b-256  bf6be66a2e58f82f2d7fd4fb0b037d18218ba8c02047c4c2f9fdaa3c915af474


File details

Details for the file pnnx-20240715-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.


File hashes

Hashes for pnnx-20240715-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl

Algorithm    Hash digest
SHA256       011b9c9f107dc94bdb0b4c3f97b0015e0b04baec52df3d414dbb30d2d5631d01
MD5          e882803e50f0092b19d2282afb88a82a
BLAKE2b-256  e24cfa150cee7f4a0a849ae910f37a9fe9b3fe8875f44a5d30d20073ee723856


File details

Details for the file pnnx-20240715-py3-none-manylinux2014_aarch64.manylinux_2_17_aarch64.whl.


File hashes

Hashes for pnnx-20240715-py3-none-manylinux2014_aarch64.manylinux_2_17_aarch64.whl

Algorithm    Hash digest
SHA256       5c159217235be9f8507e9050277756b7687714f266d0b0c82418542fcb18c709
MD5          832fd42eeb4e85190e15570ca6f7ddf4
BLAKE2b-256  26770fb606c01057e65ef5e83bada19081a4dec9d819f4239c07ffecd72bd69a


File details

Details for the file pnnx-20240715-py3-none-macosx_10_9_universal2.macosx_10_9_x86_64.macosx_11_0_arm64.whl.


File hashes

Hashes for pnnx-20240715-py3-none-macosx_10_9_universal2.macosx_10_9_x86_64.macosx_11_0_arm64.whl

Algorithm    Hash digest
SHA256       866546e8e8ee58e08a2b974b6948523dc4fe14b8433ad20c029cfb2f20e4aa18
MD5          7d95c05797db504f538ac5cb54df8746
BLAKE2b-256  ec7cf4a27dde2671a3f9bb26162a0366ba1bc603c300caad9aafaf6339ff9126

