
pnnx is an open standard for PyTorch model interoperability.

Project description

pnnx

Python wrapper of pnnx. Only Python 3.7+ is currently supported.

Install from pip

pnnx is available as wheel packages for macOS, Windows, and Linux distributions. You can install it with pip:

pip install pnnx
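A minimal post-install check (a sketch that only verifies the package can be imported):

# Verify the installation by importing the package and printing where it was installed.
import pnnx
print(pnnx.__file__)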

Build & Install from source

Prerequisites

On Unix (Linux, macOS)

  • A compiler with C++14 support
  • CMake >= 3.4

On Windows

  • Visual Studio 2015 or higher
  • CMake >= 3.4

Build & install

  1. Clone ncnn.
git clone https://github.com/Tencent/ncnn.git
  2. Install PyTorch.

Install PyTorch according to https://pytorch.org/. Anaconda is strongly recommended, for example:

conda install pytorch
  3. Install pnnx.
cd /pathto/ncnn/tools/pnnx/python
python setup.py install

Note: If torchvision and pnnx2onnx support are needed, you can set the following environment variables before running 'python setup.py install' to enable them, e.g. on Ubuntu:

export TORCHVISION_INSTALL_DIR="/project/torchvision"
export PROTOBUF_INCLUDE_DIR="/project/protobuf/include"
export PROTOBUF_LIBRARIES="/project/protobuf/lib64/libprotobuf.a"
export PROTOBUF_PROTOC_EXECUTABLE="/project/protobuf/bin/protoc" 

To do this, you must install Torchvision and Protobuf first.

Tests

cd /pathto/ncnn/tools/pnnx/python
pytest tests

Usage

  1. Export a model to pnnx (a sketch for checking the generated files follows these examples)
import torch
import torchvision.models as models
import pnnx

net = models.resnet18(pretrained=True)
x = torch.rand(1, 3, 224, 224)

# You could try disabling checking when torch tracing raises error
# opt_net = pnnx.export(net, "resnet18.pt", x, check_trace=False)
opt_net = pnnx.export(net, "resnet18.pt", x)
  2. Convert an existing TorchScript model to pnnx
import torch
import pnnx

x = torch.rand(1, 3, 224, 224)
opt_net = pnnx.convert("resnet18.pt", x)
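After an export like the one in step 1, output files are written using the default name patterns documented in the API reference below, with * replaced by the model name (here resnet18). A small sketch for checking which files were generated, assuming the default output names were not overridden:

import os

# Default output file names documented in the API reference, with * expanded to "resnet18".
generated = [
    "resnet18.pnnx.param", "resnet18.pnnx.bin", "resnet18_pnnx.py",
    "resnet18.ncnn.param", "resnet18.ncnn.bin", "resnet18_ncnn.py",
]
for path in generated:
    print(path, "found" if os.path.exists(path) else "not found")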

API Reference

  1. pnnx.export

model (torch.nn.Module): model to be exported.

ptpath (str): the TorchScript file name.

inputs (torch.Tensor or list of torch.Tensor): expected inputs of the model.

inputs2 (torch.Tensor or list of torch.Tensor): alternative inputs of the model. Usually it is used with input_shapes to resolve dynamic shapes.

input_shapes (Optional, list of int or list of lists of int): shapes of the model inputs, used to resolve tensor shapes in the model graph. For example, [1,3,224,224] for a model with only 1 input, or [[1,3,224,224],[1,3,224,224]] for a model with 2 inputs.

input_types (Optional, str or list of str): types of the model inputs; it should have the same length as input_shapes. For example, "f32" for a model with only 1 input, or ["f32", "f32"] for a model with 2 inputs.

typename  torch type
f32       torch.float32 or torch.float
f64       torch.float64 or torch.double
f16       torch.float16 or torch.half
u8        torch.uint8
i8        torch.int8
i16       torch.int16 or torch.short
i32       torch.int32 or torch.int
i64       torch.int64 or torch.long
c32       torch.complex32
c64       torch.complex64
c128      torch.complex128

input_shapes2 (Optional, list of int or list of lists of int): shapes of the alternative model inputs; the format is identical to input_shapes. Usually it is used together with input_shapes to resolve dynamic shapes (-1) in the model graph.

input_types2 (Optional, str or list of str): types of the alternative model inputs.
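A hedged sketch of how the shape-related parameters fit together for a single-input model whose spatial size should be treated as dynamic; the second input size (320x320) and the exact keyword usage are illustrative assumptions based on the descriptions above:

import torch
import torchvision.models as models
import pnnx

net = models.resnet18(pretrained=True)

# Two concrete inputs with different spatial sizes; dimensions that differ between
# the two shape descriptions can be resolved as dynamic (-1) in the graph.
# (Keyword names follow the parameter list above.)
x = torch.rand(1, 3, 224, 224)
x2 = torch.rand(1, 3, 320, 320)

opt_net = pnnx.export(net, "resnet18.pt", x,
                      inputs2=x2,
                      input_shapes=[1, 3, 224, 224], input_types="f32",
                      input_shapes2=[1, 3, 320, 320], input_types2="f32")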

device (Optional, str, default="cpu"): device type for the inputs of the TorchScript model, "cpu" or "gpu".

customop (Optional, str or list of str): Torch extensions (dynamic libraries) for custom operators, for example "/home/nihui/.cache/torch_extensions/fused/fused.so" or ["/home/nihui/.cache/torch_extensions/fused/fused.so", ...].

moduleop (Optional, str or list of str): modules to keep as one big operator, for example "models.common.Focus" or ["models.common.Focus","models.yolo.Detect"].
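For example, keeping the YOLOv5 Focus and Detect modules quoted above as single operators might look like this (a sketch; net and x are assumed to be a loaded model and an example input):

import pnnx

# "net" and "x" are assumed to be a loaded YOLOv5-style model and an example input tensor.
# Keep whole modules as single operators instead of tracing into their internals.
opt_net = pnnx.export(net, "yolov5s.pt", x,
                      moduleop=["models.common.Focus", "models.yolo.Detect"])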

optlevel (Optional, int, default=2): graph optimization level.

option  optimization level
0       do not apply optimization
1       do not apply optimization
2       more optimization for inference

pnnxparam (Optional, str, default="*.pnnx.param", * is the model name): PNNX graph definition file.

pnnxbin (Optional, str, default="*.pnnx.bin"): PNNX model weight.

pnnxpy (Optional, str, default="*_pnnx.py"): PyTorch script for inference, including model construction and weight initialization code.

pnnxonnx (Optional, str, default="*.pnnx.onnx"): PNNX model in onnx format.

ncnnparam (Optional, str, default="*.ncnn.param"): ncnn graph definition.

ncnnbin (Optional, str, default="*.ncnn.bin"): ncnn model weight.

ncnnpy (Optional, str, default="*_ncnn.py"): pyncnn script for inference.
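The output-file parameters above can also be overridden to redirect where the generated files are written; a minimal sketch (net and x are the model and input from the usage example, and the out/ directory is an assumption):

# Redirect the generated files instead of using the "*"-based default names.
# "net" and "x" are assumed to be defined as in the export example above.
opt_net = pnnx.export(net, "resnet18.pt", x,
                      pnnxparam="out/resnet18.pnnx.param",
                      pnnxbin="out/resnet18.pnnx.bin",
                      pnnxpy="out/resnet18_pnnx.py",
                      ncnnparam="out/resnet18.ncnn.param",
                      ncnnbin="out/resnet18.ncnn.bin",
                      ncnnpy="out/resnet18_ncnn.py")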

  2. pnnx.convert

ptpath (str): torchscript model to be converted.

Other parameters are the same as those of pnnx.export.
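Since the remaining parameters mirror pnnx.export, converting an existing TorchScript file with explicit shape and type information might look like this (a sketch; the keyword usage is assumed to match the export parameters above):

import torch
import pnnx

# Convert an existing TorchScript file ("resnet18.pt" is assumed to exist, e.g. from the export example).
x = torch.rand(1, 3, 224, 224)
opt_net = pnnx.convert("resnet18.pt", x,
                       input_shapes=[1, 3, 224, 224], input_types="f32")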


Download files

Download the file for your platform.

Source Distributions

No source distribution files are available for this release.

Built Distributions

pnnx-20240410-py3-none-win_amd64.whl (13.0 MB): Python 3, Windows x86-64

pnnx-20240410-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl (18.3 MB): Python 3, manylinux: glibc 2.17+ x86-64

pnnx-20240410-py3-none-manylinux2014_aarch64.manylinux_2_17_aarch64.whl (16.8 MB): Python 3, manylinux: glibc 2.17+ ARM64

pnnx-20240410-py3-none-macosx_10_9_universal2.macosx_10_9_x86_64.macosx_11_0_arm64.whl (36.6 MB): Python 3, macOS 10.9+ universal2 (ARM64, x86-64), macOS 10.9+ x86-64, macOS 11.0+ ARM64

File details

Details for the file pnnx-20240410-py3-none-win_amd64.whl.

File metadata

  • Download URL: pnnx-20240410-py3-none-win_amd64.whl
  • Upload date:
  • Size: 13.0 MB
  • Tags: Python 3, Windows x86-64
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/5.0.0 CPython/3.12.3

File hashes

Hashes for pnnx-20240410-py3-none-win_amd64.whl
Algorithm Hash digest
SHA256 8cda860d365b3c2eb05c38a0277696ccb91e1b7e03423bb11501cce052a34f86
MD5 ad2c523536d6a1646e28f77a75d2ce70
BLAKE2b-256 c9adf656856947309c0f59a2215a8b383569a2cd98f6acf6b1bf35916edddc28


File details

Details for the file pnnx-20240410-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl.

File metadata

File hashes

Hashes for pnnx-20240410-py3-none-manylinux2014_x86_64.manylinux_2_17_x86_64.whl
Algorithm Hash digest
SHA256 98fbeb2942f3b05f72d2a3cdba5cc5277a1dd5008912001608b2a20e2e6a18e0
MD5 8e14a6d72b8cc1d04a8532ff25ff6daf
BLAKE2b-256 5c598b01afd38a81cd59e744903ff46836aa91e52e6b559b7e12791d65697836


File details

Details for the file pnnx-20240410-py3-none-manylinux2014_aarch64.manylinux_2_17_aarch64.whl.

File metadata

File hashes

Hashes for pnnx-20240410-py3-none-manylinux2014_aarch64.manylinux_2_17_aarch64.whl
Algorithm Hash digest
SHA256 3a73c1bc266a0a36134b31bd46e48dea4f986f5fd22f1e467c325d7bdf6c1439
MD5 d013485f40c5d4e8acb3eaa9d9d51705
BLAKE2b-256 62ce9e9d6da18bf74b61a05a8452cbfabd86e14652242bc0cb9f322f7644ee6e


File details

Details for the file pnnx-20240410-py3-none-macosx_10_9_universal2.macosx_10_9_x86_64.macosx_11_0_arm64.whl.

File metadata

File hashes

Hashes for pnnx-20240410-py3-none-macosx_10_9_universal2.macosx_10_9_x86_64.macosx_11_0_arm64.whl
Algorithm Hash digest
SHA256 efd7e7c07faea8e63015498883e76b2b014fa6975b90bfbf852417bcde4ccf5c
MD5 0ddd0a07407c5a34e145adb3f57236c2
BLAKE2b-256 031a25b579a1d9d6b7bfb47c868ba2054385ff7080f07fd8a5df80e4e07e65f4

