
This package is based on PyTorch and tries to provide a more user-friendly interface for it

Project description

torchplus

Introduction

torchplus is a package affiliated with the project PyCTLib. We encapsulated a new type on top of PyTorch tensors, which we call torchplus.Tensor. It has the same functionality as torch.Tensor, but it can automatically select the device it is on and provides batch and channel dimensions. We also try to provide more useful modules for torch users so that deep learning can be implemented more easily. The package requires Python v3.6+ with torch v1.7.0+. Note that torch v1.7.0 was released in 2020 and is necessary for this package, as the inheritance behavior in this version differs from previous versions. All original torch functions can be used with torchplus tensors.

Special features of torchplus are still under development. If unknown errors pop up, please use traditional torch code to bypass them; meanwhile, it would be very kind of you to let us know if anything is needed: please contact us by e-mail.
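The auto-device behaviour mentioned above can be sketched as follows. This is a hypothetical illustration of the semantics only, not torchplus's actual code; the function name pick_device is our own invention.

```python
# Hypothetical sketch of torchplus's auto-device semantics (not its
# actual implementation): with autodevice enabled, new tensors are
# placed on the GPU when one is available, otherwise on the CPU.

def pick_device(cuda_available: bool, autodevice: bool = True) -> str:
    """Return the device a new tensor would be placed on."""
    if autodevice and cuda_available:
        return "cuda"
    return "cpu"

print(pick_device(cuda_available=True))                    # cuda
print(pick_device(cuda_available=True, autodevice=False))  # cpu
```

Calling tp.set_autodevice(False), as in the demo below, corresponds to disabling this behaviour: tensors then stay on the CPU.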

>>> import torchplus as tp
>>> import torch.nn as nn
>>> tp.set_autodevice(False)
>>> tp.manual_seed(0)
>>> t = tp.randn([3000], 400, requires_grad=True)
>>> LP = nn.Linear(400, 400)
>>> a = LP(t)
>>> a.sum().backward()
>>> print(t.grad)
Tensor([[-0.2986,  0.0267,  0.9059,  ...,  0.4563, -0.1291,  0.5702],
        [-0.2986,  0.0267,  0.9059,  ...,  0.4563, -0.1291,  0.5702],
        [-0.2986,  0.0267,  0.9059,  ...,  0.4563, -0.1291,  0.5702],
        ...,
        [-0.2986,  0.0267,  0.9059,  ...,  0.4563, -0.1291,  0.5702],
        [-0.2986,  0.0267,  0.9059,  ...,  0.4563, -0.1291,  0.5702],
        [-0.2986,  0.0267,  0.9059,  ...,  0.4563, -0.1291,  0.5702]], shape=torchplus.Size([3000], 400))

torchplus has the following appealing features:

  1. Automatically assigns tensors to an available GPU device by default.
  2. Use [nbatch] or {nchannel} to specify the batch and channel dimensions, i.e. tp.rand([4], {2}, 20, 30) returns a tensor of 20×30 matrices with 2 channels and batch size 4. One may also use tensor.batch_dimension to access the batch dimension; the channel dimension can be operated on likewise.
  3. Batch and channel dimensions help to automatically match the sizes of two tensors in operations. For example, tensors of sizes (3, [2], 4) and (3, 4) can be automatically added together, with the axes of sizes 3 and 4 matched together. Some methods also use this information; sampling, for example, gives the batch dimension priority.
  4. The tensor object is compatible with all torch functions.
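The size matching in feature 3 can be illustrated with a small, self-contained sketch. This is not torchplus's implementation; the function broadcast_with_batch and its signature are hypothetical, written only to show the semantics described above.

```python
# Illustrative sketch (not torchplus's actual code) of batch-aware size
# matching: axes marked as batch/channel in one shape are treated as
# size-1 axes in the other shape, after which ordinary broadcasting
# rules apply.

def broadcast_with_batch(shape_a, shape_b, batch_axes_a=()):
    """Return the broadcast result of shape_a and shape_b, where
    batch_axes_a lists positions of batch/channel axes that are
    present in shape_a but absent from shape_b."""
    expanded = list(shape_b)
    for ax in sorted(batch_axes_a):
        expanded.insert(ax, 1)  # a missing batch axis acts like size 1
    result = []
    for a, b in zip(shape_a, expanded):
        if a == b or b == 1:
            result.append(a)
        elif a == 1:
            result.append(b)
        else:
            raise ValueError(f"cannot match sizes {a} and {b}")
    return tuple(result)

# The example from feature 3: (3, [2], 4) + (3, 4) -> shape (3, 2, 4)
print(broadcast_with_batch((3, 2, 4), (3, 4), batch_axes_a=(1,)))
```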

Installation

This package can be installed with pip install torchplus, or by moving the source code to the directory of your Python libraries (the source code can be downloaded from GitHub or PyPI).

pip install torchplus

Usages

Not available yet; one may check the code for usage examples.

Acknowledgment

@Yiteng Zhang, Yuncheng Zhou: Developers


Download files


Source Distribution

torchplus-0.2.23.tar.gz (240.5 kB)


File details

Details for the file torchplus-0.2.23.tar.gz.

File metadata

  • Download URL: torchplus-0.2.23.tar.gz
  • Upload date:
  • Size: 240.5 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.3.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/49.2.0.post20200714 requests-toolbelt/0.9.1 tqdm/4.47.0 CPython/3.8.3

File hashes

Hashes for torchplus-0.2.23.tar.gz:

  • SHA256: 64bbaac0e46360368d52a7a8e39f14ca15da5dc200e045df999f75220f553b09
  • MD5: b5627fb6ec3a544d27297f9d3a186de3
  • BLAKE2b-256: 7aac49f8f65f0702db3f0bad0893b4c0b537b8f5f53185267c6fc1d42793da22

