

Project description

Lovely Tensors

Install

pip install lovely-tensors

How to use

How often do you find yourself debugging PyTorch code? You dump a tensor to the cell output and see this:

numbers
tensor([[[ ...screen after screen of raw values... ]]])

Was it really useful?

What is the shape?
What are the statistics?
Are any of the values nan or inf?
Is it an image of a man holding a tench?
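Note: numbers is not defined on this page. In the original notebook it is a normalized image of a man holding a tench; a similar tensor could be made roughly like this (a sketch, assuming torchvision is installed and "tench.jpg" is any RGB image on disk):

from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize((196, 196)),                    # match the [3, 196, 196] shape above
    transforms.ToTensor(),                            # PIL image -> CHW float in [0, 1]
    transforms.Normalize(mean=(0.485, 0.456, 0.406),  # the usual ImageNet statistics
                         std=(0.229, 0.224, 0.225)),
])
numbers = preprocess(Image.open("tench.jpg"))         # tensor of shape [3, 196, 196]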

import lovely_tensors.tensors as lt
lt.PRINT_OPTS.color = True
lt.monkey_patch()
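
Conceptually, monkey_patch() swaps in a custom __repr__ for torch.Tensor, so the summary shows up anywhere a tensor gets printed. A rough sketch of the idea (not the library's actual code; _lovely_repr is a hypothetical helper):

import torch

def _lovely_repr(t: torch.Tensor) -> str:
    # Hypothetical one-line summary, just to illustrate the idea.
    return (f"tensor{list(t.shape)} n={t.numel()} "
            f"x∈[{t.min().item():.3f}, {t.max().item():.3f}] "
            f"μ={t.float().mean().item():.3f} σ={t.float().std().item():.3f}")

torch.Tensor.__repr__ = _lovely_repr  # every repr(tensor) now prints the summary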

__repr__()

# A very short tensor - no min/max
numbers.flatten()[:2]
tensor[2] μ=-0.345 σ=0.012 [-0.354, -0.337]
# A slightly longer one
numbers.flatten()[:6].view(2,3)
tensor[2, 3] n=6 x∈[-0.440, -0.337] μ=-0.388 σ=0.038 [[-0.354, -0.337, -0.405], [-0.440, -0.388, -0.405]]
# Too long to show the values
numbers
tensor[3, 196, 196] n=115248 x∈[-2.118, 2.640] μ=-0.388 σ=1.073
spicy = numbers.flatten()[:12].clone()

spicy[0] *= 10000
spicy[1] /= 10000
spicy[2] = float('inf')
spicy[3] = float('-inf')
spicy[4] = float('nan')

spicy = spicy.reshape((2,6))
spicy
tensor[2, 6] n=12 x∈[-3.541e+03, -3.369e-05] μ=-393.776 σ=1.180e+03 +inf! -inf! nan!
# A zero tensor
torch.zeros(10, 10)
tensor[10, 10] n=100 all_zeros
spicy.verbose
tensor[2, 6] n=12 x∈[-3.541e+03, -3.369e-05] μ=-393.776 σ=1.180e+03 +inf! -inf! nan!
[[-3.5405e+03, -3.3693e-05,         inf,        -inf,         nan, -4.0543e-01],
 [-4.2255e-01, -4.9105e-01, -5.0818e-01, -5.5955e-01, -5.4243e-01, -5.0818e-01]]
spicy.plain
[[-3.5405e+03, -3.3693e-05,         inf,        -inf,         nan, -4.0543e-01],
 [-4.2255e-01, -4.9105e-01, -5.0818e-01, -5.5955e-01, -5.4243e-01, -5.0818e-01]]
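
The +inf! -inf! nan! flags call out any positive infinity, negative infinity, or NaN anywhere in the tensor. The manual checks they replace look something like this (plain PyTorch, using spicy from above):

import torch

torch.isposinf(spicy).any()  # tensor(True)  -> reported as "+inf!"
torch.isneginf(spicy).any()  # tensor(True)  -> reported as "-inf!"
torch.isnan(spicy).any()     # tensor(True)  -> reported as "nan!"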

Going .deeper

numbers.deeper
tensor[3, 196, 196] n=115248 x∈[-2.118, 2.640] μ=-0.388 σ=1.073
  tensor[196, 196] n=38416 x∈[-2.118, 2.249] μ=-0.324 σ=1.036
  tensor[196, 196] n=38416 x∈[-1.966, 2.429] μ=-0.274 σ=0.973
  tensor[196, 196] n=38416 x∈[-1.804, 2.640] μ=-0.567 σ=1.178
# You can go deeper if you need to
dt = torch.randn(3, 3, 5)
dt.deeper(2)
tensor[3, 3, 5] n=45 x∈[-2.201, 1.498] μ=-0.151 σ=0.927
  tensor[3, 5] n=15 x∈[-1.864, 1.498] μ=0.176 σ=0.973
    tensor[5] x∈[-0.805, 1.408] μ=0.470 σ=0.917 [0.092, 0.346, 1.308, -0.805, 1.408]
    tensor[5] x∈[-0.678, 0.909] μ=0.358 σ=0.689 [0.909, -0.024, 0.818, 0.765, -0.678]
    tensor[5] x∈[-1.864, 1.498] μ=-0.299 σ=1.252 [0.239, 1.498, -1.864, -0.640, -0.726]
  tensor[3, 5] n=15 x∈[-2.201, 1.155] μ=-0.421 σ=0.954
    tensor[5] x∈[-0.667, 1.155] μ=0.201 σ=0.732 [-0.391, 0.468, -0.667, 1.155, 0.441]
    tensor[5] x∈[-2.201, -0.012] μ=-1.358 σ=0.844 [-1.926, -1.329, -2.201, -0.012, -1.318]
    tensor[5] x∈[-0.503, 0.723] μ=-0.108 σ=0.488 [-0.334, -0.503, -0.341, 0.723, -0.083]
  tensor[3, 5] n=15 x∈[-1.292, 1.120] μ=-0.208 σ=0.803
    tensor[5] x∈[-1.292, 0.886] μ=-0.284 σ=0.951 [-1.292, 0.886, 0.146, 0.082, -1.242]
    tensor[5] x∈[-1.160, 1.120] μ=-0.130 σ=0.924 [-1.160, 0.359, -0.873, 1.120, -0.095]
    tensor[5] x∈[-0.982, 0.572] μ=-0.211 σ=0.694 [0.572, 0.407, -0.982, -0.261, -0.790]
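
.deeper walks the leading dimension and prints a summary per slice. With monkey_patch() active, one level of it is roughly equivalent to this loop (a sketch):

for i, channel in enumerate(numbers):
    print(f"  channel {i}:", repr(channel))  # repr() is the patched summary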

Now in .rgb colour

The important question: is it our man?

numbers.rgb

Maaaaybe? Looks like someone normalized him.

in_stats = { "mean": (0.485, 0.456, 0.406),
             "std": (0.229, 0.224, 0.225) }
numbers.rgb(in_stats)

It’s indeed our hero, the Tenchman!
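
.rgb(in_stats) undoes the normalization with the given per-channel statistics and shows the result as an image. A manual stand-in with matplotlib might look like this (a sketch, not the library's code):

import torch
import matplotlib.pyplot as plt

mean = torch.tensor(in_stats["mean"]).view(3, 1, 1)
std  = torch.tensor(in_stats["std"]).view(3, 1, 1)
img  = (numbers * std + mean).clamp(0, 1)  # undo (x - mean) / std, clip to [0, 1]
plt.imshow(img.permute(1, 2, 0))           # CHW -> HWC for imshow
plt.axis("off")
plt.show()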

.plt the statistics

(numbers+3).plt

(numbers+3).plt(center="mean")

(numbers+3).plt(center="range")
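
.plt draws the distribution of the tensor's values, with center controlling how the plot is centered. A rough matplotlib stand-in (a sketch; the library's plot also marks μ, σ and the min/max):

import matplotlib.pyplot as plt

vals = (numbers + 3).flatten()
plt.hist(vals.numpy(), bins=100)                 # distribution of the values
plt.axvline(vals.mean().item(), linestyle="--")  # cf. center="mean"
plt.show()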

Without .monkey_patch

lt.lovely(spicy)
tensor[2, 6] n=12 x∈[-3.541e+03, -3.369e-05] μ=-393.776 σ=1.180e+03 +inf! -inf! nan!
lt.lovely(spicy, verbose=True)
tensor[2, 6] n=12 x∈[-3.541e+03, -3.369e-05] μ=-393.776 σ=1.180e+03 +inf! -inf! nan!
[[-3.5405e+03, -3.3693e-05,         inf,        -inf,         nan, -4.0543e-01],
 [-4.2255e-01, -4.9105e-01, -5.0818e-01, -5.5955e-01, -5.4243e-01, -5.0818e-01]]
lt.lovely(numbers, depth=1)
tensor[3, 196, 196] n=115248 x∈[-2.118, 2.640] μ=-0.388 σ=1.073
  tensor[196, 196] n=38416 x∈[-2.118, 2.249] μ=-0.324 σ=1.036
  tensor[196, 196] n=38416 x∈[-1.966, 2.429] μ=-0.274 σ=0.973
  tensor[196, 196] n=38416 x∈[-1.804, 2.640] μ=-0.567 σ=1.178
lt.rgb(numbers, in_stats)

lt.plot(numbers, center="mean")

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.

Source Distribution

lovely-tensors-0.0.6.tar.gz (15.9 kB)

Uploaded Source

Built Distribution

lovely_tensors-0.0.6-py3-none-any.whl (16.8 kB)

Uploaded Python 3

File details

Details for the file lovely-tensors-0.0.6.tar.gz.

File metadata

  • Download URL: lovely-tensors-0.0.6.tar.gz
  • Upload date:
  • Size: 15.9 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/4.0.1 CPython/3.10.6

File hashes

Hashes for lovely-tensors-0.0.6.tar.gz
Algorithm Hash digest
SHA256 11204c6050a4b2eeb3fc75255edc2543ed2b646cc577b26a108cf160877c038e
MD5 33feb226b982ca48d027e3534888d28c
BLAKE2b-256 4a97fee05302b2c0f863f6fe5d19a39b46bbf46c1e37fc0c81f86b886cdde682

See more details on using hashes here.

File details

Details for the file lovely_tensors-0.0.6-py3-none-any.whl.

File metadata

File hashes

Hashes for lovely_tensors-0.0.6-py3-none-any.whl
Algorithm Hash digest
SHA256 ba671b6f750314b520a9a5603a3d239e3c369cdd4e9cf97d2b1d4a1db3e79b36
MD5 642fb72e108628b8f3e855b0b61aeec3
BLAKE2b-256 db7dad4d1ec72f7bdfc177501cee1f9be40101fa123207effdb8ee9bed738f5a

See more details on using hashes here.
