Lovely Tensors

Project description
Install

pip install lovely-tensors

How to use

How often do you find yourself debugging a neural network? You dump a tensor to the cell output and see this:

numbers
tensor([[[-0.3541, -0.3369, -0.4054,  ..., -0.5596, -0.4739,  2.2489],
         [-0.4054, -0.4226, -0.4911,  ..., -0.9192, -0.8507,  2.1633],
         [-0.4739, -0.4739, -0.5424,  ..., -1.0390, -1.0390,  2.1975],
         ...,
         [-0.9020, -0.8335, -0.9363,  ..., -1.4672, -1.2959,  2.2318],
         [-0.8507, -0.7822, -0.9363,  ..., -1.6042, -1.5014,  2.1804],
         [-0.8335, -0.8164, -0.9705,  ..., -1.6555, -1.5528,  2.1119]],

        [[-0.1975, -0.1975, -0.3025,  ..., -0.4776, -0.3725,  2.4111],
         [-0.2500, -0.2325, -0.3375,  ..., -0.7052, -0.6702,  2.3585],
         [-0.3025, -0.2850, -0.3901,  ..., -0.7402, -0.8102,  2.3761],
         ...,
         [-0.4251, -0.2325, -0.3725,  ..., -1.0903, -1.0203,  2.4286],
         [-0.3901, -0.2325, -0.4251,  ..., -1.2304, -1.2304,  2.4111],
         [-0.4076, -0.2850, -0.4776,  ..., -1.2829, -1.2829,  2.3410]],

        [[-0.6715, -0.9853, -0.8807,  ..., -0.9678, -0.6890,  2.3960],
         [-0.7238, -1.0724, -0.9678,  ..., -1.2467, -1.0201,  2.3263],
         [-0.8284, -1.1247, -1.0201,  ..., -1.2641, -1.1596,  2.3786],
         ...,
         [-1.2293, -1.4733, -1.3861,  ..., -1.5081, -1.2641,  2.5180],
         [-1.1944, -1.4559, -1.4210,  ..., -1.6476, -1.4733,  2.4308],
         [-1.2293, -1.5256, -1.5081,  ..., -1.6824, -1.5256,  2.3611]]])

Was that printout really useful? It leaves the important questions unanswered:

What is the shape?
What are the statistics?
Are any of the values nan or inf?
Is it an image of a man holding a tench?
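Each of these checks takes a line or two of plain code, which is exactly the boilerplate the library folds into a single summary. A rough stdlib-only sketch of the same checks on a flat list of values (`quick_stats` is a hypothetical helper, not part of lovely-tensors):

```python
import math
from statistics import fmean, pstdev

def quick_stats(values):
    """Answer the usual debugging questions for a flat list of floats."""
    finite = [v for v in values if math.isfinite(v)]
    return {
        "n": len(values),
        "min": min(finite), "max": max(finite),
        "mean": fmean(finite), "std": pstdev(finite),
        "has_nan": any(math.isnan(v) for v in values),
        "has_inf": any(math.isinf(v) for v in values),
    }

stats = quick_stats([-0.354, -0.337, float("nan")])
```

The library's value is printing all of this at once, every time, without you asking.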

import torch
import lovely_tensors.tensors as lt

lt.PRINT_OPTS.color = True
# A very short tensor - no min/max
print(lt.lovely(numbers.view(-1)[:2]))
# A slightly longer tensor
print(lt.lovely(numbers.view(-1)[:6].view(2,3)))
tensor[2] μ=-0.345 σ=0.012 x=[-0.354, -0.337]
tensor[2, 3] n=6 x∈[-0.440, -0.337] μ=-0.388 σ=0.038 x=[[-0.354, -0.337, -0.405], [-0.440, -0.388, -0.405]]
lt.lovely(numbers)
tensor[3, 196, 196] n=115248 x∈[-2.118, 2.640] μ=-0.388 σ=1.073
lt.lovely(numbers, depth=1)
tensor[3, 196, 196] n=115248 x∈[-2.118, 2.640] μ=-0.388 σ=1.073
  tensor[196, 196] n=38416 x∈[-2.118, 2.249] μ=-0.324 σ=1.036
  tensor[196, 196] n=38416 x∈[-1.966, 2.429] μ=-0.274 σ=0.973
  tensor[196, 196] n=38416 x∈[-1.804, 2.640] μ=-0.567 σ=1.178
t = numbers.view(-1)[:12].clone()

t[0] *= 10000
t[1] /= 10000
t[2] = float('inf')
t[3] = float('-inf')
t[4] = float('nan')
t = t.reshape((2,6))
# A spicy tensor
lt.lovely(t)
tensor[2, 6] n=12 x∈[-3.541e+03, -3.369e-05] μ=-393.776 σ=1.180e+03 +inf! -inf! nan!
# A zero tensor
lt.lovely(torch.zeros(10, 10))
tensor[10, 10] n=100 all_zeros
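The compact one-line format above could be reproduced along these lines (a stdlib-only sketch with made-up names, not the library's implementation):

```python
import math
from statistics import fmean, pstdev

def summary(values, shape):
    # Mimic the one-line format: shape, count, range, mean/std, warning flags
    finite = [v for v in values if math.isfinite(v)]
    if values and all(v == 0 for v in values):
        return f"tensor{list(shape)} n={len(values)} all_zeros"
    s = (f"tensor{list(shape)} n={len(values)} "
         f"x∈[{min(finite):.3f}, {max(finite):.3f}] "
         f"μ={fmean(finite):.3f} σ={pstdev(finite):.3f}")
    if any(v == math.inf for v in values):  s += " +inf!"
    if any(v == -math.inf for v in values): s += " -inf!"
    if any(math.isnan(v) for v in values):  s += " nan!"
    return s

spicy = summary([float("inf"), float("nan"), 1.0, -2.0], (2, 2))
zeros = summary([0.0] * 100, (10, 10))
```

The warning flags (`+inf!`, `nan!`, `all_zeros`) are the point: they surface silent failure modes you would otherwise scroll past.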

Now the important question: is it our man?

lt.rgb(numbers)

Maaaaybe? Looks like someone normalized him.

in_stats = { "mean": (0.485, 0.456, 0.406),
             "std": (0.229, 0.224, 0.225) }
lt.rgb(numbers, in_stats)

There can be no doubt, it’s our hero the Tenchman!
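Passing in_stats presumably tells lt.rgb to invert the standard channel-wise normalization, x * std + mean, before rendering. A sketch of that arithmetic (not the library's code):

```python
in_stats = {"mean": (0.485, 0.456, 0.406),
            "std":  (0.229, 0.224, 0.225)}

def denorm_pixel(x, channel):
    # Invert the (x - mean) / std transform applied at preprocessing time
    return x * in_stats["std"][channel] + in_stats["mean"][channel]

# A value of 0 in normalized space maps back to the channel mean
restored = denorm_pixel(0.0, 0)
```

These particular mean/std values are the usual ImageNet preprocessing statistics, which is why the image displays correctly once they are undone.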

One last thing - let’s monkey-patch torch.Tensor for convenience.

lt.monkey_patch()
t
tensor[2, 6] n=12 x∈[-3.541e+03, -3.369e-05] μ=-393.776 σ=1.180e+03 +inf! -inf! nan!
t.verbose
tensor[2, 6] n=12 x∈[-3.541e+03, -3.369e-05] μ=-393.776 σ=1.180e+03 +inf! -inf! nan!
x=[[-3.5405e+03, -3.3693e-05,         inf,        -inf,         nan, -4.0543e-01],
   [-4.2255e-01, -4.9105e-01, -5.0818e-01, -5.5955e-01, -5.4243e-01, -5.0818e-01]]
t.plain
[[-3.5405e+03, -3.3693e-05,         inf,        -inf,         nan, -4.0543e-01],
 [-4.2255e-01, -4.9105e-01, -5.0818e-01, -5.5955e-01, -5.4243e-01, -5.0818e-01]]
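Monkey-patching works by attaching new attributes to a class at runtime, so every existing instance gains them too. The same mechanism in pure Python, with a stand-in class instead of torch.Tensor (all names here are hypothetical):

```python
class FakeTensor:
    # Stand-in for torch.Tensor, just to show the mechanism
    def __init__(self, data):
        self.data = data

def _lovely(self):
    return f"tensor[{len(self.data)}]"

# What monkey_patch() does in spirit: attach a read-only property,
# so instances gain `.lovely` without subclassing.
FakeTensor.lovely = property(_lovely)

t = FakeTensor([1.0, 2.0, 3.0])
```

This is why `t.verbose` and `t.plain` work as attributes rather than method calls.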
numbers.rgb
# you can also do numbers.rgb()

# Per-channel stats
numbers.deeper
tensor[3, 196, 196] n=115248 x∈[-2.118, 2.640] μ=-0.388 σ=1.073
  tensor[196, 196] n=38416 x∈[-2.118, 2.249] μ=-0.324 σ=1.036
  tensor[196, 196] n=38416 x∈[-1.966, 2.429] μ=-0.274 σ=0.973
  tensor[196, 196] n=38416 x∈[-1.804, 2.640] μ=-0.567 σ=1.178
# You can go even deeper if you want to
dt = torch.randn(3, 3, 5)
dt.deeper(2)
tensor[3, 3, 5] n=45 x∈[-2.541, 2.153] μ=0.110 σ=0.988
  tensor[3, 5] n=15 x∈[-1.115, 2.153] μ=0.161 σ=0.967
    tensor[5] x∈[-1.115, 1.054] μ=0.217 σ=0.828 x=[0.612, 1.054, 0.501, 0.033, -1.115]
    tensor[5] x∈[-0.765, 2.153] μ=0.544 σ=1.264 x=[-0.765, 0.750, -0.688, 2.153, 1.271]
    tensor[5] x∈[-1.057, 0.930] μ=-0.277 σ=0.747 x=[-0.675, -0.239, 0.930, -1.057, -0.345]
  tensor[3, 5] n=15 x∈[-2.541, 2.136] μ=-0.056 σ=1.276
    tensor[5] x∈[-1.441, 1.657] μ=0.163 σ=1.137 x=[-1.441, -0.255, 0.584, 0.269, 1.657]
    tensor[5] x∈[-2.541, 2.136] μ=0.312 σ=1.894 x=[2.136, -0.509, 0.737, 1.735, -2.541]
    tensor[5] x∈[-1.273, -0.266] μ=-0.643 σ=0.400 x=[-0.646, -1.273, -0.331, -0.266, -0.700]
  tensor[3, 5] n=15 x∈[-1.279, 1.296] μ=0.225 σ=0.677
    tensor[5] x∈[-0.400, 1.296] μ=0.513 σ=0.614 x=[-0.400, 0.412, 0.738, 0.520, 1.296]
    tensor[5] x∈[-0.833, 0.788] μ=0.093 σ=0.605 x=[0.267, 0.788, -0.094, 0.339, -0.833]
    tensor[5] x∈[-1.279, 0.905] μ=0.069 σ=0.839 x=[-0.054, 0.905, 0.592, -1.279, 0.181]
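deeper likely just recurses over the first dimension, emitting one summary line per sub-tensor down to the requested depth. A stdlib sketch over nested lists (not the library's code):

```python
from statistics import fmean

def flatten(x):
    # Flatten an arbitrarily nested list of floats
    return [v for sub in x
              for v in (flatten(sub) if isinstance(sub, list) else [sub])]

def deeper(x, depth=1, indent=0):
    # One summary line per level, recursing over the first dimension
    flat = flatten(x) if isinstance(x[0], list) else x
    lines = [" " * indent + f"n={len(flat)} μ={fmean(flat):.3f}"]
    if depth > 0 and isinstance(x[0], list):
        for sub in x:
            lines += deeper(sub, depth - 1, indent + 2)
    return lines

lines = deeper([[1.0, 2.0], [3.0, 4.0]], depth=1)
```

Each level of recursion indents by two spaces, matching the nested output shown above.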
# A quick de-norm. Don't worry, the data stays the same.
numbers.rgb(in_stats)

Download files

Download the file for your platform.

Source Distribution

lovely-tensors-0.0.4.tar.gz (12.4 kB view details)

Uploaded Source

Built Distribution

lovely_tensors-0.0.4-py3-none-any.whl (14.6 kB view details)

Uploaded Python 3

