Lovely Tensors
Install
```shell
pip install lovely-tensors
```
How to use
How often do you find yourself debugging a neural network? You dump a tensor to the cell output and see this:
```python
numbers
```
```
tensor([[[-0.3541, -0.3369, -0.4054, ..., -0.5596, -0.4739, 2.2489],
         [-0.4054, -0.4226, -0.4911, ..., -0.9192, -0.8507, 2.1633],
         [-0.4739, -0.4739, -0.5424, ..., -1.0390, -1.0390, 2.1975],
         ...,
         [-0.9020, -0.8335, -0.9363, ..., -1.4672, -1.2959, 2.2318],
         [-0.8507, -0.7822, -0.9363, ..., -1.6042, -1.5014, 2.1804],
         [-0.8335, -0.8164, -0.9705, ..., -1.6555, -1.5528, 2.1119]],

        [[-0.1975, -0.1975, -0.3025, ..., -0.4776, -0.3725, 2.4111],
         [-0.2500, -0.2325, -0.3375, ..., -0.7052, -0.6702, 2.3585],
         [-0.3025, -0.2850, -0.3901, ..., -0.7402, -0.8102, 2.3761],
         ...,
         [-0.4251, -0.2325, -0.3725, ..., -1.0903, -1.0203, 2.4286],
         [-0.3901, -0.2325, -0.4251, ..., -1.2304, -1.2304, 2.4111],
         [-0.4076, -0.2850, -0.4776, ..., -1.2829, -1.2829, 2.3410]],

        [[-0.6715, -0.9853, -0.8807, ..., -0.9678, -0.6890, 2.3960],
         [-0.7238, -1.0724, -0.9678, ..., -1.2467, -1.0201, 2.3263],
         [-0.8284, -1.1247, -1.0201, ..., -1.2641, -1.1596, 2.3786],
         ...,
         [-1.2293, -1.4733, -1.3861, ..., -1.5081, -1.2641, 2.5180],
         [-1.1944, -1.4559, -1.4210, ..., -1.6476, -1.4733, 2.4308],
         [-1.2293, -1.5256, -1.5081, ..., -1.6824, -1.5256, 2.3611]]])
```
Was it really useful? What is the shape? What are the statistics? Are any of the values `nan` or `inf`? Is it an image of a man holding a tench?
```python
import torch
import lovely_tensors.tensors as lt

lt.PRINT_OPTS.color = True

# A very short tensor - no min/max
print(lt.lovely(numbers.view(-1)[:2]))
# A slightly longer tensor
print(lt.lovely(numbers.view(-1)[:6].view(2, 3)))
```
```
tensor[2] μ=-0.345 σ=0.012 x=[-0.354, -0.337]
tensor[2, 3] n=6 x∈[-0.440, -0.337] μ=-0.388 σ=0.038 x=[[-0.354, -0.337, -0.405], [-0.440, -0.388, -0.405]]
```
```python
lt.lovely(numbers)
```
```
tensor[3, 196, 196] n=115248 x∈[-2.118, 2.640] μ=-0.388 σ=1.073
```
```python
lt.lovely(numbers, depth=1)
```
```
tensor[3, 196, 196] n=115248 x∈[-2.118, 2.640] μ=-0.388 σ=1.073
  tensor[196, 196] n=38416 x∈[-2.118, 2.249] μ=-0.324 σ=1.036
  tensor[196, 196] n=38416 x∈[-1.966, 2.429] μ=-0.274 σ=0.973
  tensor[196, 196] n=38416 x∈[-1.804, 2.640] μ=-0.567 σ=1.178
```
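The per-channel lines above are ordinary reductions over the first dimension. A minimal sketch with plain torch (the random tensor here is a stand-in for `numbers`):

```python
import torch

t = torch.randn(3, 196, 196)  # stand-in for `numbers`

# One summary line per channel, mirroring the depth=1 output above
for ch in t:
    print(f"tensor{list(ch.shape)} n={ch.numel()} "
          f"x∈[{ch.min():.3f}, {ch.max():.3f}] "
          f"μ={ch.mean():.3f} σ={ch.std():.3f}")
```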
```python
t = numbers.view(-1)[:12].clone()

t[0] *= 10000
t[1] /= 10000
t[2] = float('inf')
t[3] = float('-inf')
t[4] = float('nan')

t = t.reshape((2, 6))
t
```
```
tensor([[-3.5405e+03, -3.3693e-05,         inf,        -inf,         nan, -4.0543e-01],
        [-4.2255e-01, -4.9105e-01, -5.0818e-01, -5.5955e-01, -5.4243e-01, -5.0818e-01]])
```
```python
# A spicy tensor
lt.lovely(t)
```
```
tensor[2, 6] n=12 x∈[-3.541e+03, -3.369e-05] μ=-393.776 σ=1.180e+03 +inf! -inf! nan!
```
```python
# A zero tensor
lt.lovely(torch.zeros(10, 10))
```
```
tensor[10, 10] all_zeros
```
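The `+inf! -inf! nan!` flags boil down to plain torch predicates. A rough sketch of the checks (illustrative only, not the library's actual code):

```python
import torch

t = torch.tensor([1.0, float('inf'), float('-inf'), float('nan')])

# Collect warning flags in the same spirit as lovely-tensors
flags = []
if torch.isposinf(t).any(): flags.append("+inf!")
if torch.isneginf(t).any(): flags.append("-inf!")
if torch.isnan(t).any():    flags.append("nan!")

print(" ".join(flags))  # → +inf! -inf! nan!
```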
Now the important question - is it our man?
```python
lt.rgb(numbers)
```
Maaaaybe? Looks like someone normalized him.
```python
in_stats = { "mean": (0.485, 0.456, 0.406),
             "std":  (0.229, 0.224, 0.225) }
lt.rgb(numbers, in_stats)
```
There can be no doubt, it's our hero, the Tenchman!
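`lt.rgb(numbers, in_stats)` effectively undoes the standard ImageNet normalization, i.e. `x * std + mean` per channel. A sketch of that arithmetic with plain torch (the random image is a stand-in for the normalized `numbers`):

```python
import torch

# Standard ImageNet channel statistics, shaped for broadcasting over (C, H, W)
mean = torch.tensor([0.485, 0.456, 0.406]).view(3, 1, 1)
std  = torch.tensor([0.229, 0.224, 0.225]).view(3, 1, 1)

img = torch.randn(3, 196, 196)   # stand-in for the normalized `numbers`
denormed = img * std + mean      # undo (x - mean) / std, channel-wise
```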
One last thing - let's monkey-patch `torch.Tensor` for convenience.

```python
lt.monkey_patch()
```
```python
t
```
```
tensor[2, 6] n=12 x∈[-3.541e+03, -3.369e-05] μ=-393.776 σ=1.180e+03 +inf! -inf! nan!
```
```python
t.verbose
```
```
tensor[2, 6] n=12 x∈[-3.541e+03, -3.369e-05] μ=-393.776 σ=1.180e+03 +inf! -inf! nan!
x=[[-3.5405e+03, -3.3693e-05,         inf,        -inf,         nan, -4.0543e-01],
   [-4.2255e-01, -4.9105e-01, -5.0818e-01, -5.5955e-01, -5.4243e-01, -5.0818e-01]]
```
```python
t.plain
```
```
[[-3.5405e+03, -3.3693e-05,         inf,        -inf,         nan, -4.0543e-01],
 [-4.2255e-01, -4.9105e-01, -5.0818e-01, -5.5955e-01, -5.4243e-01, -5.0818e-01]]
```
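Under the hood, `lt.monkey_patch()` attaches properties to `torch.Tensor`. A toy illustration of the mechanism (the `summary` name and format string are made up for this sketch, not the library's API):

```python
import torch

def _summary(self):
    # One-line stats, loosely in the style of lovely-tensors
    return f"tensor{list(self.shape)} n={self.numel()} μ={self.float().mean():.3f}"

# Attach as a property, so `t.summary` works without parentheses
torch.Tensor.summary = property(_summary)

print(torch.zeros(2, 3).summary)  # → tensor[2, 3] n=6 μ=0.000
```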
```python
numbers.rgb
# you can also do numbers.rgb()
```

```python
# per-channel stats
numbers.deeper
```
```
tensor[3, 196, 196] n=115248 x∈[-2.118, 2.640] μ=-0.388 σ=1.073
  tensor[196, 196] n=38416 x∈[-2.118, 2.249] μ=-0.324 σ=1.036
  tensor[196, 196] n=38416 x∈[-1.966, 2.429] μ=-0.274 σ=0.973
  tensor[196, 196] n=38416 x∈[-1.804, 2.640] μ=-0.567 σ=1.178
```
```python
# You can go even deeper if you want to
dt = torch.randn(3, 3, 5)
dt.deeper(2)
```
```
tensor[3, 3, 5] n=45 x∈[-1.398, 3.680] μ=0.312 σ=1.020
  tensor[3, 5] n=15 x∈[-1.198, 3.680] μ=0.289 σ=1.199
    tensor[5] x∈[-1.198, 3.680] μ=0.702 σ=1.804 x=[3.680, 0.083, 0.621, 0.322, -1.198]
    tensor[5] x∈[-1.122, 0.467] μ=-0.426 σ=0.739 x=[-1.122, 0.255, -1.029, -0.699, 0.467]
    tensor[5] x∈[-0.206, 1.124] μ=0.592 σ=0.512 x=[0.678, 1.124, 0.913, -0.206, 0.451]
  tensor[3, 5] n=15 x∈[-1.398, 1.547] μ=0.207 σ=0.859
    tensor[5] x∈[-0.300, 1.252] μ=0.407 σ=0.597 x=[0.558, 1.252, -0.006, 0.532, -0.300]
    tensor[5] x∈[-0.378, 1.547] μ=0.642 σ=0.900 x=[-0.378, 1.547, -0.252, 1.332, 0.964]
    tensor[5] x∈[-1.398, 0.654] μ=-0.428 σ=0.791 x=[0.021, -0.833, 0.654, -1.398, -0.583]
  tensor[3, 5] n=15 x∈[-0.969, 2.827] μ=0.440 σ=1.033
    tensor[5] x∈[-0.744, 1.555] μ=0.373 σ=0.820 x=[0.222, 1.555, -0.744, 0.326, 0.507]
    tensor[5] x∈[-0.441, 2.827] μ=1.072 σ=1.291 x=[0.428, 2.827, -0.441, 1.905, 0.641]
    tensor[5] x∈[-0.969, 0.825] μ=-0.124 σ=0.704 x=[-0.218, -0.553, -0.969, 0.297, 0.825]
```
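`.deeper(n)` recurses over the first dimension, summarizing each slice down n levels. A simplified sketch of that traversal (a hypothetical helper, not the library's code):

```python
import torch

def deeper(t, depth=1, level=0):
    # Print a one-line summary, then recurse over dim 0 up to `depth` levels
    print("  " * level + f"tensor{list(t.shape)} n={t.numel()}")
    if level < depth and t.dim() > 1:
        for sub in t:
            deeper(sub, depth, level + 1)

deeper(torch.randn(3, 3, 5), depth=2)  # 1 + 3 + 9 = 13 summary lines
```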
```python
# A quick de-norm. Don't worry, the data stays the same.
numbers.rgb(in_stats)
```