# FlaxSR

Super Resolution models with Jax/Flax
Note: Flax currently installs CUDA/cuDNN from its wheel, while TensorFlow uses the locally installed CUDA/cuDNN, which can cause conflicts. We will fix this as soon as possible.
## HOW TO USE

### Install

```shell
pip install flaxsr
```

### Usage
You can easily load models and losses, and train a model using the custom train states.
- Train example
```python
import flaxsr
import jax
import jax.numpy as jnp
import numpy as np
import optax

model_kwargs = {
    'n_filters': 64, 'n_blocks': 8, 'scale': 4
}
model = flaxsr.get("models", "vdsr", **model_kwargs)  # Equivalent to flaxsr.models.VDSR(**model_kwargs)

losses = [
    flaxsr.losses.L1Loss(reduce='sum'),
    flaxsr.get('losses', 'vgg', feats_from=(6, 8, 14,), before_act=False, reduce='mean')
]
loss_weights = (.1, 1.)
loss_wrapper = flaxsr.losses.LossWrapper(losses, loss_weights)

params = model.init(jax.random.PRNGKey(0), jnp.ones((1, 8, 8, 3), dtype=jnp.float32))
tx = optax.adam(1e-3)
state = flaxsr.training.TrainState.create(
    apply_fn=model.apply, params=params, tx=tx, losses=loss_wrapper
)

hr = jnp.ones((1, 32, 32, 3), dtype=jnp.float32)
lr = jnp.ones((1, 8, 8, 3), dtype=jnp.float32)
batch = (lr, hr)

state_new, loss = flaxsr.training.discriminative_train_step(state, batch)
assert state_new.step == 1
# The optimizer step should have updated the parameters.
assert np.not_equal(
    state_new.params['params']['Conv_0']['kernel'],
    state.params['params']['Conv_0']['kernel']
).any()
```
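The `LossWrapper` above combines the configured losses as a weighted sum. As a rough illustration of that idea in plain NumPy (not flaxsr's actual implementation; `l1` and `l2` here are simple stand-ins for the library's loss classes):

```python
import numpy as np

def l1(sr, hr):
    # Mean absolute error between prediction and target
    return np.abs(sr - hr).mean()

def l2(sr, hr):
    # Mean squared error between prediction and target
    return ((sr - hr) ** 2).mean()

def weighted_loss(losses, weights, sr, hr):
    """Weighted sum of individual loss values, mirroring LossWrapper's role."""
    return sum(w * fn(sr, hr) for fn, w in zip(losses, weights))

sr = np.zeros((1, 32, 32, 3), dtype=np.float32)
hr = np.ones((1, 32, 32, 3), dtype=np.float32)
total = weighted_loss([l1, l2], (0.1, 1.0), sr, hr)
# Here l1 == 1.0 and l2 == 1.0, so total == 0.1 * 1.0 + 1.0 * 1.0 == 1.1
```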
## flaxsr.get keywords
- models
  - SRCNN: srcnn
  - FSRCNN: fsrcnn
  - ESPCN: espcn
  - VDSR: vdsr
  - EDSR: edsr
  - MDSR: mdsr
  - SRResNet: srresnet
  - SRGAN: srgan
  - NCNet: ncnet
- losses
  - L1Loss: l1
  - L2Loss: l2
  - CharbonnierLoss: charbonnier
  - VGGLoss: vgg
  - MinmaxDriscriminatorLoss: minmax_discriminator
  - MinmaxGeneratorLoss: minmax_generator
  - LeastSquareDiscriminatorLoss: least_square_discriminator
  - LeastSquareGeneratorLoss: least_square_generator
  - RelativisticDiscriminatorLoss: relativistic_discriminator
  - RelativisticGeneratorLoss: relativistic_generator
  - TotalVariationLoss: tv
  - FrequencyReconstructionLoss: freq_recon
  - EdgeLoss: edge
- layers
  - DropPath: droppath
  - DropPathFast: droppath_fast
  - PixelShuffle: pixelshuffle
  - NearestConv: nearestconv
- train_step
  - discriminative_train_step: discriminative
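The `pixelshuffle` layer listed above performs the standard depth-to-space rearrangement used by ESPCN-style upsamplers: channels are traded for spatial resolution. A minimal NumPy sketch of that operation (an illustration, not flaxsr's code):

```python
import numpy as np

def pixel_shuffle(x, scale):
    """Depth-to-space: (N, H, W, C * scale**2) -> (N, H * scale, W * scale, C)."""
    n, h, w, c = x.shape
    c_out = c // (scale * scale)
    # Split the channel axis into the two scale factors plus the output channels
    x = x.reshape(n, h, w, scale, scale, c_out)
    # Interleave each scale factor with its spatial axis
    x = x.transpose(0, 1, 3, 2, 4, 5)
    return x.reshape(n, h * scale, w * scale, c_out)

x = np.arange(1 * 2 * 2 * 12, dtype=np.float32).reshape(1, 2, 2, 12)
y = pixel_shuffle(x, 2)
assert y.shape == (1, 4, 4, 3)  # 2x2 spatial grid upscaled by a factor of 2
```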
## File details

Details for the file flaxsr-0.0.7.tar.gz.

### File metadata

- Download URL: flaxsr-0.0.7.tar.gz
- Upload date:
- Size: 17.6 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.3

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 2d4fea02bcde056b27f91b36df4c6122cbe7348807d4da6c1ce4059c68533432 |
| MD5 | 577e9d0a9d15db7ba168ccab8249dd02 |
| BLAKE2b-256 | 94c600f33305e9bbf5f5b39d49840acd3a5e66895ad251b81b586c86a9df55c6 |
## File details

Details for the file flaxsr-0.0.7-py3-none-any.whl.

### File metadata

- Download URL: flaxsr-0.0.7-py3-none-any.whl
- Upload date:
- Size: 24.4 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/4.0.2 CPython/3.11.3

### File hashes

| Algorithm | Hash digest |
|---|---|
| SHA256 | 99682201cf5555ee44ded2ccb2fc207faa1a9389a6fe208f2707fe49479cdaa2 |
| MD5 | 8a846aabb95faa384fe65f52091c2805 |
| BLAKE2b-256 | 0a98c9428173eb2c6a07babd1e8e98e7a562944109ab2bae0f7fe0df106af1a8 |