Transit modelling in PyTorch

PyLightcurve-torch

An exoplanet transit modelling package for deep learning applications in PyTorch.

The code for orbit and flux drop computation is largely adapted from https://github.com/ucl-exoplanets/pylightcurve/ (under an MIT license).

The module pylightcurve_torch.functional.py contains the functions, implemented in PyTorch, that compute the orbital positions, transit durations and flux drops (see the PyLightcurve repository for more information about the numerical models used).

A TransitModule class is implemented in pylightcurve_torch.nn.py with the following features:

  • Computes time series of planetary positions and primary/secondary flux drops
  • Inherits the torch.nn.Module class to benefit from its parameter optimisation and management capabilities, and to facilitate combination with neural networks
  • Native GPU compatibility

Installation

$ pip install pylightcurve-torch

Basic use

from pylightcurve_torch import TransitModule

tm = TransitModule(time, **transit_params)

flux_drop = tm()

If need be, the returned torch.Tensor can be converted to a numpy.ndarray with the flux_drop.numpy() torch method, or flux_drop.cpu().numpy() if the computation took place on a GPU.
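The conversion works as on any tensor; the sketch below uses a plain torch.Tensor as a stand-in for an actual model output:

```python
import torch

# Stand-in for the flux tensor returned by TransitModule
flux_drop = torch.ones(5, dtype=torch.float64)

# CPU tensors convert directly to numpy arrays
flux_np = flux_drop.numpy()

# Tensors on a GPU must be moved to the CPU first;
# .cpu() is a no-op when the tensor is already there
flux_np_safe = flux_drop.cpu().numpy()
```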

Transit parameters

Below is a summary table of the planetary orbital and transit parameters used in PyLightcurve-torch:

| Name   | PyLightcurve alias          | Description                                 | Python type | Unit     | Transit type      |
|--------|-----------------------------|---------------------------------------------|-------------|----------|-------------------|
| a      | sma_over_rs                 | ratio of semi-major axis to stellar radius  | float       | unitless | primary/secondary |
| P      | period                      | orbital period                              | float       | days     | primary/secondary |
| e      | eccentricity                | orbital eccentricity                        | float       | unitless | primary/secondary |
| i      | inclination                 | orbital inclination                         | float       | degrees  | primary/secondary |
| p      | periastron                  | orbital argument of periastron              | float       | degrees  | primary/secondary |
| t0     | mid_time                    | transit mid-time epoch                      | float       | days     | primary/secondary |
| rp     | rp_over_rs                  | ratio of planetary to stellar radius        | float       | unitless | primary/secondary |
| method | method                      | limb-darkening law                          | str         | n/a      | primary           |
| ldc    | limb_darkening_coefficients | limb-darkening coefficients                 | list        | unitless | primary           |
| fp     | fp_over_fs                  | ratio of planetary to stellar flux          | float       | unitless | secondary         |

A short name has been introduced for each parameter, while maintaining compatibility with the original PyLightcurve parameter names. All parameters except method are converted to torch.nn.Parameter tensors with double dtype when passed to a TransitModule.
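This registration pattern can be sketched with a plain torch.nn.Module; the class below is a simplified illustration, not the package's actual implementation:

```python
import torch
from torch import nn

class MiniTransitModule(nn.Module):
    """Toy sketch: store scalar inputs as double-precision parameters,
    keeping the limb-darkening law name as a plain string."""

    def __init__(self, **params):
        super().__init__()
        for name, value in params.items():
            if name == 'method':
                # the limb-darkening law is a str, not a tensor
                self.method = value
            else:
                tensor = torch.as_tensor(value, dtype=torch.float64)
                # gradients stay off until explicitly requested
                self.register_parameter(name, nn.Parameter(tensor, requires_grad=False))

tm = MiniTransitModule(rp=0.1, P=3.5, method='claret')
```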

Differentiation

One of the main benefits of a PyTorch implementation of transit modelling is automatic differentiation with torch.autograd.

Here is an example of basic usage:

...
tm.fit_param('rp')                  # activates the gradient computation for parameter 'rp'
err = loss(flux, **data)            # loss computation in pytorch 
err.backward()                      # gradients computation 
tm.rp.grad                          # access to computed gradient for parameter 'rp'
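The same mechanics can be seen with plain torch tensors, independently of the package; here a toy flux model stands in for the transit computation:

```python
import torch

# A scalar 'parameter' analogous to rp, with gradient tracking enabled
rp = torch.tensor(0.1, dtype=torch.float64, requires_grad=True)

# Toy flux model and a squared-error loss against a fake observation
flux = 1.0 - rp ** 2                              # stand-in for the flux drop
target = torch.tensor(0.995, dtype=torch.float64)
err = (flux - target) ** 2

err.backward()      # gradient computation
grad = rp.grad      # gradient of the loss with respect to rp
```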

More Pytorch support

Several utility methods inherited from torch.nn.Module are listed below, simplifying operations on all of the module's tensor parameters.

tm = TransitModule()

# Parameters access (iterators)
tm.parameters()
tm.named_parameters()

# dtype conversions
tm.float()
tm.double()

# Gradient local deactivation
with torch.no_grad():
    flux_no_grad = tm()

# device conversion
tm.cpu()
tm.cuda()
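These utilities behave as on any torch.nn.Module; the snippet below demonstrates them on a minimal stand-in module rather than an actual TransitModule:

```python
import torch
from torch import nn

class Stub(nn.Module):
    """Minimal module with one double-precision parameter."""
    def __init__(self):
        super().__init__()
        self.rp = nn.Parameter(torch.tensor(0.1, dtype=torch.float64))

    def forward(self):
        return 1.0 - self.rp ** 2

m = Stub()

names = [n for n, _ in m.named_parameters()]   # parameter iteration
m.float()                                      # cast all parameters to float32
m.double()                                     # ... and back to float64

with torch.no_grad():                          # local gradient deactivation
    out = m()
```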

Running performance tests

In addition to traditional unit tests, computation performance tests can be executed this way:

 python tests/performance.py --plot

This will measure the computation time of forward transits as a function of transit duration, time-vector length or batch size. If data have been saved previously, they will also be plotted, labelled with the name of the corresponding tag.


Download files

Download the file for your platform.

Source Distribution

pylightcurve-torch-1.0.0.tar.gz (18.3 kB)

Uploaded Source

Built Distribution

pylightcurve_torch-1.0.0-py3-none-any.whl (29.6 kB)

Uploaded Python 3

File details

Details for the file pylightcurve-torch-1.0.0.tar.gz.

File metadata

  • Download URL: pylightcurve-torch-1.0.0.tar.gz
  • Upload date:
  • Size: 18.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.7.7

File hashes

Hashes for pylightcurve-torch-1.0.0.tar.gz
Algorithm Hash digest
SHA256 d6e9e57d26b7bbb0be79acdf96ef11c18a85473539134ae9365aac6ee26e54cf
MD5 0abad21a671122a5f340eb40b8e36cf2
BLAKE2b-256 c540b6d614dc19b798b2e25693c59c38c52c9c967b49a09303f2ef804e0e2666


File details

Details for the file pylightcurve_torch-1.0.0-py3-none-any.whl.

File metadata

  • Download URL: pylightcurve_torch-1.0.0-py3-none-any.whl
  • Upload date:
  • Size: 29.6 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.2.0 pkginfo/1.5.0.1 requests/2.24.0 setuptools/50.3.2 requests-toolbelt/0.9.1 tqdm/4.43.0 CPython/3.7.7

File hashes

Hashes for pylightcurve_torch-1.0.0-py3-none-any.whl
Algorithm Hash digest
SHA256 b53bc8ef70f756a92ee47701a6ab491f7698fecac09a2d9e004c0b32526a9433
MD5 1557489a35338d4da1f5b223265dd50d
BLAKE2b-256 5c235467adca0886e671f588cae2b747db22784533ca37f4f20c85c2fcf07f0a

