Transit modelling in PyTorch

PyLightcurve-torch

An exoplanet transit modelling package for deep learning applications in PyTorch.

See this open publication in the Publications of the Astronomical Society of the Pacific for more details and official citation.

The code for orbit and flux drop computation is adapted from Pylightcurve.

The module pylightcurve_torch.functional.py contains the functions, implemented in PyTorch, that compute orbital positions, transit durations and flux drops (see the PyLightcurve repository for more information about the numerical models used).

A TransitModule class is implemented in pylightcurve_torch.nn.py with the following features:

  • computes time series of planetary positions and primary/secondary flux drops
  • inherits from the torch.nn.Module class, benefiting from its parameter optimisation and management capabilities and its straightforward combination with neural networks
  • offers native GPU compatibility

Installation

$ pip install pylightcurve-torch

Basic use

from pylightcurve_torch import TransitModule

tm = TransitModule(time, **transit_params)

flux_drop = tm()

If need be, the returned torch.Tensor can be converted to a numpy.ndarray using the flux_drop.numpy() torch method, or flux_drop.cpu().numpy() if the computation took place on a GPU.
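The conversion back to NumPy can be sketched with a plain tensor standing in for the module's output (the same pattern applies to the tensor returned by a TransitModule call):

```python
import torch
import numpy as np

# Stand-in for the flux tensor returned by a TransitModule call
flux_drop = torch.linspace(1.0, 0.99, steps=5, dtype=torch.float64)

flux_np = flux_drop.numpy()            # CPU tensor -> numpy.ndarray (shares memory)
# flux_np = flux_drop.cpu().numpy()    # use this form if the tensor lives on a GPU

print(type(flux_np).__name__, flux_np.dtype)
```

Note that .numpy() returns a view sharing memory with the tensor; call .detach() first if the tensor requires gradients.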

Transit parameters

Below is a summary table of the planetary orbital and transit parameters used in PyLightcurve-torch:

| Name     | PyLightcurve alias            | Description                                 | Python type | Unit     | Transit type      |
|----------|-------------------------------|---------------------------------------------|-------------|----------|-------------------|
| `a`      | `sma_over_rs`                 | ratio of semi-major axis to stellar radius  | float       | unitless | primary/secondary |
| `P`      | `period`                      | orbital period                              | float       | days     | primary/secondary |
| `e`      | `eccentricity`                | orbital eccentricity                        | float       | unitless | primary/secondary |
| `i`      | `inclination`                 | orbital inclination                         | float       | degrees  | primary/secondary |
| `p`      | `periastron`                  | orbital argument of periastron              | float       | degrees  | primary/secondary |
| `t0`     | `mid_time`                    | transit mid-time epoch                      | float       | days     | primary/secondary |
| `rp`     | `rp_over_rs`                  | ratio of planetary to stellar radius        | float       | unitless | primary/secondary |
| `method` | `method`                      | limb-darkening law                          | str         | –        | primary           |
| `ldc`    | `limb_darkening_coefficients` | limb-darkening coefficients                 | list        | unitless | primary           |
| `fp`     | `fp_over_fs`                  | ratio of planetary to stellar flux          | float       | unitless | secondary         |

A short version of each parameter name has been introduced, while maintaining compatibility with the original PyLightcurve parameter names. All the parameters except method are converted to torch.nn.Parameter tensors, with double dtype, when passed to a TransitModule.
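A parameter dictionary using the short names above might look like the following sketch (all values, including the limb-darkening law name, are illustrative assumptions rather than values for any real system):

```python
# Hypothetical transit parameters, for illustration only
transit_params = dict(
    rp=0.1,          # planet-to-star radius ratio (unitless)
    a=15.0,          # semi-major axis over stellar radius (unitless)
    P=3.5,           # orbital period (days)
    e=0.0,           # eccentricity (unitless)
    i=90.0,          # inclination (degrees)
    p=90.0,          # argument of periastron (degrees)
    t0=0.0,          # mid-transit epoch (days)
    method="linear", # assumed limb-darkening law name
    ldc=[0.5],       # limb-darkening coefficients (one for a linear law)
)
```

Such a dictionary can then be unpacked into the constructor, as in TransitModule(time, **transit_params) above.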

Differentiation

One of the main benefits of a PyTorch implementation for modelling transits is automatic differentiation, provided by torch.autograd.

Here is an example of basic usage:

...
tm.fit_param('rp')                  # activates the gradient computation for parameter 'rp'
err = loss(flux, **data)            # loss computation in pytorch 
err.backward()                      # gradients computation 
tm.rp.grad                          # access to computed gradient for parameter 'rp'
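The same mechanism can be sketched with plain tensors, independently of TransitModule; the quadratic "model" and target value below are toy stand-ins, not the package's transit physics:

```python
import torch

# Stand-in for a transit parameter, e.g. the radius ratio rp
rp = torch.tensor(0.1, dtype=torch.float64, requires_grad=True)

# Toy model: flux drop proportional to the occulted area rp**2
flux = 1.0 - rp ** 2

# Toy squared-error loss against a made-up observed flux
err = (flux - 0.985) ** 2
err.backward()                 # populates rp.grad

print(rp.grad)                 # gradient of the loss w.r.t. rp (-0.002 here)
```

The chain rule gives d(err)/d(rp) = 2(flux - 0.985) * (-2 rp) = 2(0.005)(-0.2) = -0.002, matching the printed gradient.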

More Pytorch support

Several utility methods inherited from PyTorch modules are listed below, simplifying operations on all of the module's defined tensor parameters.

tm = TransitModule()

# Parameters access (iterators)
tm.parameters()
tm.named_parameters()

# dtype conversions
tm.float()
tm.double()

# Gradient local deactivation
with torch.no_grad():
    flux_no_grad = tm()

# device conversion
tm.cpu()
tm.cuda()

Running performance tests

In addition to traditional unit tests, computation performance tests can be executed this way:

 python tests/performance.py --plot

This will measure the computation time of forward transit computations as a function of transit duration, time-vector length or batch size. If data have been saved previously, they will be plotted together with the name of the corresponding tag.
