
Time-dependent analysis of point sources in Fermi-LAT data


The wtlike package

Code for generating Fermi-LAT light curves.


Introduction

This package contains code, adapted to the nbdev code/tests/documentation environment from the GitHub package lat-timing, to manage light curves of Fermi-LAT sources.
It is based on a paper by Matthew Kerr, which derives the weighted-likelihood formalism used here, specifically with Bayesian Blocks to detect and characterize the variability of a gamma-ray source.

I have also ported some code from my jupydoc documentation package, which supports enhanced documentation combining Markdown and code, such that the Markdown reflects the execution of the code.

Installation

Note that this is still in alpha mode.

To install from PyPI:

pip install wtlike

Data requirements: There are three sets of files:

  • photon data
    These are weekly pickled Python dict objects containing condensed, compressed photon and spacecraft data extracted from the GSFC FTP site. They contain every photon above 100 MeV and within $100^\circ$ of the zenith.

  • weight tables
    Each source to be analyzed needs a table defining the photon weight as a function of position, energy, and event type. These are currently generated by pointlike. (A fermipy-generated version would be preferable.)

  • effective area
    A standard Fermi instrument response function (IRF) file defining the effective area as a function of detector angle and energy.

A set of these is available as a 1.6 GB zip file.
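
For a quick look at the photon-data format, one can load a single weekly file directly. This is a minimal sketch, assuming plain pickle serialization; the file name comes from the demo log below, and the path follows the default layout described under Input data:

import pickle
from pathlib import Path

# One weekly photon file (name taken from the demo output below);
# adjust the path to your own data folder.
f = Path('~/wtlike_data/data_files/week_674.pkl').expanduser()
with open(f, 'rb') as inp:
    week = pickle.load(inp)
print(list(week))  # the dict keys; not asserted here, just inspected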

Quick Demo

The following code cell loads the data for the BL Lac blazar and plots, by default, a weekly light curve for the full Fermi mission.

from wtlike import *
weekly = WtLike('BL Lac') # the default: 7-day bins covering the full dataset
weekly.plot(ylim=(-0.8,15)); # plot accepts plt.plot args
SourceData: photons and exposure for BL Lac: Saving to cache with key "BL Lac_data"
	Assembling photon data and exposure for source BL Lac from folder "/home/burnett/wtlike_data/data_files",
	 with 665 files, last file:  week_674.pkl: loading all files
........................
Load weights from file /mnt/d/wtlike/wtlike_data/weight_files/BL_Lac_weights.pkl
	Found: P88Y6076 at (92.60, -10.44)
	Applying weights: 0 / 406640 photon pixels are outside weight region
	95671 weights set to NaN
WtLike: Source BL Lac with:
	 data:       310,969 photons from   2008-08-04 to 2021-05-06
	 exposure: 3,177,752 intervals from 2008-08-04 to 2021-05-06
CellData: Bin photon data into 665 1-week bins from 54683.0 to 59338.0
LightCurve: select 656 cells for fitting with e>0.5 & n>2

[figure: weekly light curve of BL Lac over the full mission]

The variable weekly has lots of capabilities. To examine the most recent portion of the dataset, we use view to create a new WtLike object and plot it.

hourly_at_end = weekly.view((-5,0, 1/24)) # for the last 5 days, 1-hour bins
hourly_at_end.plot(); # Accepts plt.plot args, e.g. xlim, ylim, etc.
CellData: Bin photon data into 120 1-hour bins from 59335.0 to 59340.0
LightCurve: select 81 cells for fitting with e>0.5 & n>2

[figure: 1-hour light curve for the last 5 days]

Or, to do a Bayesian Block partition with these 1-hour bins, perform fits, and overplot the result, just run the following.

bb_hourly = hourly_at_end.bb_view()
bb_hourly.plot();
LightCurve: select 81 cells for fitting with e>0.5 & n>2
Partitioned 81 cells into 4 blocks, using LikelihoodFitness 
LightCurve: Loaded 4 / 4 cells for fitting

[figure: Bayesian Block fits overplotted on the 1-hour light curve]

Finally, let's look at the values plotted above:

bb_hourly.fluxes
|   | t        | tw   | n   | ts    | flux | errors          | limit |
|---|----------|------|-----|-------|------|-----------------|-------|
| 0 | 59335.42 | 0.83 | 178 | 404.1 | 6.70 | (-0.655, 0.689) | 7.89  |
| 1 | 59336.69 | 1.71 | 205 | 170.0 | 2.38 | (-0.308, 0.323) | 2.93  |
| 2 | 59338.02 | 0.96 | 222 | 573.6 | 8.70 | (-0.734, 0.767) | 10.01 |
| 3 | 59339.23 | 1.46 | 217 | 369.4 | 4.48 | (-0.434, 0.454) | 5.25  |
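
Since fluxes is a pandas DataFrame, it can be manipulated directly. Below is a minimal re-plotting sketch, assuming the errors column holds the (lo, hi) tuples displayed above, with t the bin center in MJD and tw the bin width in days:

import numpy as np
import matplotlib.pyplot as plt

df = bb_hourly.fluxes
lo, hi = np.array(list(df.errors)).T          # split the (lo, hi) tuples
plt.errorbar(df.t, df.flux, xerr=df.tw/2,     # horizontal bars span each block
             yerr=[np.abs(lo), hi], fmt='o')  # asymmetric flux errors
plt.xlabel('MJD'); plt.ylabel('relative flux')
plt.show()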

Input data

There are three data sources which wtlike needs to function:

  • The photon/spacecraft data
  • A table of weights for each source
  • An effective area IRF table

These must be found under a single folder, by default ~/wtlike_data. That folder must contain (or link to) three folders named data_files, weight_files, and aeff_files. A copy of what I'm using is at /afs/slac/g/glast/users/burnett/wtlike_data
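
A quick way to verify this layout, using only the folder names given above:

from pathlib import Path

# check that the default data folder contains the three required subfolders
root = Path('~/wtlike_data').expanduser()
for sub in ('data_files', 'weight_files', 'aeff_files'):
    print(f"{root/sub}: {'ok' if (root/sub).exists() else 'MISSING'}")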

Module summary

Configuration config

Implements basic configuration information (Config), a cache system (Cache), point-source info (PointSource), and time conversion.
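
For example, one might inspect the active configuration; this is a sketch only, since the zero-argument constructor and the printed form are my assumptions, not documented above:

from wtlike.config import Config  # module and class names from the summary above

config = Config()  # assumption: defaults work, pointing at ~/wtlike_data
print(config)      # assumption: Config has an informative repr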

Photon and Spacecraft Data data_man

This module manages conversion of the weekly FT1 (photon) and FT2 (spacecraft) files, downloaded from GSFC, to a folder of pickled files, each with tables of photons and spacecraft data and a list of GTI times derived from the FT1 file. The class WeeklyData exports the results.

Source data source_data

This module is specific to a single source. It extracts the photons within a disk around the source and calculates the exposure in that direction. It assumes that a weight analysis has been done for the source, and applies the resulting weight to each photon. This is handled by the class SourceData. It depends on the weight tables and on effective_area to evaluate the exposure.

Cell data cell_data

The next step is to define a set of time bins, or "cells". This module, implementing the class CellData(SourceData), creates a set of cells.

The light-curve lightcurve

The class LightCurve(CellData) uses the set of cells created by its superclass and generates a likelihood function for each according to Kerr Eq. 2. These functions are represented by 3-parameter Poisson-like functions (see poisson) for further analysis. It creates a table with this information for plotting a light curve.
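
For reference, a transcription of the weighted log-likelihood (Kerr Eq. 2) as I read it, where $w_i$ are the photon weights, $\alpha$ and $\beta$ the relative deviations of the source and background rates from their mean values, and $S$ and $B$ the expected source and background counts in the cell:

$$ \log\mathcal{L}(\alpha,\beta) = \sum_i \log\big(1 + \alpha\, w_i + \beta\,(1-w_i)\big) - \alpha S - \beta B $$

The 3-parameter Poisson-like representation mentioned above approximates this as a function of $\alpha$.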

Bayesian Blocks bayesian

This module contains the code implementing the Bayesian block capability.

Simulation simulation

A light curve can also be generated from a simulation.

Main main

Implements WtLike(LightCurve), a subclass of LightCurve that adds the function bb_view, returning a new object with Bayesian Block (BB) cells. Its plot function generates a light-curve plot showing the cells of its parent, overplotted with the BB points.
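
As a recap, the end-to-end flow exercises only calls demonstrated in the Quick Demo above:

from wtlike import WtLike

wtl = WtLike('BL Lac')  # SourceData + CellData: weekly cells by default
bb = wtl.bb_view()      # bayesian: repartition the cells into Bayesian Blocks
bb.plot();              # lightcurve: parent cells overplotted with the BB points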

Download files

Download the file for your platform.

Source Distribution

wtlike-0.2.2.tar.gz (59.3 kB)


Built Distribution

wtlike-0.2.2-py3-none-any.whl (64.2 kB)


File details

Details for the file wtlike-0.2.2.tar.gz.

File metadata

  • Download URL: wtlike-0.2.2.tar.gz
  • Upload date:
  • Size: 59.3 kB
  • Tags: Source
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.55.1 CPython/3.7.6

File hashes

Hashes for wtlike-0.2.2.tar.gz

| Algorithm   | Hash digest |
|-------------|-------------|
| SHA256      | 42e4d1f3e135c42958c0b53a2a4d4e98172663e933e870c29a712694e06185a7 |
| MD5         | f578881266f82d7b7ebfe364d1d6aa3b |
| BLAKE2b-256 | ddc5d6dd4ae2012cd235b67db65082c5a7ed457a05191c23729d98fc028f42c6 |


File details

Details for the file wtlike-0.2.2-py3-none-any.whl.

File metadata

  • Download URL: wtlike-0.2.2-py3-none-any.whl
  • Upload date:
  • Size: 64.2 kB
  • Tags: Python 3
  • Uploaded using Trusted Publishing? No
  • Uploaded via: twine/3.4.1 importlib_metadata/4.0.1 pkginfo/1.7.0 requests/2.25.1 requests-toolbelt/0.9.1 tqdm/4.55.1 CPython/3.7.6

File hashes

Hashes for wtlike-0.2.2-py3-none-any.whl

| Algorithm   | Hash digest |
|-------------|-------------|
| SHA256      | 68b0f5ffa9a863db461799cd839565609dcddde49f90a68111daf91565e2da6a |
| MD5         | b526a60dbdae0621c1e3e9ef207e892b |
| BLAKE2b-256 | b316c620fff74bfee76dedd0aee2ccdf5ef36f50a6f5a670a4759c8260afa61d |

