Time-dependent analysis of point sources in Fermi-LAT data
Project description
Making Fermi-LAT gamma-ray light curves with wtlike
Quickly create a light curve for any 4FGL source, on any time scale, with optional Bayesian Block analysis
Introduction
wtlike
Pronounced "DUB-Tee-like", wtlike is a library optimized for interactive exploration in a Jupyter notebook, with access to all Fermi-LAT data. It analyzes the time dependence of any source in the
4FGL catalog on any time scale, with the option of performing a Bayesian Block partition to select optimal time intervals. The source can be identified by its 4FGL name or any equivalent common name.
Here is a minimal demo:
from wtlike import *
wtl = WtLike('3C 273')
SourceData: photons and exposure for 3C 273: Saving to cache with key "3C_273_data"
	loading all files
SourceData: Assembling photon data and exposure for source 3C 273
from folder "/home/burnett/wtlike_data/data_files", with 671 files,
Weights from file 3C_273_weights.pkl
........................
SourceData: Source 3C 273 with:
data: 96,091 photons from 2008-08-04 to 2021-06-16
exposure: 2,842,552 intervals, average flux 2810 cm^2 for 84.9 Ms
rates: source 1.23e-07/s, background 2.80e-07/s, S/N ratio 4.39e-01
CellData: Bin photon data into 671 1-week bins from 54683.0 to 59380.0
LightCurve: select 661 cells for fitting with e>5 & n>2
wtl.plot();
This assumes that the name for the source, in this case the historically famous first quasar to be discovered, can be associated with a 4FGL catalog source. The plot shows, as a function of the MJD time, weekly measurements of deviations of the flux relative to the average of the 12-year interval used to define the 4FGL-DR3 catalog.
The first stage, extracting data for the source, takes ~10 min, but thanks to the included cache system it only needs to be done once.
Overview
This package contains code, developed with the nbdev code/tests/documentation environment from the github package lat-timing, to generate light curves of Fermi-LAT sources.
It is based on a paper by Matthew Kerr, which derives the weighted likelihood formalism used here, specifically with
the Bayesian Block algorithm to detect and characterize the variability of a gamma-ray source.
There are several innovative design features that significantly improve the speed and portability.
- Condensed photon and spacecraft data.
- Weight tables
- A cache to improve interactivity
- Likelihood functions fit to a modified Poisson function
- Unbinned likelihood
- A simple user interface
How it works
Historically, gamma-ray source measurements have used two methods:
- Likelihood analysis of a fixed time interval: each photon's energy and position are used, with a model including all potential sources and a model of the detector response, to define the likelihood. This has been the only way to study weak sources; a light curve must apply this analysis to each time interval.
- Aperture photometry: for very bright flares, for example GRBs, one can simply count the number of photons within a circular region.
Matthew Kerr introduced a third method: basically counting photons, but using information from a static likelihood analysis to assign a "weight" to each photon, the probability that it comes from the source in question, then optimizing this likelihood. This assumes that the only thing changing is the flux of the source. He calls it "retrospective", since the analysis for the full time interval is then applied back to the individual photons.
Individual photon Likelihood ("unbinned")
We use a version of the Kerr likelihood in which the fundamental "cell" is a single photon. The likelihood for any group of photons is then easily determined by summing the individual photon terms.
Assumptions:
- Source photons are completely contained in the dataset boundaries (the "ROI").
- The instrument response is constant with time.
- The background pattern is constant. (Clearly violated if a neighboring source varies!)
For a photon $i$ with weight $w_i$ and exposure $\tau_i$,
$$ \displaystyle\log\mathcal{L}_i(\alpha) = \log (1 + \alpha \ w_i ) - \alpha \ w_i\ R\ \tau_i $$
where:
- $\alpha$ -- fractional deviation of the source flux from the average as measured by the catalog analysis
- $w_i$ -- probability, for the nominal source rate, that photon $i$ is from the source.
- $R$ -- expected source rate in $\mathrm{cm^{-2}\ s^{-1}}$ units.
- $\tau_i$ -- integration of the exposure rate for the live time preceding this detection in $\mathrm{cm^2} s$ units.
Here "live time" means the time since the start of the run, or since the previous photon. The quantity $\tau_i$ behaves like a time, but accounts for variation of the exposure rate, which is often rapid compared with event rates.
(A note about "exposure": It has units $\mathrm{cm^2\ s}$ and is the integral of the "exposure rate" over a time interval. For Fermi, the rate is typically $3000\ \mathrm{cm^2}$ but varies as much as a factor of three over a single orbit.)
This is evaluated in the module loglike.
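As a concrete illustration, here is a minimal sketch, not the wtlike loglike module itself, of evaluating this log-likelihood for the photons in a cell; the numerical values in the example are hypothetical:

```python
import numpy as np

def cell_loglike(alpha, w, tau, R):
    """Sum over photons of log(1 + alpha*w_i) - alpha*w_i*R*tau_i,
    following the per-photon formula above."""
    w, tau = np.asarray(w), np.asarray(tau)
    return np.sum(np.log(1 + alpha * w)) - alpha * R * np.sum(w * tau)

# relative flux alpha = 0.5 for three photons (hypothetical weights and exposures)
print(cell_loglike(0.5, w=[0.8, 0.2, 0.6], tau=[2.5e5, 3.1e5, 2.8e5], R=1.2e-7))
```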
Photon Data
Fermi data are retrieved from the Fermi-LAT weekly data files extracted from the GSFC FTP server, with subfolders for the photon data (photon) and spacecraft data (spacecraft), as described here. The files are organized into individual weeks, starting UTC midnight each Thursday. Files for the most recent week are updated daily.
We convert each pair of photon and spacecraft files to two DataFrame tables with the minimum information needed to evaluate the likelihood, as compressed as possible. In particular, the energy and position are binned. Details can be seen in the module data_man.
The entire data set in this format occupies <2 GB.
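For orientation, here is a trivial sketch of inspecting one such weekly file; it assumes only that the file is a pickled Python dict, and the file name shown is hypothetical:

```python
import pickle
from pathlib import Path

# hypothetical weekly file under the data_files folder described below
week_file = Path('~/wtlike_data/data_files/week_500.pkl').expanduser()

with open(week_file, 'rb') as f:
    week = pickle.load(f)

print(week.keys())   # the condensed photon and spacecraft tables plus bookkeeping
```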
Select Data for a source
All further processing uses a subset of the photons within a cone, currently $4^\circ$, about the selected source, and evaluates the exposure during 30-s intervals for this direction. In the class SourceData, implemented in the module source_data, we:
- Extract photons
- Evaluate exposure, using the effective area tables and a spectral assumption.
- Determine the weight for each photon, using the table for the source. See the module weights for details.
- For each photon's livetime, determine the exposure $\tau$.
The result is a photon DataFrame containing, for each photon, the time $t$ in MJD units, $w$, and $\tau$.
This class is a superclass of the user interface class WtLike introduced above.
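The last step can be illustrated with a rough sketch (not the source_data implementation): interpolate the cumulative exposure of the 30-s intervals at the photon times, then difference it, so each photon gets the exposure accumulated since the previous one:

```python
import numpy as np

def photon_tau(photon_times, interval_starts, interval_exposures):
    """tau_i: exposure (cm^2 s) accumulated between photon i-1 and photon i;
    the first photon gets the exposure since the start of the data."""
    cum = np.concatenate(([0.0], np.cumsum(interval_exposures)))
    edges = np.concatenate((interval_starts,
                            [interval_starts[-1] + 30.0 / 86400.0]))  # MJD edges
    cum_at_photons = np.interp(photon_times, edges, cum)
    return np.diff(np.concatenate(([0.0], cum_at_photons)))
```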
Partition into cells
A "cell", in the terminology used by Kerr, is the set of photons in a specific time interval. The class CellData in the module cell_data, a subclass of SourceData, manages this.
This class is instantiated with a tuple, denoted (a, b, c), that defines the binning in time. The elements are:
- a -- start time
- b -- stop time
- c -- bin size in days, where 0 means orbit-based, contiguous-exposure intervals

For the start and stop, values > 50000 are interpreted as MJD. Otherwise positive values are relative to the start of the full dataset, and negative values to its stop, both rounded to a full day. Zero means the actual start for a and the actual stop for b.

The default binning is simply (0, 0, 7) for weekly bins with the full dataset. Hourly bins for the most recent week would be (-7, 0, 1/24).
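To illustrate this interpretation, here is a small sketch (not the CellData code itself; the default start and stop MJDs are taken from the example above, and the orbit-based case c = 0 is not handled):

```python
import numpy as np

def _edge(x, start, stop, default):
    """Interpret one element of the (a, b) pair as an MJD value."""
    if x > 50000:            # already an MJD
        return float(x)
    if x == 0:               # actual start (for a) or stop (for b)
        return default
    ref = start if x > 0 else stop
    return round(ref + x)    # relative to start (positive) or stop (negative)

def bin_edges(a, b, c, start=54683.0, stop=59380.0):
    """Expand a binning tuple (a, b, c) into MJD bin edges; c is the bin size in days."""
    lo = _edge(a, start, stop, start)
    hi = _edge(b, start, stop, stop)
    return np.arange(lo, hi + 1e-9, c)

print(bin_edges(0, 100, 25))            # [54683. 54708. 54733. 54758. 54783.]
print(len(bin_edges(-7, 0, 1/24)) - 1)  # 168 hourly bins for the most recent week
```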
A DataFrame table of the cells is created as a data member cells, with content:
- t -- MJD time at cell center
- tw -- cell width in days
- e -- cell exposure, for reference
- n -- the number of photons
- w -- a list of n weights
- S -- expected number of source photons, the nominal source rate times the sum of $\tau$ values
- B -- expected number of background photons (unused)
Views
CellData implements a method view, which accepts a binning tuple as an argument and returns a copy of the current object (which can be a subclass), assigning the new binning to it. Thus the view has all the attributes of its parent, but
with a different set of cells.
So the following takes the WtLike object generated above, creates a copy rebinned with 25-day bins over the first 100 days, and displays its cells table; the copy is then discarded since it is not assigned to a variable.
wtl.view(0,100,25).cells
CellData: Bin photon data into 4 25-day bins from 54683.0 to 54783.0
LightCurve: select 4 cells for fitting with e>5 & n>2
|   | t | tw | e | n | w | S | B |
|---|---|---|---|---|---|---|---|
| 0 | 54695.5 | 25.0 | 1526.99 | 555 | [0.48901367, 0.8066406, 0.11303711, 0.19165039... | 187.55 | 427.44 |
| 1 | 54720.5 | 25.0 | 1916.42 | 1448 | [0.4345703, 0.6064453, 0.0690918, 0.056274414,... | 235.38 | 536.45 |
| 2 | 54745.5 | 25.0 | 1488.15 | 1190 | [0.33911133, 0.3100586, 0.7089844, 0.06994629,... | 182.78 | 416.57 |
| 3 | 54770.5 | 25.0 | 1979.26 | 1185 | [0.09112549, 0.58251953, 0.07537842, 0.5004883... | 243.10 | 554.04 |
Evaluate Likelihoods and make light curve plots
The class LightCurve, implemented in the module lightcurve, is a subclass of SourceData.
An instance invokes its superclass to generate the set of cells, then evaluates the likelihoods.
Poisson-like Likelihood
We fit the likelihood for each cell, using only a few evaluations, to a 3-parameter Poisson-like formula. Two advantages of this are:
- efficiency -- for large numbers of photons, this is much faster
- convenience -- the Poisson object implements functions that return the TS, 95% CL, maximum, and uncertainties, using the incomplete gamma function
This results in a DataFrame fits containing t, tw, n from the cell, and fit, the Poisson object.
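To make the idea concrete, here is a rough sketch, not the wtlike Poisson class itself, of fitting a generic 3-parameter Poisson-like form to a handful of log-likelihood evaluations; the parameterization, sample points, and starting values are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

def poisson_like(alpha, s, e, b):
    """Generic 3-parameter Poisson-like log-likelihood in the flux ratio alpha."""
    return s * np.log(e * (alpha + b)) - e * (alpha + b)

def fit_poisson_like(loglike, alphas=(0.25, 0.5, 1.0, 1.5, 2.0)):
    """Adjust (s, e, b) so poisson_like matches loglike at a few points,
    up to the irrelevant constant offset."""
    alphas = np.asarray(alphas)
    target = np.array([loglike(a) for a in alphas])

    def residuals(p):
        model = poisson_like(alphas, *p)
        # compare shapes only: both curves referenced to their first point
        return (model - model[0]) - (target - target[0])

    return least_squares(residuals, x0=[50., 50., 1.],
                         bounds=([1e-6] * 3, [np.inf] * 3)).x

# Example with the per-photon likelihood sketched earlier (hypothetical numbers)
w   = np.array([0.8, 0.2, 0.6, 0.9, 0.4])
tau = np.array([2.5e5, 3.1e5, 2.8e5, 2.2e5, 3.0e5])
R   = 1.2e-7
cell_loglike = lambda a: np.sum(np.log(1 + a * w)) - a * R * np.sum(w * tau)
print(fit_poisson_like(cell_loglike))
```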
Light curve plots
The function LightCurve.plot actually invokes flux_plot.
Apply Bayesian Blocks
The class WtLike, implemented in the module main, adds an implementation of the Bayesian Block (BB) algorithm, from the module bayesian, using likelihood instead of counts.
There we have two subclasses of astropy.stats.bayesian_blocks, CountFitness and the default LikelihoodFitness.
This code creates partitions between boundaries of a set of cells. Usage is via a special view, bb_view:
bb = wtl.bb_view()
bb.plot();
LightCurve: select 661 cells for fitting with e>5 & n>2
Bayesian Blocks: using penalty 0.05
Partitioned 661 cells into 94 blocks, using LikelihoodFitness
LightCurve: Loaded 94 / 94 cells for fitting
As you see, this made 94 blocks from the 661 weekly cells, fit each, and overplotted them on the weekly light curve.
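For orientation, here is a rough sketch of the underlying astropy machinery using its built-in point-measurement fitness on simulated per-cell fluxes; wtlike replaces this fitness with the likelihood-based LikelihoodFitness described above, and the numbers here are purely illustrative:

```python
import numpy as np
from astropy.stats import bayesian_blocks

rng = np.random.default_rng(0)

# hypothetical per-cell flux measurements: a step from 1.0 to 2.0 at day 60
t     = np.arange(100.)                          # cell centers, days
flux  = np.where(t < 60, 1.0, 2.0) + rng.normal(0, 0.1, t.size)
sigma = np.full(t.size, 0.1)

# the 'measures' fitness partitions the cells at the change points
edges = bayesian_blocks(t, flux, sigma, fitness='measures', p0=0.05)
print(edges)   # block boundaries, which should isolate the step near day 60
```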
Simulation
Finally, a simulation option is available. See the tutorial for an example.
Installation
Note that this is in beta mode. It requires matplotlib, pandas, scipy, astropy, and healpy.
To install from PyPI:
pip install wtlike
Data requirements: There are three sets of files:
- photon data -- a set of weekly pickled python dict objects with compressed, condensed photon and spacecraft data extracted from the GSFC FTP site. They contain every photon above 100 MeV and less than $100^\circ$ from the zenith.
- weight tables -- each source to be analyzed needs a table defining the photon weight as a function of position, energy, and event type. These are currently generated by pointlike. (A fermipy-generated version would be preferable.)
- effective area -- a standard Fermi instrument response file (IRF) defining the effective area as a function of detector angle and energy.
A set of these is available as a 1.6 GB zip file.
Input data
There are three data sources which wtlike needs to function:
- The photon/spacecraft data
- A table of weights for each source
- An effective area IRF table
These must be found under a folder, which by default is ~/wtlike_data. In that folder there must be (perhaps links to) three folders named data_files, weight_files, and aeff_files. A copy of what I'm using is at /afs/slac/g/glast/users/burnett/wtlike_data.
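A quick way to check that the layout is in place, a trivial sketch using the folder names above and the default path:

```python
from pathlib import Path

# default wtlike data location; adjust if you keep the data elsewhere
root = Path('~/wtlike_data').expanduser()
for sub in ('data_files', 'weight_files', 'aeff_files'):
    status = 'found' if (root / sub).is_dir() else 'MISSING'
    print(f'{sub:15s} {status}')
```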