from wtlike import Config, Simulation, WtLike
import numpy as np
This assumes that you already have a Jupyter Notebook Server configured on your machine.
Either run this notebook, or copy the cells into a new notebook.
wtlike is on PyPI; to install it:
pip install wtlike
Next we generate a simulated on-the-fly dataset in lieu of downloading the 3 GB or so of Fermi data.
def src_flare(t, tzero=15, width=1, amp=5):
    # constant 1e-6 /s source rate plus a Gaussian flare of amplitude amp centered at tzero
    return 1e-6*(1 + amp*np.exp(-(t-tzero)**2/2/width))
We use it to tell the simulation that the source flux has this time dependence. Note the defaults:
- background -- 1e-6 /s
- effective area -- 3000 cm^2
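As a quick sanity check (not part of the wtlike workflow), we can evaluate src_flare directly to see the flux rise from the quiescent 1e-6 /s level to a peak at tzero=15:
# evaluate the simulated source flux well before, at, and after the flare peak
for t in (0, 14, 15, 16, 30):
    print(f't={t:2d}  flux={src_flare(t):.2e}')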
Here we create a Simulation object, then pass it to the primary wtlike entry point, the class WtLike. We tell it to immediately bin all the photons into daily time bins, called "cells". (The time binning can easily be redone.)
config = Config()
# simulate 30 days of data with the flaring source flux defined above
test_sim = Simulation('flare_sim', src_flux=src_flare, tstart=0, tstop=30)
# time_bins=(start, stop, step) in days; here the full range in 1-day bins
daily = WtLike(test_sim, config=config, time_bins=(0,0,1))
Now make a light curve!
daily.plot();
Each cell has been analyzed to create a likelihood-function representation. The first few cells are:
daily.cells.head(2)
The Kerr likelihood function for a cell is a function of $w$, the list of photon weights, and $S$, an estimate for the total signal counts in the cell:

$$\log\mathcal{L}(\alpha \mid w) = \sum_{w} \log\big(1 + \alpha\, w\big) - \alpha\, S$$

where $\alpha$ is the variation from the nominal value of zero. The background is assumed to be constant here, as it usually is in the region surrounding sources of interest. The relative flux is $1+\alpha$.
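For concreteness, here is a minimal NumPy sketch of this log-likelihood, evaluated for a toy set of weights; it is an illustration only, not the wtlike implementation, and the weights and signal estimate are made up:
def cell_loglike(alpha, w, S):
    # log L(alpha) = sum_w log(1 + alpha*w) - alpha*S, with constant background assumed
    return np.sum(np.log(1 + alpha*np.asarray(w))) - alpha*S

w_toy = np.array([0.9, 0.7, 0.8, 0.2, 0.95])   # made-up photon weights
S_toy = sum(w_toy)                             # crude signal-count estimate
print([round(cell_loglike(a, w_toy, S_toy), 3) for a in (-0.5, 0.0, 0.5)])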
We generate an approximate representation of this function by fitting it to a 3-parameter Poisson-like function, which easily provides the values of interest.
daily.fluxes.head(2)
About DataFrames
The properties photons, exposure, cells, fits, and fluxes are pandas.DataFrame objects. For those unfamiliar with pandas, note that one can create "csv" files with the method to_csv(filename).
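For example, to save the daily flux table to a CSV file (the filename here is arbitrary):
daily.fluxes.to_csv('daily_fluxes.csv')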
The DataFrame method query is very useful: to select the cells with flux>3,
daily.fluxes.query('flux>3')
Views
A WtLike object provides a function view, which returns a copy but with a different binning. So if we want a detailed look at the flare, we can choose a sub-interval. The parameters below specify an interval starting 10 days after the start and ending 10 days before the end, with 1/4-day, or 6-hour, bins.
qday = daily.view(10, -10, 0.25)
qday.plot();
bb = qday.bb_view()
bb.plot();
This creates variable-sized cells corresponding to the Bayesian Block analysis, the fits for which are shown here:
bb.fluxes
Both the Simulation and WtLike classes have docstring help, accessible in a notebook with the ? operator:
Simulation?
WtLike?
The three WtLike methods have (preliminary) help as well:
WtLike.plot?
WtLike.view?
WtLike.bb_view?
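Outside a notebook, the standard Python help() function shows the same docstrings, e.g.:
help(WtLike.view)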
Using Fermi data
The only change from the above code is that one substitutes a PointSource object, with the designation of a source, for the Simulation. Also, the data files must be available, with the config parameter datapath set to the relevant folder.
Currently it is set by a line in ~/.config/wtlike/config.yaml. See the Config help.
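A quick way to inspect the current setting from a Python session is sketched below; it assumes, as the text above implies, that the Config object exposes datapath and valid attributes:
from wtlike import Config
cfg = Config()
print(cfg.datapath, cfg.valid)   # datapath assumed to be exposed as an attribute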
This requires access to a packaged form of the photon data, a table allowing generation of weights, and the effective area table.
The source must have been analyzed with gtlike or pointlike to produce the weight table.
All are available in a 2-GB zip file.
The photon and spacecraft data can be checked with check_data. The configuration must be valid, with the path to the data existing:
from wtlike import *
config = Config()
if config.valid:
    #print(check_data())
    source = PointSource('3C 279')
    print(source)
    wtl = WtLike(source, exp_min=50)
If the data for this source has not been accessed on this machine before, it must be extracted to a cache, a process that takes some 10 minutes. Otherwise retrieving it from the cache takes only a few seconds.
Generate the (default weekly) light curve:
if config.valid: wtl.plot(log=True, UTC=True);
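As with the simulated data, the binning can be changed with view; the following sketch, which follows the pattern shown earlier, re-bins the full range into 1-day cells:
if config.valid:
    daily_3c279 = wtl.view(0, 0, 1)   # (start, stop, step): full range, 1-day bins
    daily_3c279.plot(log=True, UTC=True);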