Create, from FT1 and FT2, a compact data set with photon and livetime info.

Overview

Fermi-LAT weekly data files are extracted from the GSFC FTP server, with subfolders for the photon data (photon) and spacecraft data (spacecraft). The format is described here

The class FermiData downloads these to temporary files and constructs a dict for each week with contents

  • photons: a table, one entry per selected photon with columns, converted with get_ft1_data

    • run_ref (uint8), index into the runs list
    • time since the run start, in 2 $\mu$s intervals (uint32)
    • energy and event type (uint8)
    • position as HEALPix index with nside, nest from Config, currently 1024, True (uint32)
  • sc_data: a table, one entry per 30-s interval, with columns, all float32, converted with get_ft2_info

    • start/stop time
    • S/C direction
    • zenith direction
  • runs list of run numbers
  • gti_times: an array of interleaved start/stop intervals
  • file_date: modification date for the FT1 file at GSFC.

These dict objects are saved in a folder specified by Config().datapath. Its contents can be checked with check_data, and updated using update_data. The contents of a week can be visually inspected with plot_week.
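The weekly summaries are ordinary pickled Python dicts, so a saved week can be inspected directly. A minimal round-trip sketch (the file name `week_751.pkl`, the temporary folder, and the column values are illustrative assumptions, not the actual Config().datapath layout):

```python
import pickle
import tempfile
from pathlib import Path

import numpy as np

# Build a toy weekly dict with the keys described above (values are synthetic)
week = dict(
    photons={'band': np.zeros(3, np.uint8)},    # per-photon columns
    sc_data={'start': np.zeros(2, np.float32)}, # per-30s-interval columns
    runs=np.array([687917276], np.int64),       # run-start MET times
    gti_times=np.array([0.0, 1.0]),             # interleaved start/stop
    file_date='2022-10-27',
)

with tempfile.TemporaryDirectory() as datapath:
    f = Path(datapath) / 'week_751.pkl'        # hypothetical file name
    with open(f, 'wb') as out:
        pickle.dump(week, out)
    with open(f, 'rb') as inp:
        loaded = pickle.load(inp)

print(sorted(loaded.keys()))
```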

Run numbers and timing

The run number is an integer corresponding to the MET at the start of the run. For a week, with 15 orbits/day, we expect ~105 runs. We save a table of the run numbers per week, and use a uint8 index in the photon table.

A run is typically an orbit or less, at most 6 ks. Storing the offset from the run start as a 32-bit integer accommodates up to ~7e5 intervals/s over 6 ks, so we choose 2 $\mu$s intervals (5e5/s). Thus we generate the time for each event from three sources: the 2 $\mu$s time offset within the run, the index of the run number into the runs table, and the runs table itself.
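The three-part reconstruction can be sketched in a few lines of numpy (the array names and values are illustrative, not the actual internal API):

```python
import numpy as np

runs = np.array([687917276, 687923000], dtype=np.int64)  # MET at each run start
run_ref = np.array([0, 0, 1], dtype=np.uint8)            # per-photon index into runs
trun = np.array([0, 500_000, 123], dtype=np.uint32)      # offsets in 2 us units

# Event MET = run start + offset converted from 2 us units to seconds
met = runs[run_ref] + trun * 2e-6
print(met)
```

Note that a uint32 offset in 2 $\mu$s units spans $2^{32}\times 2\,\mu\text{s}\approx 8.6$ ks, comfortably covering a 6 ks run.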

GTI

GTI, for Good Time Interval, is a mechanism to select event times, and corresponding spacecraft information if exposure is needed. There is a section for this information in the FT1 photon data files, but for the GSFC-generated weekly files used here we do not apply it, since the photon data already respects it, as does the spacecraft information in the FT2 files. This code is intended to support external use.

class GTI[source]

GTI(times:interleaved (start,stop) times in MJD, name:str name to associate='', **kwargs)

Define a functor that returns the acceptability of times. Note: internally the MJD time scale is used.

GTI.__call__[source]

GTI.__call__(times:array of MJD times)

Return an array of bool for the acceptability of the times
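The functor behavior can be sketched with numpy's searchsorted, assuming only the interleaved (start, stop) convention described above (SimpleGTI is a hypothetical stand-in, not the actual class):

```python
import numpy as np

class SimpleGTI:
    """Minimal GTI sketch: interleaved (start, stop) MJD times."""
    def __init__(self, times):
        self.times = np.asarray(times, float)
        if np.any(np.diff(self.times) < 0):
            raise ValueError('expect ascending time values')

    def __call__(self, mjd):
        # A time falls inside an interval iff its insertion point is odd,
        # i.e. it lands after a start but before the matching stop
        return np.searchsorted(self.times, np.atleast_1d(mjd), side='right') % 2 == 1

gti = SimpleGTI([0, 1, 2, 3])      # two intervals: [0,1] and [2,3]
print(gti([0.5, 1.5, 2.5]))
```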

Apply an external GTI

Copied the files nogrb.gti and nosolarflares.gti from /nfs/farm/g/glast/g/catalog/P8_P305 to config.datapath

display_markdown("""\
## FT1 and FT2 -- data access

Fermi data is distributed in two FITS file formats:
- FT1: photon data
- FT2: spacecraft history
""")

FT1 and FT2 -- data access

Fermi data is distributed in two FITS file formats:

  • FT1: photon data
  • FT2: spacecraft history

get_ft1_data[source]

get_ft1_data(config, ft1_file)

Read in a photon data (FT1) file, bin in energy and position to convert to a compact table

  • ft1_file -- A weekly file from GSFC

Depends on config items

  • theta_cut, z_cut -- selection criteria
  • ebins, etypes -- define band index
  • nside, nest -- define HEALPix binning

Returns a dict with keys

  • tstart, the start MET time from the FT1 header

  • photons: a dict with keys and entries for each selected photon

    • band (uint8): energy band index*2 + 0,1 for Front/Back
• nest_index if nest else ring_index (uint32): HEALPix index for the nside
    • run_ref (uint8) reference to the run number, in the array runs
• trun (uint32): time since the run start, in 2 $\mu s$ units
  • gti_times -- GTI times as an interleaved start, stop array.

  • runs -- a list of the run numbers, each a MET time. Expect ~109 per week

For the selected events above 100 MeV, this represents 10 bytes per photon, vs. 27 in the FT1 data
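The 10-byte figure follows directly from the four per-photon column dtypes listed above; a quick check with numpy:

```python
import numpy as np

# Per-photon columns and their dtypes, as listed above
columns = dict(
    band=np.uint8,        # energy band index * 2 + Front/Back
    nest_index=np.uint32, # HEALPix pixel
    run_ref=np.uint8,     # index into the runs array
    trun=np.uint32,       # 2 us offset from the run start
)
nbytes = sum(np.dtype(t).itemsize for t in columns.values())
print(nbytes)  # 10
```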

get_ft2_info[source]

get_ft2_info(config, filename, gti=<lambda>)

Process a FT2 file, with S/C history data, and return a summary dict

Parameters:

  • config -- verbose, cos_theta_max, z_max
  • filename -- spacecraft (FT2) file
• gti -- GTI object that checks for allowed intervals, in MJD units

Returns: A dict with fields consistent with GTI if specified

  • start, stop -- interval in MJD units
  • livetime -- sec
• ra_scz, dec_scz -- spacecraft direction
  • ra_zenith, dec_zenith -- local zenith
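The returned arrays make livetime bookkeeping straightforward; a sketch with synthetic values (the real start/stop/livetime arrays come from the FT2 file, and these numbers are illustrative):

```python
import numpy as np

day = 1 / 86400                      # one second expressed in MJD units
start = np.array([0.0, 30 * day])    # interval starts (MJD)
stop = start + 30 * day              # 30-s intervals
livetime = np.array([27.0, 28.5])    # seconds, deadtime removed

# Livetime fraction per interval: live seconds / elapsed seconds
frac = livetime / ((stop - start) / day)
print(frac.round(2))
```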

FermiData

With superclass GSFCweekly, organize the weekly FT1 and FT2 files from GSFC into local weekly summary files

display_markdown( """### Test download from GSFC and data handling
Download, then process FT1 and FT2 files.
""")
gd = GSFCweekly(Config(verbose=3))
if gd.config.valid:
    ft2, ft1 = gd.download(-2)
    v1 = get_ft1_data(gd.config, Path(gd.local_path)/ft1);

    print(f'Contents of FT1 summary output dict\n'
          ' key        value type')
    for (k,x) in v1.items():
        print(f'  {k:10} {x.__class__.__name__}')
    
    v2 = get_ft2_info(gd.config, Path(gd.local_path)/ft2)
    print(f'Contents of FT2 summary output dict\n'
          ' key        value type')
    for (k,x) in v2.items():
        print(f'  {k:10} {x.__class__.__name__}')

Test download from GSFC and data handling

Download, then process FT1 and FT2 files.

GSFCweekly: spacecraft/lat_spacecraft_weekly_w751_p310_v001.fits --> week751_ft2.fits
GSFCweekly: photon/lat_photon_weekly_w751_p305_v001.fits --> week751_ft1.fits
FT1: week751_ft1.fits, GTI range 687917276.7-688520848.1:  1,986,768 photons
	Selection E > 100 MeV. theta<66.4 and z<100 remove: 82.50%
FT1 data from file week751_ft1.fits: tstart=687917276 (UTC 2022-10-20) selected 347,624/1,986,768 photons, in 107 runs.
Contents of FT1 summary output dict
 key        value type
  tstart     float
  photons    dict
  gti_times  ndarray
  runlist    ndarray
FT2: week751_ft2.fits, MET range 687917276.7-688520848.1, 17263 entries, 17263 (100.0%) in GTI
Contents of FT2 summary output dict
 key        value type
  start      ndarray
  stop       ndarray
  livetime   ndarray
  ra_scz     ndarray
  dec_scz    ndarray
  ra_zenith  ndarray
  dec_zenith ndarray
gti = v1['gti_times']
gti - gti[0]                  # elapsed MET from the week start
g = np.diff(gti)[1::2]        # gaps between GTI intervals
b = np.diff(gti)[0::2]        # durations of the GTI intervals
gti[-1]-gti[0], pd.Series(g).describe(), pd.Series(b).describe(), np.sum(g), np.sum(b)
(603571.4286704063,
 count     106.000000
 mean      823.182803
 std       716.703561
 min        13.564546
 25%        13.582048
 50%      1101.074372
 75%      1484.577808
 max      1818.581802
 dtype: float64,
 count     107.000000
 mean     4825.364968
 std       936.300554
 min       224.607790
 25%      4398.919107
 50%      4796.432692
 75%      5686.417211
 max      5956.435452
 dtype: float64,
 87257.3771084547,
 516314.05156195164)

class FermiData[source]

FermiData(config=None) :: GSFCweekly

Manage the full data set in weekly chunks

  • Checking the current set of files at GSFC
  • downloading a week at a time to a local tmp
  • Converting to condensed format and saving to pickled dicts in wtlike_data

GSFCweekly.download[source]

GSFCweekly.download(week)

Download the given week's files to the local folder

week -- the mission week number, starting at 9. If negative, get recent one

return the ft1, ft2 local filenames

FermiData.needs_update[source]

FermiData.needs_update()

Compare files on disk with the GSFC list and compile a list of those that need to be downloaded

Check the file date of the last one on disk and update it as well if its file date differs

FermiData.check_data[source]

FermiData.check_data()

Return: sorted list of summary files, last week number, number of days in last week

check_data[source]

check_data(config=None, update=False)

Print current status, and update if requested

update_data[source]

update_data(config=None)

Bring all of the local week data summaries up to date, downloading the missing ones from GSFC. If the last one is the current week, check to see whether it needs updating by comparing its file date, in days, from the last update with the current one at GSFC.

DataView

class DataView[source]

DataView(interval:tuple=(0, 0), config:Config | None=None, gti:GTI | None=None, nside:int=128, bmin:int=0)

Manage various views of the data set

Constructor selects a time range

- interval : a (start,stop) tuple (MJD). If (0,0), access all data
- gti : GTI
- nside [128] default HEALPix nside to use for maps
- bmin [0] minimum energy band index
display_markdown("""### Test GTI
Extract photon data for a week containing two solar flares, check that 
they get excluded.
""")
interval = (55990, 55995)
dv = DataView(interval, gti=GTI('nosolarflares.gti'))
print(dv)
wk = dv.photon_week(dv[0])
t = wk['time']
dv_nocut = DataView(interval)
wk_nocut = dv_nocut.photon_week(dv_nocut[0])
bins = np.linspace(t[0],t[-1], 100)
fig, ax = plt.subplots(figsize=(6,2))
ax.hist(wk_nocut['time'], bins=bins, color='lightgrey',label='all data');
ax.hist(t,bins=bins, color='green', label='selected');
ax.set(xlabel='MJD', xlim=interval)
ax.legend();

Test GTI

Extract photon data for a week containing two solar flares, check that they get excluded.

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
/tmp/ipykernel_3114/1863565176.py in <module>
      4 """)
      5 interval = (55990, 55995)
----> 6 dv = DataView(interval, gti=GTI('nosolarflares.gti'))
      7 print(dv)
      8 wk =dv.photon_week(dv[0])

/tmp/ipykernel_3114/1345195245.py in __init__(self, times, name, **kwargs)
     12         self.times = times
     13         self.name = name
---> 14         if np.any(np.diff(self.times)<0):
     15             raise ValueError( f'Bad input data from expect ascending time values.')
     16 

<__array_function__ internals> in diff(*args, **kwargs)

~/miniconda3-220301/envs/astro/lib/python3.9/site-packages/numpy/lib/function_base.py in diff(a, n, axis, prepend, append)
   1256     nd = a.ndim
   1257     if nd == 0:
-> 1258         raise ValueError("diff requires input that is at least one dimensional")
   1259     axis = normalize_axis_index(axis, nd)
   1260 

ValueError: diff requires input that is at least one dimensional

display_markdown("""## PixelView
Manage the data as a FITS file accepted by pointlike

This class uses a DataView, or a FITS file to load a set of pixels with counts
for a time interval.

The FITS format has two HDUs with columns as shown.

__BANDS__ 
* NSIDE (J) -- power of 2 up to 4096
* E_MIN (D), E_MAX (D) 
* EVENT_TYPE (J) -- 0 or 1 for Front, Back

__SKYMAP__
* PIX (I) -- pixel index (RING) ordering now, but can be specified in header.
* CHANNEL (I) -- band index
* VALUE (J) -- number of events

Currently the wtlike photons are all stored with nside 1024, while the pointlike files 
go up to 4096, simply because that is the largest for 32 bits.


""")

class PixelView[source]

PixelView(dataview)

Define a pixel-based database, associating a total count with a set of pixels for each band. That is, make a view that is integrated over time.
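The time integration can be sketched as a plain count accumulation per (band, pixel) pair (the week dicts and column names here are illustrative, not the PixelView internals):

```python
import numpy as np

# Two toy weeks of photon columns: band index and HEALPix pixel per photon
weeks = [
    {'band': np.array([0, 0, 1], np.uint8), 'pix': np.array([10, 10, 42], np.uint32)},
    {'band': np.array([0, 1], np.uint8),    'pix': np.array([10, 42], np.uint32)},
]

# Accumulate a total count per (band, pixel) across all weeks
counts = {}
for wk in weeks:
    for band, pix in zip(wk['band'], wk['pix']):
        key = (int(band), int(pix))
        counts[key] = counts.get(key, 0) + 1

print(counts)
```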

pmap = pm.get_map(6,64)
pm.mollview(8, 32,max=4)
#pm.mollview(6); # second time works--mystery TODO
bands.dtype
nside = bands.NSIDE
e_min = bands.E_MIN
e_max = bands.E_MAX
dr3pfile = Path(os.path.expandvars(
    '$FERMI/pointlike-data/12years_zmax100_4bpd_v2.fits'))
assert dr3pfile.is_file()
hdus = fits.open(dr3pfile)
hdus.info()