When Swift telemetry is received from the Mission Operations Center (MOC) by the Swift Data Center (SDC), it triggers a run of the Swift processing pipeline, a detailed script of programs that produces FITS files from raw telemetry, calibrated event lists and cleaned images, and higher-level science products such as light curves and spectra for all Swift instruments. Initial data products appear on the Swift Quick Look Data public Web site. When processing is complete, the products are delivered to the HEASARC archive. All pipeline software will be FTOOLs, which will also be distributed to Swift users. Users can then reprocess and reanalyze data when new calibration information is made available, instead of waiting for eventual reprocessing.
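As a rough illustration of what a Level 1 event product looks like, the sketch below (Python with astropy; the file name is hypothetical) opens an event list and inspects its EVENTS and GTI extensions, which follow the usual OGIP layout.

    # Minimal sketch: inspect a Level 1 event list with astropy.
    # The file name is hypothetical; the EVENTS/GTI extension layout and
    # column names follow standard OGIP conventions for event files.
    from astropy.io import fits

    with fits.open("sw_level1_events.evt") as hdul:   # hypothetical file name
        hdul.info()                    # list all HDUs (primary, EVENTS, GTI, ...)
        events = hdul["EVENTS"].data   # per-photon table: TIME, X, Y, PI, ...
        gti = hdul["GTI"].data         # good time intervals: START, STOP
        print(len(events), "events in", len(gti), "good time intervals")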
Software tools specific to the BAT, XRT and UVOT will apply instrument-specific calibration information and filtering criteria in the pipeline to arrive at calibrated images and screened event lists.
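A minimal sketch of one such screening step, good-time-interval filtering, is given below; the column names follow OGIP conventions and the helper name is our own.

    import numpy as np

    # Sketch of GTI-based screening: keep only events whose TIME falls inside
    # a good time interval. Column names (TIME, START, STOP) are OGIP-style.
    def screen_by_gti(events, gti):
        keep = np.zeros(len(events), dtype=bool)
        for start, stop in zip(gti["START"], gti["STOP"]):
            keep |= (events["TIME"] >= start) & (events["TIME"] < stop)
        return events[keep]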
Figure 1: Top-level flow diagram delineating the Swift processing Levels described here. Raw Swift telemetry is converted to Level 1 FITS files. Some processing is done to Level 1 files in place. Data screening and coordinate transformation result in Level 2 calibrated event lists and sky images. Standard product scripts are run on Level 2 files to produce spectra, light curves and combined sky images.
The UVOT instrument produces a finding chart, which arrives via TDRSS, as well as event and image data taken through any one of eight broadband filters or two grism filters. During an observation, the size and location of the window can change, as can the on-chip binning.
The pipeline will produce cleaned, calibrated event list files, calibrated image files and standard products for each observation. This includes, e.g., high signal-to-noise images of the field generated by combining all individual images obtained using the same filter. Exposure maps are constructed for each combined image. Source lists are derived from the combined images. Provided an optical counterpart to the target has been identified, light curves for each available filter are extracted from all available image and event data.
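The image combination step can be sketched as follows; the code assumes the individual images are already on a common sky grid and carry an EXPOSURE keyword, and the function name is illustrative.

    from astropy.io import fits

    # Sketch: co-add all sky images taken through one filter and accumulate
    # the total exposure. Assumes aligned images with an EXPOSURE keyword.
    def combine_filter_images(paths):
        summed, exposure = None, 0.0
        for path in paths:
            with fits.open(path) as hdul:
                data = hdul[1].data.astype(float)   # image extension
                summed = data if summed is None else summed + data
                exposure += hdul[1].header["EXPOSURE"]
        return summed, exposure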
When a reliable source position is available, grism spectra of the candidate counterpart are obtained from each available grism image. Grism source event tables will be generated, containing the wavelength of each photon, screened according to a spatial mask so that only those events likely to be associated with the candidate counterpart remain. A response matrix will be generated to facilitate the analysis of grism data within Xspec. A response matrix will also be provided so that broad-band fluxes through the standard filters can be fit simultaneously with XRT and BAT data.
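As an illustration of how such an event table could be turned into a one-dimensional spectrum, the sketch below bins the per-photon wavelengths of the mask-screened events; the WAVELENGTH column name and the wavelength grid are assumptions made for the example.

    import numpy as np

    # Sketch: bin the per-photon wavelengths of mask-screened grism events
    # into a one-dimensional spectrum. Column name and grid are illustrative.
    def grism_spectrum(events, wave_min=1700.0, wave_max=6000.0, dwave=10.0):
        bins = np.arange(wave_min, wave_max + dwave, dwave)    # Angstrom grid
        counts, edges = np.histogram(events["WAVELENGTH"], bins=bins)
        return 0.5 * (edges[:-1] + edges[1:]), counts           # bin centers, counts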
The standard event screening criteria for XRT will rely on event grade, detector temperature, spacecraft attitude and GTI information. The XRT standard products (spectra, images and light curves) will be produced in the pipeline, along with ARF files for spectroscopy and an exposure map appropriate for converting counts in an image to flux.
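A simplified version of the grade and temperature screening might look like the following; the column names and the cut values are illustrative, not the mission defaults.

    import numpy as np

    # Sketch of XRT-style event screening: select by event grade and by the
    # CCD temperature recorded in housekeeping. Columns and limits are
    # placeholders, not the actual Swift screening defaults.
    def screen_xrt_events(events, good_grades=range(0, 13), max_temp_c=-50.0):
        grade_ok = np.isin(events["GRADE"], list(good_grades))
        temp_ok = events["CCDTEMP"] <= max_temp_c
        return events[grade_ok & temp_ok]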
The exposure map computed for each image gives the net exposure time per pixel, taking into account attitude reconstruction, spatial quantum efficiency, filter transmission, vignetting and the field of view. For spectra, a Redistribution Matrix File (RMF), which specifies the channel probability distribution for a photon of a given energy, and an Ancillary Response File (ARF), which specifies the telescope effective area and window absorption, will be calculated in the pipeline.
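The role of the RMF and ARF in spectral fitting can be sketched as a forward model: the photon spectrum is weighted by the effective area and redistributed over detector channels by the response matrix. The function below is a schematic version of that fold, not the Xspec implementation.

    import numpy as np

    # Schematic forward fold: predicted counts per channel are the model
    # photon spectrum (photons/cm^2/s per energy bin) times the effective
    # area (ARF), redistributed by the RMF, scaled by the exposure time.
    def predicted_counts(model_flux, arf_area, rmf_matrix, exposure):
        # model_flux, arf_area: length n_energy; rmf_matrix: (n_energy, n_channel)
        specrate = model_flux * arf_area           # photons/s per energy bin
        return exposure * specrate @ rmf_matrix    # expected counts per channel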
For the entire BAT array, a light curve with 1 s time resolution is produced, while light curves with 64 ms time resolution are produced for entire BAT blocks. Light curves with 1.6 s time resolution in four energy ranges are produced for each array quadrant. Light curves with 8 s time resolution in four energy bands record the maximum count rate in each of nine array regions, on five time scales. Mask-tagged light curves with 1.6 s time resolution are generated for each of three mask-tagged sources. Light curves of GRBs derived from event data, and five-minute light curves derived from the survey data for each source detected by the BAT, are also generated in the pipeline.
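All of these products reduce to binning event arrival times (or rates) on a fixed grid; a schematic version for the 64 ms case is shown below, with OGIP-style names assumed.

    import numpy as np

    # Sketch: bin event arrival times into a fixed-resolution light curve,
    # as for the 64 ms BAT burst light curves. The 64 ms bin width comes
    # from the text; the TIME column follows OGIP convention.
    def bin_light_curve(times, t_start, t_stop, dt=0.064):
        edges = np.arange(t_start, t_stop + dt, dt)
        counts, _ = np.histogram(times, bins=edges)
        return 0.5 * (edges[:-1] + edges[1:]), counts / dt   # bin centers, count rate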
The pipeline produces BAT event files and Detector Plane Images (DPIs), needed to generate sky images. DPIs are histogram images of calibrated events, and must be deconvolved with the mask to produce usable sky images. Event files are rebinned to produce burst light curves and spectra. Photon spectra may be derived by fitting count spectra and can be corrected for the effects of partial coding and the reduced off-axis response. Detector response matrices are also calculated in the pipeline.
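The two steps can be sketched as follows: a DPI is a two-dimensional histogram of calibrated events in detector coordinates, and a simple FFT cross-correlation with the mask pattern stands in for the full deconvolution, which also handles mask weighting, partial coding and balancing. The array dimensions and column names are assumptions for the example.

    import numpy as np

    # Sketch: histogram calibrated events into a Detector Plane Image.
    # Detector dimensions and DETX/DETY column names are placeholders.
    def make_dpi(events, nx=286, ny=173):
        dpi, _, _ = np.histogram2d(events["DETY"], events["DETX"],
                                   bins=(ny, nx), range=((0, ny), (0, nx)))
        return dpi

    # Sketch: cross-correlate the DPI with the coded-mask pattern via FFTs,
    # a simplified stand-in for the actual mask deconvolution.
    def correlate_with_mask(dpi, mask):
        shape = tuple(d + m - 1 for d, m in zip(dpi.shape, mask.shape))
        sky = np.fft.irfft2(np.fft.rfft2(dpi, shape) *
                            np.conj(np.fft.rfft2(mask, shape)), shape)
        return sky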
For BAT survey data, count spectra and response matrices will be produced and archived for sources found in the survey. During burst mode, count spectra and response matrices will be generated for bursts before, during and after the slew; photon spectra for the same intervals may also be produced. The pipeline will produce deconvolved sky images containing data from entire snapshots. Images will be available in four energy bands, as well as a broadband image.
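Producing per-band images amounts to splitting the calibrated events by energy before the imaging step, as in the sketch below; the band edges in keV are illustrative rather than the mission definitions.

    import numpy as np

    # Sketch: split events into energy bands before imaging, so a DPI (and
    # hence a sky image) can be built per band plus one broad band.
    BANDS_KEV = [(15, 25), (25, 50), (50, 100), (100, 350)]   # illustrative edges

    def events_by_band(events):
        for lo, hi in BANDS_KEV:
            in_band = (events["ENERGY"] >= lo) & (events["ENERGY"] < hi)
            yield (lo, hi), events[in_band]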
LAST MODIFIED: July 2004