Last modified: 14 December 2012

URL: http://cxc.harvard.edu/csc/proc/cal.html

Calibrate Pipeline


The calibrate pipeline is run for each observation interval (OBI) chosen by the observation selection process to ensure that the most recent calibration data are used for catalog construction.

Reprocess the datasets with the same CALDB

A new bad pixel file is created to ensure that all observation-specific hot pixels and afterglow events are included.

The OBI is then reprocessed to apply the latest calibrations: gain, ACIS time-dependent gain, ACIS charge transfer inefficiency (CTI), HRC degap, and so on. A single CALDB version is used for catalog processing; it is frozen for the duration of the processing run so that the same calibration is applied to all the datasets in that version of the catalog. The same processing tools are used as in the Chandra Standard Data Processing pipeline: acis_process_events for ACIS data and hrc_process_events for HRC data.
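As a rough sketch, the per-instrument tool selection described above can be expressed as command construction. The parameter spellings below are illustrative only and do not reproduce the exact CIAO interfaces of acis_process_events or hrc_process_events:

```python
def build_reprocess_cmd(instrument, evt_in, evt_out, badpix):
    """Pick the per-instrument event-processing tool named in the text.

    Parameter names (infile/outfile/badpixfile) are illustrative, not the
    exact CIAO parameter interface.
    """
    tool = "acis_process_events" if instrument == "ACIS" else "hrc_process_events"
    return [tool,
            f"infile={evt_in}",
            f"outfile={evt_out}",
            f"badpixfile={badpix}"]
```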

Standard filtering is applied to the data: the good time intervals (GTIs) are enforced, and events with bad grades or status flags are removed. ACIS datasets are destreaked, and average dead time corrections are calculated for HRC datasets.
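The GTI and grade/status filtering step can be sketched as a boolean event mask. This is a minimal illustration: the default "good" grade set below is the conventional ASCA one, and the catalog's actual grade and status criteria may differ:

```python
import numpy as np

def filter_events(times, grades, status, gtis, good_grades=(0, 2, 3, 4, 6)):
    """Return a mask keeping events that fall inside a GTI, have a good
    grade, and have a clean (zero) status word.

    good_grades defaults to the conventional ASCA good-grade set; the
    catalog's actual criteria may differ.
    """
    times = np.asarray(times, dtype=float)
    in_gti = np.zeros(times.shape, dtype=bool)
    for start, stop in gtis:
        in_gti |= (times >= start) & (times < stop)
    good = np.isin(grades, good_grades) & (np.asarray(status) == 0)
    return in_gti & good
```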

Identify and remove background flares

A histogram of the data is created for each chip, and the mean and standard deviation of the histogram values are determined. All pixels whose value exceeds (mean + 3*stdev) are flagged as source pixels.
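One reading of this thresholding step, sketched with NumPy (the pipeline's per-chip histogramming details are not reproduced here):

```python
import numpy as np

def source_pixel_mask(chip_image):
    """Flag pixels brighter than mean + 3*stdev of the chip's pixel-value
    distribution, mirroring the (mean + 3*stdev) cut described above."""
    vals = np.asarray(chip_image, dtype=float).ravel()
    threshold = vals.mean() + 3.0 * vals.std()
    return chip_image > threshold
```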

A light curve is created with the Gregory-Loredo algorithm. The minimum of the light curve is calculated, and a threshold is set at 10 times that minimum. For each chip, the time intervals during which the count rate exceeds the threshold are identified and excluded from further catalog analysis.

The GTIs are revised to exclude these background flare periods.
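The flare-rejection rule above can be sketched as follows: keep contiguous light-curve bins whose rate is at most 10 times the minimum, and treat each run of kept bins as a quiescent interval. This is illustrative only; the real pipeline derives the light curve with the Gregory-Loredo algorithm and intersects the surviving intervals with the existing GTIs, bookkeeping that is omitted here:

```python
import numpy as np

def quiescent_intervals(bin_edges, rates, factor=10.0):
    """Return (start, stop) intervals covering contiguous light-curve bins
    whose count rate is <= factor * min(rate), mirroring the 10*min
    threshold described above."""
    rates = np.asarray(rates, dtype=float)
    keep = rates <= factor * rates.min()
    intervals = []
    i, n = 0, len(rates)
    while i < n:
        if keep[i]:
            j = i
            while j + 1 < n and keep[j + 1]:
                j += 1
            intervals.append((bin_edges[i], bin_edges[j + 1]))
            i = j + 1
        else:
            i += 1
    return intervals
```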

Output data products

The calibrate pipeline produces these data products: