The Data Quality Process for XMM-Newton Data
When performing a search with the XSA, the observation "Details" panel (opened by clicking the magnifying glass in the Results tab) gives access to a data "Quality Report". A short description of how it is generated follows.
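Besides the web interface, observation metadata and the pipeline products that carry the quality information can also be retrieved programmatically. The following is a minimal sketch using the astroquery XMM-Newton module; the table name (v_public_observations), the column name (observation_id), the example observation identifier, and the download parameters are assumptions about the current XSA TAP schema and interface, and should be checked against the archive documentation.

    # Minimal sketch: query XSA metadata and fetch pipeline (PPS) products.
    from astroquery.esa.xmm_newton import XMMNewton

    obsid = "0505720401"  # example observation identifier (illustrative only)

    # Query the XSA TAP service for the basic metadata of this observation.
    # Table and column names are assumptions about the public XSA TAP schema.
    meta = XMMNewton.query_xsa_tap(
        f"SELECT * FROM v_public_observations WHERE observation_id = '{obsid}'"
    )
    print(meta)

    # Download the pipeline (PPS) products as a tar file for offline
    # inspection; the quality-related products are distributed with them.
    XMMNewton.download_data(obsid, level="PPS", filename=f"{obsid}_pps.tar")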
A data quality process has been systematically applied to all XMM-Newton data since the beginning of the mission. Pipeline processing of all XMM-Newton ODFs is part of this process. The successful SAS processing of an ODF normally proves that it is "well formed", i.e. compliant with the formal specification.
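As an illustration of what "successful SAS processing" means in practice, the sketch below runs the first SAS preparation tasks on a local ODF; these tasks fail if the ODF is not formally compliant. It assumes a working SAS installation (SAS_DIR and SAS_CCFPATH already set), and the directory path is a placeholder; it is not the archive pipeline itself, only an indicative check a user could run.

    # Minimal sketch, assuming SAS is installed and initialised in this shell
    # environment; the ODF directory below is a placeholder.
    import os
    import subprocess

    odf_dir = "/data/xmm/0505720401/odf"   # placeholder ODF directory
    os.environ["SAS_ODF"] = odf_dir

    # cifbuild creates the Calibration Index File for the observation epoch;
    # it reads the ODF pointed to by SAS_ODF.
    subprocess.run(["cifbuild"], cwd=odf_dir, check=True)
    os.environ["SAS_CCF"] = os.path.join(odf_dir, "ccf.cif")

    # odfingest builds the ODF summary file; a clean exit from both tasks is
    # a basic indication that the ODF is "well formed".
    subprocess.run(["odfingest"], cwd=odf_dir, check=True)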
Before pipeline processing, each ODF is also screened at the XMM-Newton Science Operations Centre for pre-processing anomalies, such as attitude jitter, time reconstruction jumps, telemetry losses, housekeeping values out of limits, and counting mode in EPIC exposures, and is checked for correct field acquisition in OM exposures.
The "formal compliance" does not imply that the scientific results can be trusted or that they reflect the status of the instrumental calibrations. For this reason the pipeline processing includes a coordinated international effort to systematically screen and verify the scientific quality of all the PPS products generated by the automatic processing pipeline. The result of this effort is summarized in Data Delivery Notes (DDNs), which are attached to the suite of pipeline products distributed through the XMM-Newton Science Archive.
The Quality Report linked from the XSA user interface combines the results of the ODF pre-processing screening with the contents of the pipeline DDNs.
The Data Quality Report lists the anomalies discovered during a given observation in a verbose form, which should allow even non-experts to understand their meaning and implications. Nonetheless, if anything in the report remains unclear, consider contacting the XMM-Newton HelpDesk for advice.