Differences between MIRA- and MRTCAL-calibrated spectra are expected as a consequence of the different calibration bandwidths. When MIRA calibrates FTS200 spectra, a single value of the calibration parameters is derived and applied per 1.35 GHz natural hardware unit. In contrast, MRTCAL derives and applies the calibration in steps of 20 MHz (the current default for the automatic online data processing; this value can be customized by the user).
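The practical difference is the granularity at which one calibration factor is applied across the band. The following Python fragment is only an illustrative sketch, assuming the calibration reduces to a single multiplicative counts-to-temperature factor per chunk; the function and array names (`staircase_calibration`, `counts`, `chunk_factors`) are hypothetical and are not MIRA or MRTCAL identifiers.

```python
import numpy as np

def staircase_calibration(counts, chunk_factors, chunk_width_mhz, chan_width_mhz):
    """Apply one calibration factor per chunk, constant within each chunk.

    counts        : 1-D array of raw backend counts
    chunk_factors : one counts-to-temperature factor per calibration chunk
                    (must cover the whole band)
    Illustrative sketch only; not the actual MIRA or MRTCAL code.
    """
    nchan = counts.size
    chans_per_chunk = max(1, int(round(chunk_width_mhz / chan_width_mhz)))
    # Repeat each chunk factor over its channels, then trim to the band size.
    per_channel = np.repeat(np.asarray(chunk_factors, dtype=float),
                            chans_per_chunk)[:nchan]
    return counts * per_channel
```

With 1.35 GHz chunks a single factor covers a whole hardware unit, whereas the 20 MHz MRTCAL default spreads roughly 1350 / 20, i.e. about 67 to 68, factors over the same bandwidth.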
The MRTCAL default is intended to improve the quality of the baseline, as can be seen in the four figures below. Each figure displays the average spectrum computed from the same 15-minute On-The-Fly scan; the only difference between the four spectra is the way MRTCAL calibrates the raw data.
In the first figure, the calibration parameters are derived and applied per natural hardware unit (i.e., every 1.35 GHz, as defined by the FTS units). This gives a staircase look to the spectrum.
In the second figure, the calibration parameters are also derived every 1.35 GHz, but they are linearly interpolated before being applied. The staircase look is greatly reduced.
In the third figure, the calibration parameters are derived and applied every 20 MHz, without linear interpolation before application. The staircase look and the baseline oscillations disappear. The atmospheric line around 110.8 GHz now appears in absorption, as it should, because the reference position was located about 1 degree away from the OTF observations.
In the fourth figure, the calibration parameters are derived every 20 MHz and linearly interpolated before being applied. This brings a further improvement over the previous scheme, even though it is not obvious in this plot.
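The four schemes therefore differ only in the chunk width (1.35 GHz versus 20 MHz) and in whether the per-chunk factors are applied as constants or linearly interpolated. Complementing the staircase sketch above, a minimal sketch of the interpolated application could look as follows; again, the names are hypothetical and this is not the MRTCAL source.

```python
import numpy as np

def interpolated_calibration(counts, chunk_factors, chunk_width_mhz, chan_width_mhz):
    """Linearly interpolate the per-chunk factors between chunk centers
    before applying them, which smooths out the staircase steps.
    Illustrative sketch only; not the actual MRTCAL code.
    """
    chunk_factors = np.asarray(chunk_factors, dtype=float)
    nchan = counts.size
    chans_per_chunk = chunk_width_mhz / chan_width_mhz
    # Channel index of each chunk center serves as an interpolation node.
    centers = (np.arange(chunk_factors.size) + 0.5) * chans_per_chunk
    per_channel = np.interp(np.arange(nchan), centers, chunk_factors)
    return counts * per_channel
```

Narrowing the chunk width already removes most of the steps (third case); interpolating the factors then smooths the residual discontinuities at the chunk boundaries (fourth case).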