Case Study 2 – The use of traceable in-situ calibration for the monitoring and correction of detector or sensor system defects

A challenge of atmospheric infrasound measurements is wind-generated turbulent noise. One method to reduce this noise is to use a pipe-array Wind-Noise Reduction System (WNRS). This system samples the pressure field over an array that is large compared to the size of the turbulent eddies (the noise), but small compared to the wavelength of the infrasound in the frequency band of interest (0.01-4 Hz). As a result, the wind noise is effectively averaged out, while the signal remains largely unaffected.
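The sketch below illustrates this averaging principle only; it is not the station's actual processing, and the sampling rate, inlet count, and signal amplitude are illustrative assumptions. It shows that averaging the pressure recorded at many inlets suppresses incoherent turbulent noise (roughly as 1/sqrt(N) in amplitude) while a coherent infrasound signal, whose wavelength spans the whole array, is preserved.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 20.0                       # sampling rate [Hz] (assumed)
t = np.arange(0, 600, 1 / fs)   # ten minutes of synthetic data
n_inlets = 32                   # number of inlets in the pipe array

# Coherent infrasound signal: a 0.2 Hz tone, identical at every inlet.
signal = 0.1 * np.sin(2 * np.pi * 0.2 * t)

# Incoherent wind noise: an independent realisation at each inlet.
noise = rng.normal(scale=1.0, size=(n_inlets, t.size))

single_inlet = signal + noise[0]
array_average = signal + noise.mean(axis=0)

def snr_db(x, s):
    """Rough SNR estimate: signal power over residual-noise power, in dB."""
    return 10 * np.log10(np.mean(s**2) / np.mean((x - s)**2))

print(f"SNR, single inlet   : {snr_db(single_inlet, signal):5.1f} dB")
print(f"SNR, {n_inlets}-inlet average: {snr_db(array_average, signal):5.1f} dB")
# For fully incoherent noise, expect roughly 10*log10(32) ≈ 15 dB improvement.
```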

Although these systems work quite well, defects such as blocked inlets or flooded pipes can alter the infrasound response of the system, resulting in errors in the observed signal. One method to calibrate and monitor the status of these systems uses a co-located reference sensor that is directly open to the atmosphere (without a WNRS). By determining the ratio of the detector under test (DUT) to the reference for times when the two signals are highly coherent, the Gabrielson method can be used to determine the response of the DUT. This procedure is currently used to monitor the status of the detectors, but we will show that it can also be used to correct the signal, such that the proper wave parameters (back azimuth and trace velocity) can be retrieved.
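A minimal sketch of this coherence-gated response estimation is given below. It assumes two time-aligned channels sampled at the same rate (the DUT behind the WNRS and the open reference); the function name, segment length, and coherence threshold are illustrative choices, not those of any operational system.

```python
import numpy as np
from scipy.signal import coherence, csd

def estimate_response(ref, dut, fs, nperseg=4096, min_coherence=0.9):
    """Estimate the DUT/reference frequency response H(f) = S_rd(f) / S_rr(f),
    keeping only frequency bins where the two channels are highly coherent."""
    f, coh = coherence(ref, dut, fs=fs, nperseg=nperseg)
    _, s_rd = csd(ref, dut, fs=fs, nperseg=nperseg)   # cross-spectrum ref -> DUT
    _, s_rr = csd(ref, ref, fs=fs, nperseg=nperseg)   # reference auto-spectrum
    h = s_rd / s_rr
    h[coh < min_coherence] = np.nan                   # discard low-coherence bins
    return f, h, coh
```

The magnitude and phase of H(f) then describe the amplitude and phase response of the DUT (sensor plus WNRS) relative to the reference.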

A temporary WNRS was installed at the same location as the H5 element at IS26, Germany. Several different defects were introduced into this temporary system, so that the measurements of the defective systems could be compared against those of the normally operating H5 element. By determining the response of the DUT and removing this response from the observed signal, we are able to retrieve the correct signal. The example results below are for a system with 24 out of the 32 inlets blocked with rubber stoppers.
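The correction step can be sketched as a frequency-domain deconvolution: the spectrum of the defective channel is divided by the estimated response and transformed back to the time domain. The interpolation onto the FFT frequency grid and the regularisation threshold below are illustrative assumptions, not a prescription of the actual processing.

```python
import numpy as np

def correct_waveform(dut, fs, f_cal, h_cal, eps=1e-3):
    """Remove an estimated frequency response H(f) from a recorded waveform."""
    n = len(dut)
    spec = np.fft.rfft(dut)
    freqs = np.fft.rfftfreq(n, d=1 / fs)

    # Interpolate the calibrated response (real and imaginary parts separately)
    # onto the FFT grid; outside the calibrated band fall back to a response of 1.
    valid = np.isfinite(h_cal)
    h_re = np.interp(freqs, f_cal[valid], h_cal[valid].real, left=1.0, right=1.0)
    h_im = np.interp(freqs, f_cal[valid], h_cal[valid].imag, left=0.0, right=0.0)
    h = h_re + 1j * h_im

    # Regularised spectral division to avoid amplifying bins where |H| is tiny.
    h_safe = np.where(np.abs(h) < eps, eps, h)
    return np.fft.irfft(spec / h_safe, n=n)
```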

[Figure: velocity histograms as a function of frequency. Top: normally operating IS26 array. Middle: difference between the defective system and the normally operating one. Bottom: difference between the corrected defective system and the normally operating one.]

The top panel shows the velocity histogram as a function of frequency for the normally operating IS26 array. The second panel shows the difference between the velocity histograms of the defective system and the normally operating one; an increase in velocity at high frequencies is observed, caused by the change in detector response introduced by the defects. The third panel shows the difference between the corrected defective system and the normally operating one; to better visualize the improvement, its scale is set to the same as that of the second panel. There is almost no difference from the normally operating detector, demonstrating that the signal was properly corrected using the calibration results.

The in-situ calibration has been demonstrated to allow the correction of signals when defects are present. This means that, when defects are detected in a detector but cannot be fixed right away, this procedure can be used to correct the signals and retrieve the correct wave parameters. This could reduce the loss of detections and improve measurement accuracy when defective detectors are identified.