Nowadays, a vast number of processes, from the mechanisms that regulate human heart rate variability to the performance of manufacturing machines, are monitored and evaluated using signals collected by sensors.
Given the importance of producing an accurate analysis of such information, the choice of an appropriate signal processing technique becomes particularly important.
Generally, signal processing requires the transformation of a time-domain signal into another domain (typically frequency-domain) with the aim of extracting features of interest embedded within the time series that are otherwise not observable.
From a mathematical point of view, the signal can be represented as a series of coefficients obtained from the comparison between the time-domain signal and a set of known functions: the higher the similarity between the signal and the function, the higher the associated coefficient will be.
While straightforward in principle, this method raises the essential question of which template functions to choose to best describe a particular signal without losing fundamental information. The answer would be, of course: “It depends on the signal.”
If the time series is stationary (e.g. white noise), or at least “weakly stationary” (its mean and variance remain constant over time), then the Fourier transform is the first tool to use.
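As a quick illustration of this distinction, weak stationarity can be checked heuristically by comparing means and variances across segments of the series. The sketch below (the segment count and the test signals are illustrative choices) contrasts white noise with a series whose mean drifts:

```python
import numpy as np

# Heuristic weak-stationarity check (illustrative only): split a series
# into segments and compare the segment means and variances.
rng = np.random.default_rng(0)
noise = rng.normal(0, 1, 4000)              # white noise: stationary
trend = noise + np.linspace(0, 5, 4000)     # drifting mean: non-stationary

def segment_stats(x, k=4):
    segs = np.array_split(x, k)
    return ([float(np.mean(s)) for s in segs],
            [float(np.var(s)) for s in segs])

means_a, vars_a = segment_stats(noise)
means_b, vars_b = segment_stats(trend)

# Segment means of the noise stay near zero; the trended series' means drift.
print(np.ptp(means_a), np.ptp(means_b))
```

Formal tests exist (e.g. unit-root tests), but even this crude comparison separates the two cases clearly.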
The Fourier transform is the most widely applied signal processing technique in science and engineering. The reason, according to its inventor (Fourier, 1822), is that:
“An arbitrary function, continuous or with discontinuities, defined in a finite interval by an arbitrary capricious graph can always be expressed as a sum of sinusoids”.
The Fourier transform is essentially a convolution between the signal and a series of sine and cosine functions, which can be viewed as template functions. By measuring the similarity between the signal and these template functions, the technique computes the average frequency information over the entire period analysed. However, it does not retain information about how the signal’s frequency content varies with time (e.g. the output does not reveal whether a particular component is present throughout the time of observation or only at certain intervals).
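This loss of temporal information can be demonstrated with a short sketch (NumPy; the sampling rate and tone frequencies are illustrative choices). Two signals contain the same two tones, one simultaneously and one sequentially, yet their magnitude spectra peak at the same frequencies:

```python
import numpy as np

# The Fourier magnitude spectrum reports which frequencies are present
# on average, but not *when* they occur.
fs = 1000                       # sampling rate (Hz), assumed
t = np.arange(0, 1, 1 / fs)     # 1 s of samples
half = len(t) // 2

# Signal A: 10 Hz and 50 Hz tones present simultaneously.
a = np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 50 * t)

# Signal B: 10 Hz for the first half, then 50 Hz for the second half.
b = np.concatenate([np.sin(2 * np.pi * 10 * t[:half]),
                    np.sin(2 * np.pi * 50 * t[:half])])

freqs = np.fft.rfftfreq(len(t), 1 / fs)
spec_a = np.abs(np.fft.rfft(a))
spec_b = np.abs(np.fft.rfft(b))

# Both spectra peak at 10 Hz and 50 Hz: the transform cannot tell a
# simultaneous mixture from a sequential one.
peaks_a = sorted(freqs[np.argsort(spec_a)[-2:]])
peaks_b = sorted(freqs[np.argsort(spec_b)[-2:]])
print(peaks_a, peaks_b)
```

Both printed peak lists identify the same pair of frequencies, even though the two signals evolve very differently in time.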
Therefore, the Fourier transform technique is not suited to analysing non-stationary signals, where the temporal structure is important.
While Fourier transforms are still widely applied,
“Experience with real-world data, however, soon convinces one that both stationarity and Gaussianity are fairy tales invented for the amusement of undergraduates” (Thomson, 1994).
Therefore, a new signal processing technique, able to deal with non-stationarity, is necessary.
A first, straightforward solution is to perform a “time-localised Fourier transform”, or short-time Fourier transform (STFT). This method, introduced by D. Gabor in 1946, analyses the time series using a window of a certain length that slides along the time axis, performing a localised Fourier transform at each position. The underlying assumption is that the signal can be considered stationary within the window. Again the similarity between the signal and the time-shifted, frequency-modulated window function is measured, this time retaining information about the temporal structure. While broadly adopted for different types of applications (e.g. transient signals and narrowband random signals), the STFT has considerable limitations: the choice of the window function directly affects the time and frequency results of the analysis, and the time and frequency resolutions cannot both be chosen arbitrarily, since one trades off against the other. As the frequency content of a signal is generally not known a priori, the right window size for an STFT decomposition cannot always be chosen in advance. For this reason, new methods were sought for better processing of non-stationary signals, leading to the wavelet transform.
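The sliding-window idea can be sketched in a few lines (the window length, hop size, and Hann taper below are illustrative choices, not prescribed by the method). Applied to a sequential two-tone signal, the per-frame spectra recover the timing that a single Fourier transform discards:

```python
import numpy as np

# Minimal STFT sketch: slide a window along the signal and take an FFT
# of each (assumed locally stationary) segment.
def stft(x, fs, win_len=256, hop=128):
    window = np.hanning(win_len)                 # tapering reduces leakage
    starts = range(0, len(x) - win_len + 1, hop)
    frames = np.array([x[s:s + win_len] * window for s in starts])
    spec = np.abs(np.fft.rfft(frames, axis=1))   # one spectrum per frame
    freqs = np.fft.rfftfreq(win_len, 1 / fs)
    times = np.array([s + win_len / 2 for s in starts]) / fs
    return times, freqs, spec

# Sequential test signal: 10 Hz in the first half, 50 Hz in the second.
fs = 1000
t = np.arange(0, 1, 1 / fs)
half = len(t) // 2
x = np.concatenate([np.sin(2 * np.pi * 10 * t[:half]),
                    np.sin(2 * np.pi * 50 * t[:half])])

times, freqs, spec = stft(x, fs)
dominant = freqs[spec.argmax(axis=1)]            # strongest frequency per frame
# Early frames peak near 10 Hz, late frames near 50 Hz: timing is recovered,
# though frequency resolution is limited by the window length (fs / win_len).
print(dominant[0], dominant[-1])
```

Note the trade-off in action: with a 256-sample window the frequency bins are about 4 Hz apart, so the dominant frequencies are recovered only to that precision; a longer window would sharpen the frequency estimate at the cost of coarser time localisation.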
Wavelets, as a set of rectangular basis functions, were first introduced by A. Haar in 1909. However, little progress was made in this field until the 1930s, when Lévy investigated Brownian motion using signal processing techniques. Lévy found that the Haar basis function could be scaled into different intervals, providing more precise modelling than the Fourier basis. While a large number of people contributed to the wavelet research field from the 1930s to the 1970s, a major breakthrough occurred only in 1984, when J. Morlet and A. Grossman proposed a theoretical formulation for the transform.
The major benefit of the wavelet transform, in contrast to the STFT, is that it enables variable window sizes when analysing different frequency components within a signal.
To obtain the decomposition of the signal, the time series is compared with a set of template functions obtained by scaling and shifting a base function (the mother wavelet). The wavelet transform is capable of extracting the constituent features of a signal over its entire spectrum, using small scales for the high-frequency components and large scales for the low-frequency ones. The 1990s brought multiresolution analysis, which cemented the success of the wavelet in signal processing. Using multiresolution analysis to design the scaling function of the wavelet, other researchers were able to construct their own base wavelets (e.g. the Daubechies wavelet family for digital filtering), allowing the wavelet transform to proliferate in many fields such as image processing and signal analysis.
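A rough sketch of a continuous wavelet transform makes the scale-and-shift idea concrete. The real-valued Morlet-like mother wavelet, the two scales, and the normalisation below are illustrative assumptions, not a production implementation:

```python
import numpy as np

# Template functions are scaled (and, via convolution, shifted) copies of
# one mother wavelet; here a real-valued Morlet-like wavelet.
def morlet(n, scale, fs, w0=6.0):
    t = (np.arange(n) - n // 2) / fs             # time axis centred on zero
    return np.cos(w0 * t / scale) * np.exp(-0.5 * (t / scale) ** 2)

def cwt(x, scales, fs):
    # One row of coefficients per scale: small scales respond to high
    # frequencies, large scales to low frequencies.
    return np.array([np.convolve(x, morlet(len(x), s, fs), mode="same")
                     / np.sqrt(s) for s in scales])

# Sequential test signal: 10 Hz in the first half, 50 Hz in the second.
fs = 1000
t = np.arange(0, 1, 1 / fs)
half = len(t) // 2
x = np.concatenate([np.sin(2 * np.pi * 10 * t[:half]),
                    np.sin(2 * np.pi * 50 * t[:half])])

w0 = 6.0
scales = [w0 / (2 * np.pi * 10), w0 / (2 * np.pi * 50)]  # ~10 Hz and ~50 Hz
coeffs = cwt(x, scales, fs)

# The large scale responds in the first half, the small scale in the
# second: both frequency and timing are preserved.
e10_first = np.sum(coeffs[0, :half] ** 2)
e10_second = np.sum(coeffs[0, half:] ** 2)
e50_first = np.sum(coeffs[1, :half] ** 2)
e50_second = np.sum(coeffs[1, half:] ** 2)
print(e10_first > e10_second, e50_second > e50_first)
```

Established libraries (e.g. PyWavelets) provide carefully normalised implementations of this and of the discrete wavelet families mentioned above; the sketch only shows the mechanism.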
In conclusion, given its greater versatility and precision in modelling time series, the wavelet transform is the preferable choice for analysing experimentally measured signals, where the temporal structure and the frequency components are unknown.
Thanks to Andrew Simmons and Nicola Pastorello for their help and useful suggestions/corrections.