Description
Frequency estimation of sinusoidal signals is a widely studied problem in signal processing because it arises in applications such as radar, sonar, communications, and control. The signal consists of a deterministic sinusoidal component and a noise component, where the noise is assumed to be a zero-mean Gaussian process with unknown variance. The goal is to estimate the frequency of the sinusoid from a finite number of samples of the signal.

In this research, a novel autocorrelation method, called overlapping autocorrelation (OAC), is used to process the signal before applying the Modified Covariance (MC) and Modified Pisarenko Harmonic Decomposition (MPHD) frequency estimation algorithms. The OAC removes most of the noise corruption found in traditional autocorrelation methods while preserving the sinusoidal nature of the signal. Existing frequency estimation algorithms either attain the Cramér-Rao Lower Bound (CRLB) on the variance of an unbiased estimator at a high computational cost, or fail to attain the CRLB at a low computational cost. The proposed method uses the OAC to reduce the noise level of the signal and thereby reduce the mean squared error (MSE) of the estimator with little added computational cost. Although the OAC produces a signal with a lower noise level, the result is a correlated signal, so the CRLB itself does not change.

The main computer simulations calculated the MSE of the estimated frequency while independently varying the SNR, the frequency, and the number of samples of the signal. These simulations show that the proposed methods achieve a smaller MSE than comparable low-computational-cost algorithms at both low and high SNRs, with the most significant improvement at low SNR. Further results show that, over a wide range of frequencies and SNRs, the optimization function that computes the optimal autocorrelation lag for the MPHD method is not needed when the OAC is used to process the signal.
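
Because the OAC and MPHD details are not spelled out in this description, the following is only a minimal Python sketch that makes the stated setup concrete: it generates the signal model described above (a sinusoid in zero-mean white Gaussian noise) and applies the classic single-sinusoid Pisarenko estimator, a standard low-cost baseline of the kind the thesis builds on, to estimate the MSE at a fixed SNR. The function name, parameter values, and simulation settings are illustrative assumptions, not the thesis's algorithms or results.

    import numpy as np

    def pisarenko_single_tone(x, fs=1.0):
        """Classic Pisarenko single-sinusoid frequency estimate from the
        lag-1 and lag-2 sample autocorrelations (a baseline estimator,
        not the thesis's OAC-preprocessed MPHD method)."""
        x = np.asarray(x, dtype=float)
        N = len(x)
        # Biased sample autocorrelations at lags 1 and 2.
        r1 = np.dot(x[:-1], x[1:]) / N
        r2 = np.dot(x[:-2], x[2:]) / N
        # Closed-form root of the Pisarenko polynomial for one real sinusoid.
        cos_w = (r2 + np.sqrt(r2**2 + 8.0 * r1**2)) / (4.0 * r1)
        cos_w = np.clip(cos_w, -1.0, 1.0)
        return np.arccos(cos_w) * fs / (2.0 * np.pi)  # cycles/sample when fs = 1

    # Illustrative Monte Carlo run: MSE of the estimate at one SNR.
    rng = np.random.default_rng(0)
    f0, N, snr_db, amp = 0.12, 128, 5.0, 1.0   # assumed test settings
    noise_var = amp**2 / (2.0 * 10.0**(snr_db / 10.0))  # SNR = A^2 / (2 sigma^2)
    errs = []
    for _ in range(1000):
        n = np.arange(N)
        phase = rng.uniform(0.0, 2.0 * np.pi)
        x = amp * np.cos(2.0 * np.pi * f0 * n + phase) \
            + rng.normal(0.0, np.sqrt(noise_var), N)
        errs.append((pisarenko_single_tone(x) - f0)**2)
    print("MSE at", snr_db, "dB SNR:", np.mean(errs))

In the thesis's framework, the OAC preprocessing step would be applied to x before the estimator, with the aim of lowering the effective noise level and hence the MSE; the simulations summarized above repeat this kind of Monte Carlo experiment while sweeping SNR, frequency, and sample count.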