Nonlinear Least-Squares Time-Difference Estimation from Sub-Nyquist-Rate Samples

Koji HARADA  Hideaki SAKAI  

Publication
IEICE TRANSACTIONS on Fundamentals of Electronics, Communications and Computer Sciences   Vol.E95-A   No.7   pp.1117-1124
Publication Date: 2012/07/01
Online ISSN: 1745-1337
DOI: 10.1587/transfun.E95.A.1117
Print ISSN: 0916-8508
Type of Manuscript: PAPER
Category: Digital Signal Processing
Keyword: TDOA, time-delay estimation, time-difference estimation, innovation-rate sampling, sampling kernel

Summary: 
In this paper, time-difference estimation of filtered random signals passed through multipath channels is discussed. First, we reformulate the approach based on innovation-rate sampling (IRS) to fit our random signal model, and then use the IRS results to drive a nonlinear least-squares (NLS) minimization algorithm. This hybrid approach (referred to as the IRS-NLS method) provides consistent estimates even under sub-Nyquist sampling, provided that compactly supported sampling kernels satisfying the recently developed nonaliasing condition in the frequency domain are used. Numerical simulations show that the proposed IRS-NLS method improves performance over the straightforward IRS method and achieves approximately the same performance as the NLS method at a reduced sampling rate, even for closely spaced time delays. For a fixed observation time, this enables a significant reduction in the required number of samples while maintaining the same level of estimation performance.
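
The following is a minimal Python sketch of the nonlinear least-squares idea only, not of the paper's IRS-NLS method: the random signal model, the multipath channel, the IRS front end, and the sampling-kernel design are omitted, and the function estimate_delay_nls, its parameters, and the toy signal values are illustrative assumptions. It estimates a single sub-sample time difference between two noisy observations by minimizing a squared-error cost over a candidate delay, using a coarse grid search followed by a bounded 1-D refinement.

# Hedged sketch of NLS time-difference estimation (assumed setup, not the paper's algorithm).
import numpy as np
from scipy.optimize import minimize_scalar

def estimate_delay_nls(x, y, fs, max_delay):
    """Estimate the delay of y relative to x (in seconds) by NLS fitting."""
    n = len(x)
    f = np.fft.rfftfreq(n, d=1.0 / fs)          # frequency grid (Hz)
    X, Y = np.fft.rfft(x), np.fft.rfft(y)

    def cost(tau):
        # Shift x by tau via a frequency-domain phase ramp and compare with y;
        # the optimal complex gain is folded in by a least-squares projection.
        Xs = X * np.exp(-2j * np.pi * f * tau)
        g = np.vdot(Xs, Y) / np.vdot(Xs, Xs)     # least-squares gain
        return np.sum(np.abs(Y - g * Xs) ** 2)

    # Coarse grid search to avoid local minima, then a bounded refinement.
    taus = np.linspace(-max_delay, max_delay, 401)
    tau0 = taus[np.argmin([cost(t) for t in taus])]
    res = minimize_scalar(cost, bounds=(tau0 - 1.0 / fs, tau0 + 1.0 / fs),
                          method="bounded")
    return res.x

# Toy usage: a band-limited random signal delayed by 0.37 samples (values are illustrative).
rng = np.random.default_rng(0)
fs, n, true_tau = 1000.0, 2048, 0.37e-3
mask = np.fft.rfftfreq(n, 1 / fs) < 100.0        # keep content below 100 Hz
s = np.fft.irfft(np.fft.rfft(rng.standard_normal(n)) * mask, n)
f = np.fft.rfftfreq(n, 1 / fs)
d = np.fft.irfft(np.fft.rfft(s) * np.exp(-2j * np.pi * f * true_tau), n)
tau_hat = estimate_delay_nls(s + 0.01 * rng.standard_normal(n),
                             d + 0.01 * rng.standard_normal(n),
                             fs, max_delay=5.0 / fs)
print(f"true delay {true_tau*1e3:.3f} ms, estimate {tau_hat*1e3:.3f} ms")

The frequency-domain phase ramp allows delays that are a non-integer number of samples, which is why a continuous 1-D minimization (rather than a sample-index search) is used in the refinement step.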