Transcription

SPECTRAL ANALYSIS OF SIGNALS

Petre Stoica and Randolph Moses

PRENTICE HALL, Upper Saddle River, New Jersey 07458

Library of Congress Cataloging-in-Publication Data

Spectral Analysis of Signals / Petre Stoica and Randolph Moses
    p. cm.
Includes bibliographical references and index.
ISBN 0-13-113956-8
1. Spectral theory (Mathematics)  I. Moses, Randolph  II. Title
512'–dc21  2005
QA814.G27  00-055035  CIP

Acquisitions Editor: Tom Robbins
Editor-in-Chief: ?
Assistant Vice President of Production and Manufacturing: ?
Executive Managing Editor: ?
Senior Managing Editor: ?
Production Editor: ?
Manufacturing Buyer: ?
Manufacturing Manager: ?
Marketing Manager: ?
Marketing Assistant: ?
Director of Marketing: ?
Editorial Assistant: ?
Art Director: ?
Interior Designer: ?
Cover Designer: ?
Cover Photo: ?

© 2005 by Prentice Hall, Inc.
Upper Saddle River, New Jersey 07458

All rights reserved. No part of this book may be reproduced, in any form or by any means, without permission in writing from the publisher.

Printed in the United States of America
10 9 8 7 6 5 4 3 2 1

ISBN 0-13-113956-8

Pearson Education LTD., London
Pearson Education Australia PTY, Limited, Sydney
Pearson Education Singapore, Pte. Ltd
Pearson Education North Asia Ltd, Hong Kong
Pearson Education Canada, Ltd., Toronto
Pearson Educacion de Mexico, S.A. de C.V.
Pearson Education - Japan, Tokyo
Pearson Education Malaysia, Pte. Ltd

Contents

1 Basic Concepts
1.1 Introduction
1.2 Energy Spectral Density of Deterministic Signals
1.3 Power Spectral Density of Random Signals
1.3.1 First Definition of Power Spectral Density
1.3.2 Second Definition of Power Spectral Density
1.4 Properties of Power Spectral Densities
1.5 The Spectral Estimation Problem
1.6 Complements
1.6.1 Coherency Spectrum
1.7 Exercises

2 Nonparametric Methods
2.1 Introduction
2.2 Periodogram and Correlogram Methods
2.2.1 Periodogram
2.2.2 Correlogram
2.3 Periodogram Computation via FFT
2.3.1 Radix–2 FFT
2.3.2 Zero Padding
2.4 Properties of the Periodogram Method
2.4.1 Bias Analysis of the Periodogram
2.4.2 Variance Analysis of the Periodogram
2.5 The Blackman–Tukey Method
2.5.1 The Blackman–Tukey Spectral Estimate
2.5.2 Nonnegativeness of the Blackman–Tukey Spectral Estimate
2.6 Window Design Considerations
2.6.1 Time–Bandwidth Product and Resolution–Variance Tradeoffs in Window Design
2.6.2 Some Common Lag Windows
2.6.3 Window Design Example
2.6.4 Temporal Windows and Lag Windows
2.7 Other Refined Periodogram Methods
2.7.1 Bartlett Method
2.7.2 Welch Method
2.7.3 Daniell Method
2.8 Complements
2.8.1 Sample Covariance Computation via FFT
2.8.2 FFT–Based Computation of Windowed Blackman–Tukey Periodograms
2.8.3 Data and Frequency Dependent Temporal Windows: The Apodization Approach

2.8.4 Estimation of Cross–Spectra and Coherency Spectra
2.8.5 More Time–Bandwidth Product Results
2.9 Exercises

3 Parametric Methods for Rational Spectra
3.1 Introduction
3.2 Signals with Rational Spectra
3.3 Covariance Structure of ARMA Processes
3.4 AR Signals
3.4.1 Yule–Walker Method
3.4.2 Least Squares Method
3.5 Order–Recursive Solutions to the Yule–Walker Equations
3.5.1 Levinson–Durbin Algorithm
3.5.2 Delsarte–Genin Algorithm
3.6 MA Signals
3.7 ARMA Signals
3.7.1 Modified Yule–Walker Method
3.7.2 Two–Stage Least Squares Method
3.8 Multivariate ARMA Signals
3.8.1 ARMA State–Space Equations
3.8.2 Subspace Parameter Estimation — Theoretical Aspects
3.8.3 Subspace Parameter Estimation — Implementation Aspects
3.9 Complements
3.9.1 The Partial Autocorrelation Sequence
3.9.2 Some Properties of Covariance Extensions
3.9.3 The Burg Method for AR Parameter Estimation
3.9.4 The Gohberg–Semencul Formula
3.9.5 MA Parameter Estimation in Polynomial Time
3.10 Exercises

4 Parametric Methods for Line Spectra
4.1 Introduction
4.2 Models of Sinusoidal Signals in Noise
4.2.1 Nonlinear Regression Model
4.2.2 ARMA Model
4.2.3 Covariance Matrix Model
4.3 Nonlinear Least Squares Method
4.4 High–Order Yule–Walker Method
4.5 Pisarenko and MUSIC Methods
4.6 Min–Norm Method
4.7 ESPRIT Method
4.8 Forward–Backward Approach
4.9 Complements
4.9.1 Mean Square Convergence of Sample Covariances for Line Spectral Processes
4.9.2 The Carathéodory Parameterization of a Covariance Matrix

4.9.3 Using the Unwindowed Periodogram for Sine Wave Detection in White Noise
4.9.4 NLS Frequency Estimation for a Sinusoidal Signal with Time-Varying Amplitude
4.9.5 Monotonically Descending Techniques for Function Minimization
4.9.6 Frequency-selective ESPRIT-based Method
4.9.7 A Useful Result for Two-Dimensional (2D) Sinusoidal Signals
4.10 Exercises

5 Filter Bank Methods
5.1 Introduction
5.2 Filter Bank Interpretation of the Periodogram
5.3 Refined Filter Bank Method
5.3.1 Slepian Baseband Filters
5.3.2 RFB Method for High–Resolution Spectral Analysis
5.3.3 RFB Method for Statistically Stable Spectral Analysis
5.4 Capon Method
5.4.1 Derivation of the Capon Method
5.4.2 Relationship between Capon and AR Methods
5.5 Filter Bank Reinterpretation of the Periodogram
5.6 Complements
5.6.1 Another Relationship between the Capon and AR Methods
5.6.2 Multiwindow Interpretation of Daniell and Blackman–Tukey Periodograms
5.6.3 Capon Method for Exponentially Damped Sinusoidal Signals
5.6.4 Amplitude and Phase Estimation Method (APES)
5.6.5 Amplitude and Phase Estimation Method for Gapped Data (GAPES)
5.6.6 Extensions of Filter Bank Approaches to Two–Dimensional Signals
5.7 Exercises

6 Spatial Methods
6.1 Introduction
6.2 Array Model
6.2.1 The Modulation–Transmission–Demodulation Process
6.2.2 Derivation of the Model Equation
6.3 Nonparametric Methods
6.3.1 Beamforming
6.3.2 Capon Method
6.4 Parametric Methods
6.4.1 Nonlinear Least Squares Method
6.4.2 Yule–Walker Method
6.4.3 Pisarenko and MUSIC Methods
6.4.4 Min–Norm Method
6.4.5 ESPRIT Method

6.5 Complements
6.5.1 On the Minimum Norm Constraint
6.5.2 NLS Direction-of-Arrival Estimation for a Constant-Modulus Signal
6.5.3 Capon Method: Further Insights and Derivations
6.5.4 Capon Method for Uncertain Direction Vectors
6.5.5 Capon Method with Noise Gain Constraint
6.5.6 Spatial Amplitude and Phase Estimation (APES)
6.5.7 The CLEAN Algorithm
6.5.8 Unstructured and Persymmetric ML Estimates of the Covariance Matrix
6.6 Exercises

APPENDICES

A Linear Algebra and Matrix Analysis Tools
A.1 Introduction
A.2 Range Space, Null Space, and Matrix Rank
A.3 Eigenvalue Decomposition
A.3.1 General Matrices
A.3.2 Hermitian Matrices
A.4 Singular Value Decomposition and Projection Operators
A.5 Positive (Semi)Definite Matrices
A.6 Matrices with Special Structure
A.7 Matrix Inversion Lemmas
A.8 Systems of Linear Equations
A.8.1 Consistent Systems
A.8.2 Inconsistent Systems
A.9 Quadratic Minimization

B Cramér–Rao Bound Tools
B.1 Introduction
B.2 The CRB for General Distributions
B.3 The CRB for Gaussian Distributions
B.4 The CRB for Line Spectra
B.5 The CRB for Rational Spectra
B.6 The CRB for Spatial Spectra

C Model Order Selection Tools
C.1 Introduction
C.2 Maximum Likelihood Parameter Estimation
C.3 Useful Mathematical Preliminaries and Outlook
C.3.1 Maximum A Posteriori (MAP) Selection Rule
C.3.2 Kullback-Leibler Information
C.3.3 Outlook: Theoretical and Practical Perspectives
C.4 Direct Kullback-Leibler (KL) Approach: No-Name Rule

C.5 Cross-Validatory KL Approach: The AIC Rule
C.6 Generalized Cross-Validatory KL Approach: The GIC Rule
C.7 Bayesian Approach: The BIC Rule
C.8 Summary and the Multimodel Approach
C.8.1 Summary
C.8.2 The Multimodel Approach

D Answers to Selected Exercises

References Grouped by Subject

Index


List of Exercises

CHAPTER 1
1.1 Scaling of the Frequency Axis
1.2 Time–Frequency Distributions
1.3 Two Useful Z–Transform Properties
1.4 A Simple ACS Example
1.5 Alt

2.8 Linear Transformation Interpretation of the DFT
2.9 For White Noise the Periodogram is an Unbiased PSD Estimator
2.10 Shrinking the Periodogram
2.11 Asymptotic Maximum Likelihood Estimation of φ(ω) from φ̂_p(ω)
2.12 Plotting the Spectral Estimates in dB
2.13 Finite–Sample Variance/Covariance Analysis of the Periodogram