Discrete Stochastic Processes and Optimal Filtering
ISBN: 978-1-905209-74-3
Hardcover
287 pages
May 2007, Wiley-ISTE
Preface xi
Introduction xiii
Chapter 1. Random Vectors 1
1.1. Definitions and general properties 1
1.2. Spaces L¹(dP) and L²(dP) 20
1.2.1. Definitions 20
1.2.2. Properties 22
1.3. Mathematical expectation and applications 23
1.3.1. Definitions 23
1.3.2. Characteristic functions of a random vector 34
1.4. Second-order random variables and vectors 39
1.5. Linear independence of vectors of L²(dP) 47
1.6. Conditional expectation (for random vectors with a density function) 51
1.7. Exercises for Chapter 1 57
Chapter 2. Gaussian Vectors 63
2.1. Some reminders regarding Gaussian random vectors 63
2.2. Definition and characterization of Gaussian vectors 66
2.3. Results concerning independence 68
2.4. Affine transformation of a Gaussian vector 72
2.5. The existence of Gaussian vectors 74
2.6. Exercises for Chapter 2 85
Chapter 3. Introduction to Discrete Time Processes 93
3.1. Definition 93
3.2. WSS processes and spectral measure 105
3.2.1. Spectral density 106
3.3. Spectral representation of a WSS process 110
3.3.1. Problem 110
3.3.2. Results 111
3.3.2.1. Process with orthogonal increments and associated measures 111
3.3.2.2. Wiener stochastic integral 113
3.3.2.3. Spectral representation 114
3.4. Introduction to digital filtering 115
3.5. Important example: autoregressive process 128
3.6. Exercises for Chapter 3 134
Chapter 4. Estimation 141
4.1. Position of the problem 141
4.2. Linear estimation 144
4.3. Best estimate – conditional expectation 156
4.4. Example: prediction of an autoregressive process AR(1) 165
4.5. Multivariate processes 166
4.6. Exercises for Chapter 4 175
Chapter 5. The Wiener Filter 181
5.1. Introduction 181
5.1.1. Position of the problem 182
5.2. Solution and calculation of the FIR filter 183
5.3. Evaluation of the least error 185
5.4. Solution and calculation of the IIR filter 186
5.5. Evaluation of the least mean square error 190
5.6. Exercises for Chapter 5 191
Chapter 6. Adaptive Filtering: The Gradient Algorithm and the LMS 197
6.1. Introduction 197
6.2. Position of the problem 199
6.3. Data representation 202
6.4. Minimization of the cost function 204
6.4.1. Calculation of the cost function 208
6.5. Gradient algorithm 211
6.6. Geometric interpretation 214
6.7. Stability and convergence 218
6.8. Estimation of gradient and LMS algorithm 222
6.8.1. Convergence of the LMS algorithm 225
6.9. Example application of the LMS algorithm 225
6.10. Exercises for Chapter 6 234
Chapter 7. The Kalman Filter 237
7.1. Position of the problem 237
7.2. Approach to estimation 241
7.2.1. Scalar case 241
7.2.2. Multivariate case 244
7.3. Kalman filtering 245
7.3.1. State equation 245
7.3.2. Observation equation 246
7.3.3. Innovation process 248
7.3.4. Covariance matrix of the innovation process 248
7.3.5. Estimation 250
7.3.6. The Riccati equation 258
7.3.7. Algorithm and summary 260
7.4. Exercises for Chapter 7 262
Table of Symbols and Notations 281
Bibliography 283
Index 285