Adaptive Signal Processing: Next Generation Solutions

ISBN: 978-0-470-19517-8
Hardcover
424 pages
March 2010, Wiley-IEEE Press
List Price: US $161.25
Government Price: US $111.32
This is a Print-on-Demand title. It will be printed specifically to fill your order. Please allow an additional 10-15 days for delivery. The book is not returnable.

Preface xi

Contributors xv

Chapter 1 Complex-Valued Adaptive Signal Processing 1

1.1 Introduction 1

1.1.1 Why Complex-Valued Signal Processing 3

1.1.2 Outline of the Chapter 5

1.2 Preliminaries 6

1.2.1 Notation 6

1.2.2 Efficient Computation of Derivatives in the Complex Domain 9

1.2.3 Complex-to-Real and Complex-to-Complex Mappings 17

1.2.4 Series Expansions 20

1.2.5 Statistics of Complex-Valued Random Variables and Random Processes 24

1.3 Optimization in the Complex Domain 31

1.3.1 Basic Optimization Approaches in R^N 31

1.3.2 Vector Optimization in C^N 34

1.3.3 Matrix Optimization in C^N 37

1.3.4 Newton-Variant Updates 38

1.4 Widely Linear Adaptive Filtering 40

1.4.1 Linear and Widely Linear Mean-Square Error Filter 41

1.5 Nonlinear Adaptive Filtering with Multilayer Perceptrons 47

1.5.1 Choice of Activation Function for the MLP Filter 48

1.5.2 Derivation of Back-Propagation Updates 55

1.6 Complex Independent Component Analysis 58

1.6.1 Complex Maximum Likelihood 59

1.6.2 Complex Maximization of Non-Gaussianity 64

1.6.3 Mutual Information Minimization: Connections to ML and MN 66

1.6.4 Density Matching 67

1.6.5 Numerical Examples 71

1.7 Summary 74

1.8 Acknowledgment 76

1.9 Problems 76

References 79

Chapter 2 Robust Estimation Techniques for Complex-Valued Random Vectors 87

2.1 Introduction 87

2.1.1 Signal Model 88

2.1.2 Outline of the Chapter 90

2.2 Statistical Characterization of Complex Random Vectors 91

2.2.1 Complex Random Variables 91

2.2.2 Complex Random Vectors 93

2.3 Complex Elliptically Symmetric (CES) Distributions 95

2.3.1 Definition 96

2.3.2 Circular Case 98

2.3.3 Testing the Circularity Assumption 99

2.4 Tools to Compare Estimators 102

2.4.1 Robustness and Influence Function 102

2.4.2 Asymptotic Performance of an Estimator 106

2.5 Scatter and Pseudo-Scatter Matrices 107

2.5.1 Background and Motivation 107

2.5.2 Definition 108

2.5.3 M-Estimators of Scatter 110

2.6 Array Processing Examples 114

2.6.1 Beamformers 114

2.6.2 Subspace Methods 115

2.6.3 Estimating the Number of Sources 118

2.6.4 Subspace DOA Estimation for Noncircular Sources 120

2.7 MVDR Beamformers Based on M-Estimators 121

2.7.1 The Influence Function Study 123

2.8 Robust ICA 128

2.8.1 The Class of DOGMA Estimators 129

2.8.2 The Class of GUT Estimators 132

2.8.3 Communications Example 134

2.9 Conclusion 137

2.10 Problems 137

References 138

Chapter 3 Turbo Equalization 143

3.1 Introduction 143

3.2 Context 144

3.3 Communication Chain 145

3.4 Turbo Decoder: Overview 147

3.4.1 Basic Properties of Iterative Decoding 151

3.5 Forward-Backward Algorithm 152

3.5.1 With Intersymbol Interference 160

3.6 Simplified Algorithm: Interference Canceler 163

3.7 Capacity Analysis 168

3.8 Blind Turbo Equalization 173

3.8.1 Differential Encoding 179

3.9 Convergence 182

3.9.1 Bit Error Probability 187

3.9.2 Other Encoder Variants 190

3.9.3 EXIT Chart for Interference Canceler 192

3.9.4 Related Analyses 194

3.10 Multichannel and Multiuser Settings 195

3.10.1 Forward-Backward Equalizer 196

3.10.2 Interference Canceler 197

3.10.3 Multiuser Case 198

3.11 Concluding Remarks 199

3.12 Problems 200

References 206

Chapter 4 Subspace Tracking for Signal Processing 211

4.1 Introduction 211

4.2 Linear Algebra Review 213

4.2.1 Eigenvalue Decomposition 213

4.2.2 QR Factorization 214

4.2.3 Variational Characterization of Eigenvalues/Eigenvectors of Real Symmetric Matrices 215

4.2.4 Standard Subspace Iterative Computational Techniques 216

4.2.5 Characterization of the Principal Subspace of a Covariance Matrix from the Minimization of a Mean Square Error 218

4.3 Observation Model and Problem Statement 219

4.3.1 Observation Model 219

4.3.2 Statement of the Problem 220

4.4 Preliminary Example: Oja’s Neuron 221

4.5 Subspace Tracking 223

4.5.1 Subspace Power-Based Methods 224

4.5.2 Projection Approximation-Based Methods 230

4.5.3 Additional Methodologies 232

4.6 Eigenvectors Tracking 233

4.6.1 Rayleigh Quotient-Based Methods 234

4.6.2 Eigenvector Power-Based Methods 235

4.6.3 Projection Approximation-Based Methods 240

4.6.4 Additional Methodologies 240

4.6.5 Particular Case of Second-Order Stationary Data 242

4.7 Convergence and Performance Analysis Issues 243

4.7.1 A Short Review of the ODE Method 244

4.7.2 A Short Review of a General Gaussian Approximation Result 246

4.7.3 Examples of Convergence and Performance Analysis 248

4.8 Illustrative Examples 256

4.8.1 Direction of Arrival Tracking 257

4.8.2 Blind Channel Estimation and Equalization 258

4.9 Concluding Remarks 260

4.10 Problems 260

References 266

Chapter 5 Particle Filtering 271

5.1 Introduction 272

5.2 Motivation for Use of Particle Filtering 274

5.3 The Basic Idea 278

5.4 The Choice of Proposal Distribution and Resampling 289

5.4.1 Choice of Proposal Distribution 290

5.4.2 Resampling 291

5.5 Some Particle Filtering Methods 295

5.5.1 SIR Particle Filtering 295

5.5.2 Auxiliary Particle Filtering 297

5.5.3 Gaussian Particle Filtering 301

5.5.4 Comparison of the Methods 302

5.6 Handling Constant Parameters 305

5.6.1 Kernel-Based Auxiliary Particle Filter 306

5.6.2 Density-Assisted Particle Filter 308

5.7 Rao–Blackwellization 310

5.8 Prediction 314

5.9 Smoothing 316

5.10 Convergence Issues 320

5.11 Computational Issues and Hardware Implementation 323

5.12 Acknowledgments 324

5.13 Exercises 325

References 327

Chapter 6 Nonlinear Sequential State Estimation for Solving Pattern-Classification Problems 333

6.1 Introduction 333

6.2 Back-Propagation and Support Vector Machine-Learning Algorithms: Review 334

6.2.1 Back-Propagation Learning 334

6.2.2 Support Vector Machine 337

6.3 Supervised Training Framework of MLPs Using Nonlinear Sequential State Estimation 340

6.4 The Extended Kalman Filter 341

6.4.1 The EKF Algorithm 344

6.5 Experimental Comparison of the Extended Kalman Filtering Algorithm with the Back-Propagation and Support Vector Machine Learning Algorithms 344

6.6 Concluding Remarks 347

6.7 Problems 348

References 348

Chapter 7 Bandwidth Extension of Telephony Speech 349

7.1 Introduction 349

7.2 Organization of the Chapter 352

7.3 Nonmodel-Based Algorithms for Bandwidth Extension 352

7.3.1 Oversampling with Imaging 353

7.3.2 Application of Nonlinear Characteristics 353

7.4 Basics 354

7.4.1 Source-Filter Model 355

7.4.2 Parametric Representations of the Spectral Envelope 358

7.4.3 Distance Measures 362

7.5 Model-Based Algorithms for Bandwidth Extension 364

7.5.1 Generation of the Excitation Signal 365

7.5.2 Vocal Tract Transfer Function Estimation 369

7.6 Evaluation of Bandwidth Extension Algorithms 383

7.6.1 Objective Distance Measures 383

7.6.2 Subjective Distance Measures 385

7.7 Conclusion 388

7.8 Problems 388

References 390

Index 393
