Abstract
The Least Mean Square (LMS) algorithm suffers from inherently slow convergence due to its dependence on the eigenvalue spread of the input correlation matrix. In this work, we resolve this problem by developing a novel variant of the LMS algorithm based on the q-derivative concept. The q-gradient is an extension of the classical gradient vector built on the concept of Jackson's derivative. Here, we propose to minimize the LMS cost function by employing the q-derivative instead of the conventional derivative. Because the q-derivative evaluates the secant of the cost function rather than the tangent (as the conventional derivative does), it takes larger steps in the search direction, and we show that this yields faster convergence for q > 1 when compared to the conventional derivative. We then present a thorough investigation of the convergence behavior of the proposed q-LMS algorithm and carry out different analyses to assess its performance. Consequently, new explicit closed-form expressions for the mean-square-error (MSE) behavior are derived. Simulation results are presented to corroborate our theoretical findings.
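To make the mechanism concrete, the sketch below illustrates a q-LMS-style update in Python. It is not taken from the paper: the function name `q_lms_identify`, the use of a single scalar q for all taps, and the parameter values are illustrative assumptions. The sketch rests on Jackson's derivative, D_q f(x) = (f(qx) - f(x)) / ((q - 1)x); for a quadratic cost the q-derivative of x² is (q + 1)x rather than 2x, so with a common q the update amounts to scaling the classical LMS correction by (q + 1)/2, with q = 1 recovering standard LMS.

```python
import numpy as np

def q_lms_identify(x, d, num_taps, mu=0.01, q=1.5):
    """Sketch of a q-LMS-style adaptive filter for system identification.

    Assumption: a single scalar q scales the classical LMS correction
    by (q + 1) / 2, the q-derivative of a quadratic cost relative to
    the conventional derivative. q > 1 takes larger (secant-based)
    steps; q = 1 recovers standard LMS.
    """
    w = np.zeros(num_taps)
    errors = np.empty(len(x))
    scale = (q + 1.0) / 2.0    # q-gradient vs. classical gradient of a quadratic
    buf = np.zeros(num_taps)   # tap-delay line holding the last num_taps inputs
    for n in range(len(x)):
        buf = np.roll(buf, 1)
        buf[0] = x[n]
        e = d[n] - w @ buf            # a priori estimation error
        w = w + mu * scale * e * buf  # q-scaled LMS weight update
        errors[n] = e
    return w, errors

# Toy usage: identify an unknown 4-tap FIR channel from noisy observations.
rng = np.random.default_rng(0)
h = np.array([0.7, -0.3, 0.2, 0.1])            # unknown system
x = rng.standard_normal(2000)
d = np.convolve(x, h)[: len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat, err = q_lms_identify(x, d, num_taps=4, mu=0.02, q=1.5)
print(np.round(w_hat, 3))  # should approach h
```

A per-tap vector of q values, turning `scale` into an elementwise gain, is a natural generalization of this scalar sketch.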
| Original language | English |
| --- | --- |
| Pages (from-to) | 50-60 |
| Number of pages | 11 |
| Journal | Signal Processing |
| Volume | 111 |
| DOIs | |
| State | Published - Jun 2015 |
Bibliographical note
Publisher Copyright: © 2014 Elsevier B.V. All rights reserved.
Keywords
- Adaptive filters
- LMS algorithm
- Steady-state analysis
- Transient analysis
- q-LMS algorithm
- q-gradient
ASJC Scopus subject areas
- Control and Systems Engineering
- Software
- Signal Processing
- Computer Vision and Pattern Recognition
- Electrical and Electronic Engineering