
VTC 2006, IEEE

Suboptimal Maximum Likelihood Detection Using Gradient-based Algorithm for MIMO Channels

Abstract— This paper proposes a suboptimal maximum likelihood detection (MLD) algorithm for multiple-input multiple-output (MIMO) communications. The proposed algorithm treats the transmitted signals as continuous variables, a common relaxation for discrete optimization problems, and searches for candidate transmitted signals along a modified gradient vector of the metric. The modified vector emphasizes the gradient components that are likely to cause the noise enhancement from which the zero-forcing (ZF) and minimum mean square error (MMSE) algorithms suffer. The initial guess is set to the ZF or MMSE solution, which can be computed recursively, and the proposed algorithm requires the same complexity order as the ZF algorithm. Computer simulations demonstrate that it achieves better BER performance than conventional suboptimal algorithms whose complexity order equals that of ZF.
Thet Htun Khine, Kazuhiko Fukawa, Hiroshi Suzuki
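The Python sketch below illustrates the general idea only, not the authors' exact method: the ML metric ||y - Hx||^2 is minimized by plain gradient descent with the transmitted symbols relaxed to continuous values, starting from the ZF solution and slicing to the nearest constellation point at the end. The real-valued BPSK model, the fixed step size, and the names zf_estimate and gradient_mld are illustrative assumptions, and the paper's modified gradient weighting against noise enhancement is not reproduced here.

# Minimal sketch (not the authors' exact algorithm): gradient-based
# suboptimal MLD for a real-valued MIMO model y = H x + n, with the
# transmitted symbols relaxed to continuous variables.
import numpy as np

def zf_estimate(H, y):
    """Zero-forcing (least-squares) initial guess, x_zf = pinv(H) y."""
    return np.linalg.pinv(H) @ y

def gradient_mld(H, y, constellation, n_iter=20, step=0.05):
    """Descend the ML metric ||y - H x||^2 from the ZF solution,
    then slice each component to the nearest constellation point."""
    x = zf_estimate(H, y)
    for _ in range(n_iter):
        grad = -2.0 * H.T @ (y - H @ x)   # gradient of the ML metric
        x = x - step * grad               # continuous-valued update
    # hard decision: map each entry to the closest constellation symbol
    return np.array([constellation[np.argmin(np.abs(constellation - xi))]
                     for xi in x])

# Example: 4x4 real channel with BPSK symbols {-1, +1} (assumed setup)
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 4))
x_true = rng.choice([-1.0, 1.0], size=4)
y = H @ x_true + 0.1 * rng.standard_normal(4)
print(gradient_mld(H, y, np.array([-1.0, 1.0])), x_true)

In this simplified form the per-iteration cost is dominated by matrix-vector products, which is consistent with the paper's claim of ZF-order complexity; the paper's contribution lies in how the gradient direction is modified, which is not shown above.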
Type: Conference
Year: 2006
Where: VTC