We introduce an algorithm that simultaneously estimates a classification function as well as its gradient in the supervised learning framework. The motivation for the algorithm is...
Cyclic coordinate descent is a classic optimization method that has witnessed a resurgence of interest in machine learning. Reasons for this include its simplicity, speed and stab...
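The entry above names cyclic coordinate descent. As a minimal sketch of the general idea (not the specific method of the cited paper), the following applies cyclic coordinate descent to a least-squares problem, cycling through coordinates and solving each one-dimensional subproblem exactly; the matrix `A`, vector `b`, and sweep count are illustrative choices.

```python
# Sketch of cyclic coordinate descent for min_x ||A x - b||^2.
# Each pass ("sweep") visits coordinates in order and sets x[j]
# to the exact minimizer of the objective along that coordinate.

def cyclic_coordinate_descent(A, b, n_sweeps=100):
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(n_sweeps):
        for j in range(n):
            # Residual with coordinate j's contribution removed.
            r = [b[i] - sum(A[i][k] * x[k] for k in range(n) if k != j)
                 for i in range(m)]
            num = sum(A[i][j] * r[i] for i in range(m))
            den = sum(A[i][j] ** 2 for i in range(m))
            if den > 0:
                x[j] = num / den  # exact 1-D minimizer
    return x

# Small example: the exact least-squares solution is x = [1.0, 1.0].
A = [[2.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [2.0, 1.0, 2.0]
x = cyclic_coordinate_descent(A, b)
```

For a smooth convex objective like this, the sweeps converge to the global minimizer; the per-coordinate update is cheap, which is one reason the method scales well.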
Many real-world learning problems can be recast as multi-task learning problems, which utilize correlations among different tasks to obtain better generalization performance than...
Xi Chen, Weike Pan, James T. Kwok, Jaime G. Carbon...
We have recently introduced an incremental learning algorithm, Learn++.NSE, for Non-Stationary Environments, where the data distribution changes over time due to concept drift. Le...
Convergence of blind delayed source separation algorithms that use constant learning rates is known to be slow. We propose a fuzzy-logic-based approach to adaptively select the...