IJCNN
2008
IEEE

Numerical condition of feedforward networks with opposite transfer functions

Numerical condition affects the learning speed and accuracy of most artificial neural network learning algorithms. In this paper, we examine the influence of opposite transfer functions on the conditioning of feedforward neural network architectures. The goal is not to propose a new training algorithm or to discuss error surface geometry, but rather to present characteristics of opposite transfer functions that can be useful for improving existing algorithms or for developing new ones. Our investigation examines two situations: (1) network initialization, and (2) the early stages of the learning process. We provide theoretical motivation for considering opposite transfer functions as a means to improve conditioning in these situations, and we validate these theoretical results with experiments on a subset of common benchmark problems. Our results also reveal the potential of opposite transfer functions in other areas of, and related to, neural networks.
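As a minimal sketch of the idea behind the abstract: in opposition-based learning, the opposite of a transfer function f is commonly taken as its input reflection, f_opp(x) = f(-x). The definition below is an illustrative assumption, not the paper's exact formulation; for the logistic sigmoid it reduces to 1 - f(x).

```python
import math

def sigmoid(x):
    # Standard logistic transfer function.
    return 1.0 / (1.0 + math.exp(-x))

def opposite_sigmoid(x):
    # Assumed opposite transfer function: reflect the input, f_opp(x) = f(-x).
    # For the logistic sigmoid, sigmoid(-x) == 1 - sigmoid(x).
    return sigmoid(-x)
```

Swapping a hidden neuron's transfer function for its opposite keeps outputs in the same range but changes the neuron's response, which is the kind of alternative the paper considers when studying conditioning at initialization and in early training.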
Mario Ventresca, Hamid R. Tizhoosh
Added 31 May 2010
Updated 31 May 2010
Type Conference
Year 2008
Where IJCNN
Authors Mario Ventresca, Hamid R. Tizhoosh