Numerical condition of feedforward networks with opposite transfer functions

Mario Ventresca, Hamid Reza Tizhoosh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Numerical condition affects the learning speed and accuracy of most artificial neural network learning algorithms. In this paper, we examine the influence of opposite transfer functions on the conditioning of feedforward neural network architectures. The goal is not to present a new training algorithm or an analysis of error surface geometry, but rather to characterize properties of opposite transfer functions that can be useful for improving existing algorithms or for developing new ones. Our investigation examines two situations: (1) network initialization, and (2) the early stages of the learning process. We provide theoretical motivation for using opposite transfer functions as a means to improve conditioning in these situations. These theoretical results are validated by experiments on a subset of common benchmark problems. Our results also reveal the potential of opposite transfer functions in other areas of, and related to, neural networks.
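The conditioning the abstract refers to can be probed numerically. The sketch below is an illustration, not the paper's method: it assumes the "opposite" of the logistic transfer function is its reflection s(-z) (the paper's exact definition may differ), builds the Jacobian of a small one-hidden-layer network's output with respect to all weights over a batch, and compares its condition number when individual hidden units switch to the opposite transfer function. All names and sizes here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def net_jacobian(W1, w2, X, flip):
    """Jacobian of the scalar output y = w2 . act(W1 x) with respect to
    all weights (W1 flattened, then w2), stacked over the batch X.
    `flip[i]` marks hidden unit i as using the assumed opposite transfer
    function s(-z) instead of s(z)."""
    rows = []
    for x in X:
        z = W1 @ x
        z = np.where(flip, -z, z)                 # opposite units see a reflected input
        h = sigmoid(z)
        dh = h * (1.0 - h) * np.where(flip, -1.0, 1.0)
        # dy/dW1[i, j] = w2[i] * dh[i] * x[j];  dy/dw2[i] = h[i]
        g1 = (w2 * dh)[:, None] * x[None, :]
        rows.append(np.concatenate([g1.ravel(), h]))
    return np.array(rows)

n_in, n_hid, n_samples = 3, 4, 20
W1 = rng.normal(size=(n_hid, n_in))
w2 = rng.normal(size=n_hid)
X = rng.normal(size=(n_samples, n_in))

base = np.zeros(n_hid, dtype=bool)
J_std = net_jacobian(W1, w2, X, base)
print("condition number, standard transfer functions:", np.linalg.cond(J_std))

# Flipping a single unit to its opposite transfer function can change the
# condition number; pick the flip that conditions the Jacobian best.
best = min(range(n_hid), key=lambda i: np.linalg.cond(
    net_jacobian(W1, w2, X, base ^ (np.arange(n_hid) == i))))
print("best single-unit flip:", best)
```

A smaller condition number of this Jacobian indicates a better-conditioned least-squares problem at that point in weight space, which is the kind of effect the paper studies at initialization and during early training.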

Original language: English (US)
Title of host publication: 2008 International Joint Conference on Neural Networks, IJCNN 2008
Pages: 3233-3240
Number of pages: 8
DOIs
State: Published - 2008
Event: 2008 International Joint Conference on Neural Networks, IJCNN 2008 - Hong Kong, China
Duration: Jun 1 2008 - Jun 8 2008

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Conference

Conference: 2008 International Joint Conference on Neural Networks, IJCNN 2008
Country/Territory: China
City: Hong Kong
Period: 6/1/08 - 6/8/08

Keywords

  • Feedforward
  • Ill-conditioning
  • Numerical condition
  • Opposite transfer functions

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
