Improving the convergence of backpropagation by opposite transfer functions

Mario Ventresca, Hamid R. Tizhoosh

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The backpropagation algorithm is a very popular approach to learning in feed-forward multi-layer perceptron networks. However, in many scenarios the time required to adequately learn the task is considerable. Many existing approaches have improved the convergence rate by altering the learning algorithm. We present a simple alternative approach, inspired by opposition-based learning, that simultaneously considers each network transfer function and its opposite. The effect is an improvement in convergence rate over traditional backpropagation learning with momentum. We use four common benchmark problems to illustrate the improvement in convergence time.

Original language: English (US)
Title of host publication: International Joint Conference on Neural Networks 2006, IJCNN '06
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 4777-4784
Number of pages: 8
ISBN (Print): 0780394909, 9780780394902
State: Published - 2006
Event: International Joint Conference on Neural Networks 2006, IJCNN '06 - Vancouver, BC, Canada
Duration: Jul 16, 2006 - Jul 21, 2006

Publication series

Name: IEEE International Conference on Neural Networks - Conference Proceedings
ISSN (Print): 1098-7576

Conference

Conference: International Joint Conference on Neural Networks 2006, IJCNN '06
Country/Territory: Canada
City: Vancouver, BC
Period: 7/16/06 - 7/21/06

ASJC Scopus subject areas

  • Software
