Combining Evolving Neural Network Classifiers Using Bagging

Sunghwan Sohn, Cihan H. Dagli

Research output: Contribution to conference › Paper

4 Citations (Scopus)

Abstract

The performance of a neural network classifier depends significantly on its architecture and its generalization ability. The proper architecture is usually found by trial and error, which is time consuming and does not always yield the optimal network. For this reason, we apply genetic algorithms to the automatic generation of neural networks. Many researchers have shown that combining multiple classifiers improves generalization. One of the most effective combining methods is bagging. In bagging, training sets are selected by resampling from the original training set, and the classifiers trained on these sets are combined by voting. We incorporate the bagging technique into the training of evolving neural network classifiers to improve generalization.
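
The paper itself gives no code here; as a rough sketch of the bagging procedure the abstract describes (bootstrap resampling plus majority voting), the snippet below bags small scikit-learn MLP classifiers. The randomly varied hidden-layer size merely stands in for the GA-evolved architectures, and the dataset, estimator count, and all parameter values are illustrative assumptions, not the authors' setup.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

n_estimators = 15   # number of bagged members (illustrative choice)
n = len(X_train)
members = []
for _ in range(n_estimators):
    # Bootstrap resample: draw n examples with replacement from the
    # original training set, so each member sees a different sample.
    idx = rng.integers(0, n, size=n)
    # Stand-in for a GA-evolved network: a randomly sized hidden layer
    # plays the role of an evolved architecture (not the paper's method).
    hidden = int(rng.integers(4, 33))
    clf = MLPClassifier(hidden_layer_sizes=(hidden,),
                        max_iter=2000, random_state=0)
    clf.fit(X_train[idx], y_train[idx])
    members.append(clf)

# Combine the members by unweighted majority voting.
votes = np.stack([m.predict(X_test) for m in members])  # (n_estimators, n_test)
ensemble_pred = np.apply_along_axis(
    lambda col: np.bincount(col).argmax(), 0, votes)

print("single-net accuracy:", (members[0].predict(X_test) == y_test).mean())
print("bagged accuracy:    ", (ensemble_pred == y_test).mean())
```

Because each bootstrap sample omits roughly a third of the training examples on average, the members disagree enough for voting to reduce the variance of the combined classifier.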

Original language: English (US)
Pages: 3218-3222
Number of pages: 5
State: Published - Sep 25, 2003
Externally published: Yes
Event: International Joint Conference on Neural Networks 2003 - Portland, OR, United States
Duration: Jul 20, 2003 - Jul 24, 2003

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Sohn, S., & Dagli, C. H. (2003). Combining Evolving Neural Network Classifiers Using Bagging. Paper presented at the International Joint Conference on Neural Networks 2003, Portland, OR, United States (pp. 3218-3222).
