Publication Date:
2006
abstract:
An extension of Cellular Genetic Programming for data
classification (CGPC) to induce an ensemble of predictors is
presented. Two algorithms implementing the bagging and boosting
techniques are described and compared with CGPC. The approach is
able to deal with large data sets that do not fit in main memory,
since each classifier is trained on a subset of the overall
training data. The predictors are then combined to classify new
tuples. Experiments on several data sets show that, by using a
training set of reduced size, better classification accuracy can
be obtained at a much lower computational cost.
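The abstract's core idea, training each predictor on a subset of the data and combining them by voting, is the standard bagging scheme. The sketch below is not the authors' CGPC system; it is a minimal generic illustration in which a hypothetical one-threshold "stump" learner stands in for the evolved GP classifier, and each stump is trained on a bootstrap subsample:

```python
import random

def train_stump(sample):
    # Hypothetical stand-in for a GP classifier: pick the threshold t
    # (from the sample's own values) that best fits the rule "x >= t".
    best_t, best_acc = None, -1.0
    for t, _ in sample:
        acc = sum((x >= t) == y for x, y in sample) / len(sample)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t

def bagging_ensemble(data, n_predictors=5, sample_frac=0.5, seed=0):
    # Each predictor sees only a bootstrap subsample, so no single
    # learner ever needs the whole training set in memory at once.
    rng = random.Random(seed)
    m = max(1, int(len(data) * sample_frac))
    return [train_stump([rng.choice(data) for _ in range(m)])
            for _ in range(n_predictors)]

def predict(thresholds, x):
    # Combine the predictors by majority vote on the new tuple x.
    votes = sum(x >= t for t in thresholds)
    return votes * 2 >= len(thresholds)

# Toy data: label is True exactly when the value is >= 5.
data = [(v, v >= 5) for v in range(10)]
ensemble = bagging_ensemble(data)
```

Boosting differs only in that the subsamples are drawn with weights that are increased on tuples the previous predictors misclassified, and the final vote is weighted accordingly.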
Iris type:
01.01 Journal article
Keywords:
data mining; genetic programming; classification; bagging; boosting
List of contributors:
Pizzuti, Clara; Spezzano, Giandomenico; Folino, Gianluigi
Published in: