Authors: Wolfram-M. Lippe, Kristina Davoian
DOI:
Keywords:
Abstract: In this paper we study how the parallelization of a learning algorithm affects the generalization ability of Evolutionary Artificial Neural Networks (EANNs). The newly proposed evolutionary algorithm (EA), which improves chromosomes according to the characteristics of their genotype and phenotype, was used for evolving ANNs. The EA has been parallelized by two schemes: the migration approach, which periodically exchanges the best individuals between all parallel populations, and the recently developed migration-strangers strategy, which extends the search space during evolution by replacing the worst individuals in the populations with randomly generated new ones, called strangers. The experiments have been carried out on the Mackey-Glass chaotic time series prediction problem in order to determine the average prediction errors on training and testing data for small and large ANNs evolved by both parallel algorithms (PEAs). The results showed that both PEAs produce compact ANNs with high prediction precision, with insignificant differences in errors.
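The benchmark mentioned in the abstract, the Mackey-Glass chaotic time series, is generated from a delay differential equation. The sketch below shows one minimal way to produce such a series with simple Euler integration; the parameter values (beta=0.2, gamma=0.1, n=10, tau=17) are the commonly used settings for this benchmark, not values taken from the paper.

```python
from collections import deque

def mackey_glass(n_steps, tau=17, beta=0.2, gamma=0.1, n=10, dt=1.0, x0=1.2):
    """Generate a Mackey-Glass chaotic time series via Euler integration.

    dx/dt = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t)

    Parameters follow the conventional benchmark setup (tau = 17 gives
    chaotic behaviour); they are assumptions, not the paper's settings.
    """
    delay_len = int(tau / dt)
    # Fixed-length buffer holding the last tau/dt values; the oldest
    # entry approximates the delayed term x(t - tau).
    history = deque([x0] * delay_len, maxlen=delay_len)
    x = x0
    series = []
    for _ in range(n_steps):
        x_tau = history[0]
        dx = beta * x_tau / (1.0 + x_tau ** n) - gamma * x
        x += dt * dx
        history.append(x)
        series.append(x)
    return series
```

An evolved ANN would then be trained to predict a future value of this series from a window of past samples, which is how prediction error is typically measured on this benchmark.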