Authors: Ajith Abraham
DOI: 10.1016/S0925-2312(03)00369-2
Keywords:
Abstract: In this paper, we present the meta-learning evolutionary artificial neural network (MLEANN), an automatic computational framework for the adaptive optimization of artificial neural networks (ANNs), wherein the architecture, activation function, connection weights, learning algorithm and its parameters are adapted according to the problem. We explored the performance of MLEANN against conventionally designed ANNs on function approximation problems. To evaluate the comparative performance, we used three different well-known chaotic time series. We also present state-of-the-art popular learning algorithms and some experimental results related to convergence speed and generalization performance, covering the backpropagation algorithm, conjugate gradient algorithm, quasi-Newton algorithm and Levenberg–Marquardt algorithm. Performances were evaluated when the activation functions and the architecture were changed. We further present the theoretical background and design strategy, and demonstrate how effective and inevitable the proposed framework is for designing a network that is smaller, faster and has better generalization performance.
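The abstract only outlines the idea of evolving the network configuration (architecture, activation function, learning algorithm and its parameters). The sketch below is a minimal illustration of such an evolutionary search loop, not the paper's implementation: the search spaces, genetic operators, learning-rate range and the placeholder fitness function are all assumptions introduced here for clarity.

```python
# Minimal sketch of an evolutionary search over ANN configurations, in the spirit
# of the abstract: architecture, activation function and learning algorithm are
# encoded in a genome and evolved. NOT the paper's MLEANN implementation; all
# parameter ranges, operators and the toy fitness function are assumptions.
import random

HIDDEN_SIZES = [4, 8, 12, 16, 24]                  # assumed architecture search space
ACTIVATIONS = ["tanh", "sigmoid", "relu"]          # assumed candidate activations
LEARNERS = ["backprop", "conjugate-gradient",
            "quasi-Newton", "Levenberg-Marquardt"] # algorithms named in the abstract


def random_genome():
    """One candidate network configuration."""
    return {
        "hidden": random.choice(HIDDEN_SIZES),
        "activation": random.choice(ACTIVATIONS),
        "learner": random.choice(LEARNERS),
        "learning_rate": 10 ** random.uniform(-3, -1),
    }


def fitness(genome):
    """Placeholder: in practice, train the encoded network on the chosen
    time-series task and return a score such as negative validation RMSE."""
    return -abs(genome["learning_rate"] - 0.01) - genome["hidden"] * 1e-3


def evolve(pop_size=20, generations=30, elite=4, mutation_rate=0.2):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:elite]               # keep the best configurations
        children = []
        while len(children) < pop_size - elite:
            a, b = random.sample(parents, 2)
            child = {k: random.choice([a[k], b[k]]) for k in a}  # uniform crossover
            if random.random() < mutation_rate:                  # mutate one gene
                key = random.choice(list(child))
                child[key] = random_genome()[key]
            children.append(child)
        population = parents + children
    return max(population, key=fitness)


if __name__ == "__main__":
    best = evolve()
    print("best configuration found:", best)
```

In an actual setting the fitness evaluation would train each candidate network with its encoded learning algorithm on the benchmark time series and score it on held-out data, which is where the bulk of the computational cost lies.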