Scalable parameter encoding of artificial neural networks obtained via an evolutionary process

Authors: Felipe Petroski Such, Jeffrey Michael Clune, Kenneth Owen Stanley, Edoardo Conti, Vashisht Madhavan

DOI:

Keywords:

Abstract: A source system initializes, using an initialization seed, a first parameter vector representing weights of a neural network. The source system determines a second parameter vector by …
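The abstract describes reconstructing a network's weight vector deterministically from seeds rather than transmitting the weights themselves. A minimal sketch of that idea follows; the function names, vector size, and mutation scale `sigma` are illustrative assumptions, not the patent's actual specification:

```python
import numpy as np

def init_params(seed, size):
    # Deterministically reconstruct the first parameter vector from an
    # initialization seed (hypothetical sketch; size and scale are assumed).
    rng = np.random.default_rng(seed)
    return rng.standard_normal(size)

def mutate(params, mutation_seed, sigma=0.01):
    # Derive a further parameter vector by adding seeded Gaussian noise,
    # so only the seed needs to be communicated, not the full weights.
    rng = np.random.default_rng(mutation_seed)
    return params + sigma * rng.standard_normal(params.shape)

def decode(init_seed, mutation_seeds, size, sigma=0.01):
    # A network is encoded compactly as (init_seed, [mutation_seeds]);
    # replaying the seeds in order recovers the exact parameter vector.
    params = init_params(init_seed, size)
    for s in mutation_seeds:
        params = mutate(params, s, sigma)
    return params
```

Because the generators are seeded, any receiver replaying the same seed list recovers bit-identical parameters, which is what makes the encoding scalable for large populations of networks.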

References (51)
Juyang Weng, Yuekai Wang, Xiaofeng Wu, "Synapse maintenance in the developmental networks" (2013)
Diederik P. Kingma, Jimmy Ba, "Adam: A Method for Stochastic Optimization", arXiv (2014)
Yoshua Bengio, Xavier Glorot, "Understanding the difficulty of training deep feedforward neural networks", International Conference on Artificial Intelligence and Statistics, pp. 249-256 (2010)
Sue Ellen Haupt, Randy L. Haupt, "Practical Genetic Algorithms" (2004)
Arun Nair, Charles Beattie, Alessandro De Maria, Rory Fearon, Cagdas Alcicek, Vedavyas Panneershelvam, David Silver, Stig Petersen, Mustafa Suleyman, Sam Blackwell, Praveen Srinivasan, Volodymyr Mnih, Koray Kavukcuoglu, Shane Legg, "Massively Parallel Methods for Deep Reinforcement Learning", arXiv (2015)
Bernhard Sendhoff, Yaochu Jin, "Approximate fitness functions" (2001)
Chung-Sheng Li, Philip Shi-lung Yu, Vittorio Castelli, "Adaptive similarity searching in sequence databases" (1996)
Michael Lamport Commons, Mitzi Sturgeon White, "Intelligent control with hierarchical stacked neural networks" (2014)
David B. Fogel, Lauren C. Stayton, "On the effectiveness of crossover in simulated evolutionary optimization", BioSystems, vol. 32, pp. 171-182 (1994), doi:10.1016/0303-2647(94)90040-X