Authors: Meng Joo Er, Zhifei Shao, Ning Wang
DOI: 10.1007/978-3-642-39065-4_21
Keywords: Machine learning, Reduction, Feedforward neural network, Generalization, Ridge regression, Computer science, Regression, Computational power, Randomness, Artificial intelligence, Extreme learning machine
Abstract: In recent years, the Extreme Learning Machine (ELM) has attracted comprehensive attention as a universal function approximator. Compared with other single-hidden-layer feedforward neural networks, the input parameters of its hidden neurons can be randomly generated rather than tuned, thereby saving a huge amount of computational power. However, it has been pointed out that the randomness of ELM results in fluctuating performance. In this paper, we intensively investigate this reduction effect by using a regularized version of ELM, named Ridge ELM (RELM). Previously, RELM was shown to achieve generally better generalization performance than the original ELM. Furthermore, we try to demonstrate that RELM can also greatly reduce the performance fluctuation of ELM on 12 real-world regression tasks. An insight into this effect is given.
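To make the contrast in the abstract concrete, below is a minimal sketch of ELM and its ridge-regularized variant on a toy regression task. This is not the paper's code: the function names (elm_train, elm_predict), the sigmoid activation, the uniform weight initialization, the regularization strength, and the synthetic sine data are all illustrative assumptions. It only shows the core idea: hidden-layer parameters are random and untuned, and the output weights are solved either by pseudo-inverse (ELM) or by a ridge-regularized least-squares solve (RELM).

```python
import numpy as np

def elm_train(X, y, n_hidden, ridge=0.0, seed=None):
    """Train a single-hidden-layer feedforward network ELM-style.

    The hidden layer's input weights and biases are drawn at random and
    never tuned; only the linear output weights beta are solved for.
    ridge = 0 gives the original ELM (Moore-Penrose pseudo-inverse);
    ridge > 0 gives the ridge-regularized variant (RELM).
    """
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                   # sigmoid hidden outputs
    if ridge > 0.0:
        # RELM: beta = (H^T H + lambda * I)^{-1} H^T y
        beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)
    else:
        # ELM: beta = pinv(H) @ y
        beta = np.linalg.pinv(H) @ y
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy illustration (assumed data, not from the paper): measure how test
# error fluctuates across random draws of the hidden-layer parameters.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(300, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.normal(size=300)
Xtr, ytr, Xte, yte = X[:200], y[:200], X[200:], y[200:]

for lam, name in [(0.0, "ELM "), (1e-2, "RELM")]:
    mses = []
    for seed in range(30):
        W, b, beta = elm_train(Xtr, ytr, n_hidden=100, ridge=lam, seed=seed)
        mses.append(np.mean((elm_predict(Xte, W, b, beta) - yte) ** 2))
    print(f"{name} test MSE: mean={np.mean(mses):.4f}, std={np.std(mses):.4f}")
```

Under this setup, the standard deviation of the test error across random seeds is the quantity of interest: the abstract's claim is that the ridge term shrinks that fluctuation, in addition to its usual generalization benefit.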