Authors: Jian Wang, Yi-Fei Pu
DOI:
Keywords: Artificial neural network, Computer science, Algorithm, Partial derivative, Singularity, Fractional calculus, Backpropagation, Global optimization, Method of steepest descent, Function approximation
Abstract: This paper offers a novel mathematical approach, the modified Fractional-order Steepest Descent Method (FSDM), for training BackPropagation Neural Networks (BPNNs), which differs from the majority of previous approaches. Fractional calculus is a promising mathematical method with the potential to assume a prominent role in the applications of neural networks and cybernetics because of its inherent strengths, such as long-term memory, nonlocality, and weak singularity. Therefore, to improve the optimization performance of classic first-order BPNNs, we study whether the FSDM makes it possible to generalize BPNNs to Fractional-order Backpropagation Neural Networks (FBPNNs). Motivated by this inspiration, this paper proposes a state-of-the-art application of fractional calculus to implement an FBPNN whose reverse incremental search proceeds in the negative directions of the approximate fractional-order partial derivatives of the square error. At first, the theoretical concept of an FBPNN is described mathematically. Then, the proof of its global optimal convergence, an assumption on its structure, and its multi-scale properties are analysed in detail. Finally, we perform comparative experiments against a classic first-order BPNN, i.e., an example of function approximation, a global optimization problem, and two comparative performances on real data. The more efficient searching capability of the FBPNN in determining the global optimal solution is the major advantage that makes it superior to a classic first-order BPNN.
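As a rough illustration of the idea summarized in the abstract, and not the exact formulation used in the paper, the following minimal Python sketch performs the reverse incremental search in the negative direction of an approximate fractional-order derivative of a square error. The Caputo-style approximation D^alpha E(w) ≈ (dE/dw)·|w − w0|^(1−alpha)/Γ(2 − alpha), the toy error function, and all names and parameter values are assumptions introduced here for illustration only.

    import math

    def fractional_gradient(grad, w, w0, alpha):
        # Assumed Caputo-style approximation of the fractional-order partial
        # derivative of the error with respect to a weight:
        #   D^alpha E(w) ~= (dE/dw) * |w - w0|**(1 - alpha) / Gamma(2 - alpha)
        return grad * abs(w - w0) ** (1.0 - alpha) / math.gamma(2.0 - alpha)

    def fsdm_step(w, w0, grad_fn, alpha, lr):
        # One fractional-order steepest descent update: move in the negative
        # direction of the approximate fractional-order derivative.
        return w - lr * fractional_gradient(grad_fn(w), w, w0, alpha)

    # Toy usage: minimise the square error E(w) = (w - 3)^2, starting away
    # from the lower terminal w0 so the power term stays well defined.
    w, w0 = 0.5, 0.0
    grad_fn = lambda w: 2.0 * (w - 3.0)  # ordinary first-order derivative dE/dw
    for _ in range(50):
        w = fsdm_step(w, w0, grad_fn, alpha=0.9, lr=0.1)
    print(f"approximate minimiser: {w:.4f}")  # approaches 3

With alpha = 1 the power term reduces to a constant and the update collapses to classic first-order steepest descent, which is the sense in which the FBPNN described above generalizes the ordinary BPNN.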