Authors: Mohammad-Taghi Vakil-Baghmisheh, Nikola Pavešić
Keywords: Backpropagation, Benchmark (computing), Multilayer perceptron, Computer science, Artificial neural network, Adaptive resonance theory, Artificial intelligence, Computational intelligence, Optical character recognition, Fuzzy logic
Abstract: We present an algorithmic variant of the simplified fuzzy ARTMAP (SFAM) network, whose structure resembles that of feed-forward networks. Its differences from Kasuba's model are discussed, and their performances are compared on two benchmarks. We show that our algorithm is much faster than Kasuba's algorithm, and that as the number of training samples increases, the difference in speed grows enormously. The SFAM and the MLP (multilayer perceptron) are compared on three problems: the two benchmarks and a Farsi optical character recognition (OCR) problem. For the MLP, two different variants of backpropagation are used: BPLRF (backpropagation with plummeting learning rate factor) for the benchmarks and BST (backpropagation with selective training) for the OCR problem. The results obtained on all case studies show the advantages of the SFAM over the MLP: ease of embedding in customized systems, fast convergence of the SFAM in fast-training mode, and the SFAM's capability for online operation. On the benchmark problems the MLP achieves a lower recognition error; on the OCR problem, the SFAM's error is higher on ill-engineered datasets but equal on well-engineered ones. The flexible configuration of the SFAM, i.e. its capability to increase the size of the network in order to learn new patterns, as well as its simple parameter adjustment, remain unchallenged by the MLP.
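
The abstract refers to the SFAM training procedure (complement coding, category choice, vigilance test, match tracking, fast learning). The sketch below is a minimal, generic SFAM classifier in Python for orientation only; it is not the authors' algorithmic variant, and the class name, method names, and default parameter values (alpha, rho, epsilon) are illustrative assumptions.

```python
import numpy as np

class SimplifiedFuzzyARTMAP:
    """Minimal sketch of a generic simplified fuzzy ARTMAP (SFAM) classifier.

    Not the authors' exact variant; it only illustrates the standard SFAM
    operations: complement coding, category choice, vigilance test,
    match tracking, and fast learning (learning rate beta = 1).
    """

    def __init__(self, alpha=0.001, rho=0.75, epsilon=1e-4):
        self.alpha = alpha      # choice parameter
        self.rho_base = rho     # baseline vigilance
        self.epsilon = epsilon  # match-tracking increment
        self.weights = []       # one weight vector per category
        self.labels = []        # class label associated with each category

    @staticmethod
    def _complement_code(x):
        # Features are assumed to be scaled to [0, 1].
        return np.concatenate([x, 1.0 - x])

    def train_one(self, x, label):
        I = self._complement_code(np.asarray(x, dtype=float))
        rho = self.rho_base
        if not self.weights:                     # first pattern: create first category
            self.weights.append(I.copy())
            self.labels.append(label)
            return
        W = np.array(self.weights)
        match = np.minimum(I, W).sum(axis=1)     # |I AND w_j| (fuzzy AND = min)
        choice = match / (self.alpha + W.sum(axis=1))
        for j in np.argsort(-choice):            # categories in decreasing choice order
            if match[j] / I.sum() < rho:         # vigilance test fails: skip category
                continue
            if self.labels[j] == label:          # correct class: fast learning
                self.weights[j] = np.minimum(I, self.weights[j])
                return
            rho = match[j] / I.sum() + self.epsilon   # wrong class: match tracking
        self.weights.append(I.copy())            # no category accepted: grow the network
        self.labels.append(label)

    def predict(self, x):
        I = self._complement_code(np.asarray(x, dtype=float))
        W = np.array(self.weights)
        choice = np.minimum(I, W).sum(axis=1) / (self.alpha + W.sum(axis=1))
        return self.labels[int(np.argmax(choice))]
```

Training is one sample at a time and new categories are added only when no existing one passes the vigilance test, which is what makes the online operation and the "grow to learn new patterns" flexibility claimed in the abstract possible.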