Author(s): Aleksander Kołcz, Nigel M. Allinson
DOI: 10.1016/S0925-2312(99)00110-1
Keywords:
Abstract: A generalization of a class of neural network architectures based on multiple quantization of the input space combined with memory lookup operations is presented under the name of the general memory neural network (GMNN). Within this common framework it is shown that networks of this type are - for a variety of learning schemes - response-equivalent to basis function networks (i.e., radial basis function and kernel regression networks). In particular, the equivalence holds even if the GMNN does not employ explicit basis functions, which makes the architecture attractive from an implementational point of view and allows fast operation in both learning and response modes. Variants of the architecture are discussed and examples of existing networks conforming to it are given.
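The abstract describes a network that quantizes the input space with several overlapping grids and forms its response by memory lookup rather than by evaluating explicit basis functions. A minimal illustrative sketch of that idea (CMAC-style tilings with an LMS update) is given below; the class name, parameters, and update rule are assumptions for illustration only, not the paper's exact formulation.

```python
import numpy as np

# Hypothetical sketch of a memory-lookup (GMNN/CMAC-like) network:
# several shifted quantizations ("tilings") of a 1-D input range each
# address one memory cell; the response is the sum of the addressed
# cells' stored weights, and learning is a simple LMS correction.
class MemoryLookupNetwork:
    def __init__(self, n_tilings=8, bins=20, lo=0.0, hi=1.0, lr=0.3):
        self.n_tilings = n_tilings
        self.bins = bins
        self.lo, self.hi = lo, hi
        self.lr = lr
        self.tables = np.zeros((n_tilings, bins))          # one weight table per tiling
        self.offsets = np.linspace(0.0, 1.0, n_tilings,    # fractional bin offsets
                                   endpoint=False)

    def _cells(self, x):
        """Memory-cell index addressed by x in each tiling."""
        width = (self.hi - self.lo) / self.bins
        idx = np.floor((x - self.lo) / width + self.offsets).astype(int)
        return np.clip(idx, 0, self.bins - 1)

    def predict(self, x):
        # Response = sum of weights stored in the addressed cells.
        cells = self._cells(x)
        return self.tables[np.arange(self.n_tilings), cells].sum()

    def update(self, x, target):
        # LMS-style correction spread evenly over the addressed cells.
        err = target - self.predict(x)
        cells = self._cells(x)
        self.tables[np.arange(self.n_tilings), cells] += self.lr * err / self.n_tilings


# Usage sketch: approximate sin(2*pi*x) on [0, 1] from random samples.
net = MemoryLookupNetwork()
for x in np.random.rand(2000):
    net.update(x, np.sin(2 * np.pi * x))
print(net.predict(0.25))  # expected to be close to 1.0
```

Because responses and updates touch only the few cells addressed by each input, both learning and recall are fast, which is the implementational advantage the abstract points to.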