Authors: Shikui Tu, Lei Xu
DOI: 10.1007/S11460-011-0146-Y
Keywords: Artificial intelligence; Model selection; Principal component analysis; Minimax; Statistical hypothesis testing; Bayesian information criterion; Bayesian probability; Akaike information criterion; Mathematics; Pattern recognition; Sample size determination
Abstract: Based on the problem of detecting the number of signals, this paper provides a systematic empirical investigation of the model selection performances of several classical criteria and recently developed methods (including Akaike's information criterion (AIC), Schwarz's Bayesian information criterion (BIC), Bozdogan's consistent AIC, the Hannan-Quinn criterion, Minka's (MK) criterion for principal component analysis (PCA), Kritchman & Nadler's hypothesis tests (KN), Perry & Wolfe's minimax rank estimation thresholding algorithm (MM), and Bayesian Ying-Yang (BYY) harmony learning), by varying the signal-to-noise ratio (SNR) and the training sample size N. A family of indifference curves is defined by the contour lines of the selection accuracies, so that the joint effect of N and SNR can be examined, rather than merely the effect of one with the other fixed, as is usually done in the literature. The indifference curves visually reveal that all the methods demonstrate their relative advantages most clearly within a region of moderate SNR. Moreover, the importance of studying this joint effect is also confirmed by an alternative reference criterion of maximizing the testing likelihood. Extensive simulations show that AIC and BYY harmony learning, as well as MK, KN, and MM, are relatively more robust than the others against decreasing SNR, and that BYY is superior for a small sample size.
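The abstract does not spell out the criteria's formulas. As a rough illustration of the underlying task, the sketch below implements the classical eigenvalue-based AIC and BIC (MDL) rules in the Wax-Kailath formulation for estimating the number of signals from a sample covariance matrix; it is a minimal sketch under assumed settings, not the paper's experimental protocol, and the function name, toy data generation, and SNR parameterization are illustrative assumptions. The other methods compared in the paper (CAIC, Hannan-Quinn, MK, KN, MM, BYY) are not reproduced here.

```python
import numpy as np

def estimate_num_signals(X, criterion="AIC"):
    """Estimate the number of signals in p-dimensional data X (shape p x N)
    from the eigenvalues of the sample covariance matrix, using the
    classical AIC or BIC (MDL) rules.  Illustrative sketch only."""
    p, N = X.shape
    # Eigenvalues of the sample covariance matrix, sorted in descending order.
    eigvals = np.linalg.eigvalsh(X @ X.T / N)[::-1]
    scores = []
    for k in range(p):           # candidate number of signals
        tail = eigvals[k:]       # presumed noise eigenvalues under hypothesis k
        # Log-ratio of geometric to arithmetic mean of the noise eigenvalues.
        log_ratio = np.mean(np.log(tail)) - np.log(np.mean(tail))
        loglik = N * (p - k) * log_ratio
        dof = k * (2 * p - k)    # number of free parameters
        if criterion == "AIC":
            scores.append(-2 * loglik + 2 * dof)
        else:                    # BIC / MDL penalty
            scores.append(-2 * loglik + dof * np.log(N))
    return int(np.argmin(scores))

# Toy example: p = 10 observed dimensions, 3 signals, with SNR and N as the
# two knobs that the paper's indifference curves vary jointly.
rng = np.random.default_rng(0)
p, k_true, N, snr = 10, 3, 200, 5.0
A = rng.standard_normal((p, k_true))
S = np.sqrt(snr) * rng.standard_normal((k_true, N))
X = A @ S + rng.standard_normal((p, N))   # signals plus unit-variance noise
print(estimate_num_signals(X, "AIC"), estimate_num_signals(X, "BIC"))
```

Lowering `snr` or `N` in this toy setup illustrates the regime the paper studies: each criterion's accuracy degrades at a different rate, which is what the contour-line (indifference-curve) analysis makes visible.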