Authors: Krzysztof Grąbczewski, Norbert Jankowski
DOI:
Keywords:
Abstract: Learning from data may be a very complex task. To satisfactorily solve a variety of problems, many different types of algorithms need to be combined. Feature extraction methods are valuable tools, which prepare data for other learning methods. To estimate their usefulness, one must examine the whole processes they are parts of. The goal of this chapter is to present a short survey of such approaches, with special emphasis on solving classification (and approximation) problems, where feature extraction plays a particularly important role. We address this review to readers who know the basics of the field and would like to get quickly acquainted with techniques they are less familiar with. For novices we recommend the textbooks by Duda et al. (2001), Mitchell (1997), Bishop (1995), Haykin (1994), Cherkassky and Mulier (1998), Schalkoff (1992), de Sa, Hastie, Ripley (1996), Schölkopf, Smola, Friedman (2001). Our tutorial starts with a mathematical statement of the learning problem. Then, it presents two general induction principles, risk minimization and Bayesian learning, that are widely applied in the next chapters. Classification is then discussed in more detail, including Naive Bayes, Linear Discriminant Analysis, kernel methods, Neural Networks, similarity-based methods and Decision Trees.