Comprehensibility & Overfitting Avoidance in Genetic Programming for Technical Trading Rules

Authors: Mukund Seshadri, Lee A. Becker

DOI:

Keywords:

Abstract: This paper presents two methods for increasing the comprehensibility of technical trading rules produced by Genetic Programming. In this application domain, adding a complexity-penalizing factor to the objective fitness function also avoids overfitting the training data. Using pre-computed derived indicators, although it biases the search, increases expressive power while retaining comprehensibility. Several of the learned rules outperform a buy-and-hold strategy on the S&P 500 over a testing period from 1990 to 2002, even after taking transaction costs into account.
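The complexity-penalizing factor described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual fitness function; the names (`tree_size`, `excess_return`, the penalty weight) and the tuple encoding of rules are illustrative assumptions.

```python
# Hedged sketch: complexity-penalized fitness for GP trading rules.
# All names and the penalty weight are illustrative, not from the paper.

def tree_size(rule):
    """Count nodes in a rule encoded as a nested tuple, e.g. ('and', x, y)."""
    if not isinstance(rule, tuple):
        return 1  # a leaf (indicator or constant) counts as one node
    return 1 + sum(tree_size(child) for child in rule[1:])

def penalized_fitness(excess_return, rule, penalty=0.01):
    """Objective return minus a penalty proportional to rule complexity.

    Penalizing size both favors comprehensible (small) rules and discourages
    overfitting, since large rules can memorize the training period.
    """
    return excess_return - penalty * tree_size(rule)

# A small rule using pre-computed derived indicators ('ma50', 'vol_ma10'):
# buy when price exceeds its 50-day moving average AND volume is above
# its 10-day moving average.
rule = ('and', ('gt', 'price', 'ma50'), ('gt', 'vol', 'vol_ma10'))
print(tree_size(rule))               # 7 nodes
print(penalized_fitness(0.12, rule)) # 0.12 - 0.01 * 7, approximately 0.05
```

Two rules with equal raw return then rank differently under the penalized fitness: the smaller, more comprehensible rule wins.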
