Author: Jayadeva
DOI: 10.1016/J.NEUCOM.2014.07.062
Keywords:
Abstract: The VC dimension measures the complexity of a learning machine, and a low VC dimension leads to good generalization. While SVMs produce state-of-the-art learning performance, it is well known that the VC dimension of an SVM can be unbounded; despite good results in practice, there is no guarantee of good generalization. In this paper, we show how to learn a hyperplane classifier by minimizing an exact, or Θ, bound on its VC dimension. The proposed approach, termed the Minimal Complexity Machine (MCM), involves solving a simple linear programming problem. Experimental results show that, on a number of benchmark datasets, the approach learns classifiers with error rates much lower than conventional SVMs, while often using far fewer support vectors. On many datasets, the number of support vectors used is one-tenth or less of that used by SVMs, indicating that the MCM does indeed learn simpler representations.
Highlights:
- We minimize an exact (Θ) bound on the VC dimension.
- A fractional programming problem is formulated and reduced to an LP problem.
- Linear and kernel versions are explored.
- The approach, called the Minimal Complexity Machine, generalizes better than SVMs.
- On numerous datasets, it uses far fewer support vectors than SVMs.
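As a sketch of the formulation the abstract and highlights describe (the notation w, b, h below is our reading, not quoted from the paper): the MCM minimizes a margin ratio h whose square bounds the VC dimension; fixing the scale of the classifier so the smallest functional margin is 1 turns this fractional program into a linear program.

```latex
h \;=\; \frac{\max_{i}\,\lvert w^{\top}x_i + b\rvert}{\min_{i}\; y_i\,(w^{\top}x_i + b)}
\qquad\Longrightarrow\qquad
\min_{w,\,b,\,h}\; h
\quad \text{s.t.} \quad
h \,\ge\, y_i\,(w^{\top}x_i + b) \,\ge\, 1, \quad i = 1,\dots,m
```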
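A minimal runnable sketch of the hard-margin linear case under that assumed LP, using scipy.optimize.linprog; fit_linear_mcm and all variable names are ours, and the soft-margin and kernel variants mentioned in the highlights are not covered:

```python
import numpy as np
from scipy.optimize import linprog

def fit_linear_mcm(X, y):
    """Hard-margin linear MCM sketch: min h s.t. 1 <= y_i (w.x_i + b) <= h.

    Decision variables are z = [w_1..w_d, b, h]; h is the margin ratio
    whose square bounds the VC dimension (our reading of the abstract).
    Assumes y in {-1, +1} and linearly separable data (otherwise the LP
    is infeasible).
    """
    m, d = X.shape
    c = np.zeros(d + 2)
    c[-1] = 1.0                      # objective: minimize h
    yX = y[:, None] * X              # row i holds y_i * x_i
    # y_i (w.x_i + b) >= 1   <=>   -y_i x_i.w - y_i b <= -1
    A_lower = np.hstack([-yX, -y[:, None], np.zeros((m, 1))])
    # y_i (w.x_i + b) <= h   <=>    y_i x_i.w + y_i b - h <= 0
    A_upper = np.hstack([yX, y[:, None], -np.ones((m, 1))])
    res = linprog(c,
                  A_ub=np.vstack([A_lower, A_upper]),
                  b_ub=np.concatenate([-np.ones(m), np.zeros(m)]),
                  bounds=[(None, None)] * (d + 1) + [(1.0, None)],
                  method="highs")
    if not res.success:
        raise ValueError("LP infeasible; data may not be linearly separable")
    return res.x[:d], res.x[d], res.x[d + 1]  # w, b, h

if __name__ == "__main__":
    # Toy usage on synthetic separable data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 2))
    y = np.where(X @ np.array([1.0, 1.0]) > 0, 1.0, -1.0)
    w, b, h = fit_linear_mcm(X, y)
    print(f"h = {h:.3f}, train accuracy = {np.mean(np.sign(X @ w + b) == y):.2f}")
```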