Author: Vladimir Pavlovic, Minyoung Kim
DOI: 10.7282/T3PV6KQN
Keywords:
Abstract: Many prediction problems that arise in computer vision and robotics can be formulated within a regression framework. Unlike traditional regression problems, these tasks are often characterized by a varying number of output variables with complex dependency structures. The problem is further aggravated by the high dimensionality of the input. In this thesis, I address two challenges related to learning regressors in such settings: (1) developing discriminative approaches that handle structured output variables, and (2) reducing the input dimensionality while preserving the statistical correlation with the output.

The output structure can be effectively captured by probabilistic graphical models. In contrast to the joint data modeling of generative models, I propose conditional models that directly address the ultimate prediction objective. While conditional modeling such as Conditional Random Fields (CRFs) has attracted significant interest in the past, the regression setting has rarely been explored. This work first extends CRFs and discriminatively trained HMM methods to the structured regression problem, yielding different models based on directed and undirected graphs. In the second approach, parameter learning is cast as a convex optimization problem, accompanied by a new and effective technique that handles the density integrability constraint. Experiments in several problem domains, including human motion and robot-arm state estimation, indicate that the proposed models yield accuracy comparable to or better than state-of-the-art approaches.

In the second part of the thesis, I consider the task of finding a low-dimensional representation of the input covariates while regressing the output. This task, known as dimensionality reduction for regression (DRR), is particularly useful for visualizing high-dimensional data, efficiently designing regressors with reduced input dimension, and eliminating noise by uncovering the information essential for predicting the output. Although dimensionality reduction methods are common in many machine learning tasks, their use in regression settings has not been widespread. A number of recent DRR methods have been proposed in the statistics community, but they suffer from limitations such as non-convexity and the need to slice a potentially high-dimensional output space. I address these issues by proposing novel methods based on covariance operators in reproducing kernel Hilbert spaces (RKHSes) that provide a closed-form solution without explicit slicing. The benefits of the proposed methods are demonstrated in a comprehensive set of evaluations on several important problems in pattern recognition.
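To make the first part of the abstract concrete, the following is a minimal, hypothetical sketch of a conditional Gaussian regressor in natural parameterization, p(y | x) ∝ exp(y^T W x − 0.5·λ·y^T y). It is not the thesis's actual model; all names (`W`, `lam`, `fit`) are illustrative. It only illustrates the two ideas the abstract mentions: the negative conditional log-likelihood is convex in the natural parameters, and the density is integrable only when the precision λ is positive, so λ must be kept in the feasible set during learning.

```python
import numpy as np

# Toy conditional Gaussian model: p(y | x) ∝ exp(y^T (W x) - 0.5 * lam * y^T y).
# Integrability of the density requires lam > 0; this is a simplified stand-in
# for the integrability constraint discussed in the abstract.

def fit(X, Y, iters=5):
    """Alternating closed-form updates (coordinate descent on the convex NLL).

    For fixed lam, the stationarity condition in W gives a least-squares fit;
    for fixed W, the optimal precision is dy / (mean residual energy).
    The max(..., 1e-8) guard keeps lam positive, i.e. the density integrable.
    """
    n, dy = Y.shape
    lam = 1.0
    W = np.zeros((dy, X.shape[1]))
    for _ in range(iters):
        # Optimal W for fixed lam: W = lam * Y^T X (X^T X)^{-1}.
        W = lam * Y.T @ X @ np.linalg.inv(X.T @ X)
        mu = (X @ W.T) / lam            # predictive mean (independent of lam here)
        # Optimal lam for fixed W: dy over the mean residual energy.
        lam = dy / max(np.mean(np.sum(Y**2 - mu**2, axis=1)), 1e-8)
    return W, lam

# Usage on synthetic data with a known linear map plus small noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
W_true = np.array([[1.0, -2.0, 0.5]])
Y = X @ W_true.T + 0.1 * rng.normal(size=(200, 1))
W, lam = fit(X, Y)
Y_hat = (X @ W.T) / lam                 # predictive mean of p(y | x)
mse = float(np.mean((Y_hat - Y) ** 2))
print(mse, lam)
```

The recovered precision `lam` should be roughly the inverse of the noise variance, and the predictive mean reduces to the least-squares fit in this toy case.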
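For the DRR part of the abstract, the following is a toy sketch (not the thesis's actual estimator) of a kernel-based dimensionality reduction for regression: centered Gram matrices of inputs and outputs stand in for covariance operators in RKHSes, and the reduced directions come from a single regularized generalized eigenproblem, closed-form and with no slicing of the output space. All names (`kernel_drr`, `gamma_x`, `eps`) and the specific eigenproblem are illustrative assumptions.

```python
import numpy as np

def rbf_gram(Z, gamma=1.0):
    """RBF Gram matrix K[i, j] = exp(-gamma * ||z_i - z_j||^2)."""
    sq = np.sum(Z**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def center(K):
    """Center a Gram matrix in feature space: H K H with H = I - 11^T / n."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_drr(X, Y, d=2, gamma_x=1.0, gamma_y=1.0, eps=1e-3):
    """Toy closed-form kernel DRR: find RKHS directions of X most
    correlated with Y via a regularized generalized eigenproblem on
    centered Gram matrices -- no slicing of the output space needed."""
    Kx = center(rbf_gram(X, gamma_x))
    Ky = center(rbf_gram(Y, gamma_y))
    n = X.shape[0]
    A = Kx @ Ky @ Kx                               # cross-covariance-style term
    R = Kx + n * eps * np.eye(n)                   # Tikhonov-regularized input term
    B = R @ R
    # Solve A v = lambda B v; keep the top-d (real parts of the) eigenvectors.
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(B, A))
    order = np.argsort(-eigvals.real)[:d]
    V = eigvecs[:, order].real                     # dual coefficients of directions
    return Kx @ V                                  # reduced training-input embedding

# Usage: Y depends on X only through its first coordinate.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))
Y = np.sin(X[:, :1]) + 0.05 * rng.normal(size=(80, 1))
Z = kernel_drr(X, Y, d=1)
print(Z.shape)  # (80, 1)
```

The design point mirrored here is the one the abstract emphasizes: because both variables enter through kernel (covariance-operator) quantities, the whole reduction is a single eigendecomposition rather than a non-convex search, and the output is never partitioned into slices.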