Author: Sébastien Loustau
DOI:
Keywords:
Abstract: The effect of errors in variables on empirical minimization is investigated. Given a loss $l$ and a set of decision rules $\mathcal{G}$, we prove a general upper bound for the excess risk of an empirical minimizer based on a deconvolution kernel and a noisy sample $Z_i=X_i+\epsilon_i$, $i=1,\ldots,n$. We apply this result to give the rate of convergence of the expected excess risk in clustering. A recent result from \citet{levrard} proves a rate of $\mathcal{O}(1/n)$ in the direct case, under Pollard's regularity assumptions. Here the effect of noisy measurements gives rise to a rate of the form $\mathcal{O}(1/n^{\frac{\gamma}{\gamma+2\beta}})$, where $\gamma$ is the Hölder regularity of the density of $X$, whereas $\beta$ is the degree of ill-posedness.
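To make the errors-in-variables setup concrete, the sketch below estimates the density of $X$ from noisy observations $Z_i = X_i + \epsilon_i$ using a Fourier-type deconvolution kernel estimator, the basic ingredient of the approach described in the abstract. The Gaussian signal, Laplace noise (whose polynomially decaying characteristic function corresponds to a finite degree of ill-posedness $\beta$), bandwidth, and sample size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulation: X ~ N(0, 1), Laplace noise eps with known scale b.
n, b = 2000, 0.5
X = rng.normal(0.0, 1.0, n)
eps = rng.laplace(0.0, b, n)
Z = X + eps  # only the noisy sample Z is observed

# Deconvolution kernel estimator with a sinc kernel (flat spectral window):
#   f_hat(x) = (1/2pi) * int_{|t| <= 1/h} e^{-itx} phi_Z_hat(t) / phi_eps(t) dt,
# where phi_eps(t) = 1 / (1 + b^2 t^2) for centered Laplace noise.
h = 0.3                                   # bandwidth; spectral cutoff at 1/h
t = np.linspace(-1.0 / h, 1.0 / h, 1001)
dt = t[1] - t[0]
phi_Z_hat = np.exp(1j * np.outer(t, Z)).mean(axis=1)  # empirical char. function
phi_eps = 1.0 / (1.0 + (b * t) ** 2)                  # noise char. function

xs = np.linspace(-4.0, 4.0, 161)
integrand = np.exp(-1j * np.outer(xs, t)) * (phi_Z_hat / phi_eps)
f_hat = np.real(integrand.sum(axis=1)) * dt / (2.0 * np.pi)  # density estimate
```

Dividing the empirical characteristic function by $\phi_\epsilon$ is what makes the problem inverse: the faster $\phi_\epsilon$ decays (larger $\beta$), the more high-frequency noise is amplified, which is the mechanism behind the slower rate $\mathcal{O}(1/n^{\frac{\gamma}{\gamma+2\beta}})$.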