Authors: John C. Duchi, Michael I. Jordan, Martin J. Wainwright
DOI: 10.1145/2666468
Keywords:
Abstract: We study statistical risk minimization problems under a privacy model in which the data is kept confidential even from the learner. In this local privacy framework, we establish sharp upper and lower bounds on the convergence rates of statistical estimation procedures. As a consequence, we exhibit a precise tradeoff between the amount of privacy the data preserves and the utility, as measured by convergence rate, of any statistical estimator or learning procedure.
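The local privacy model described in the abstract can be illustrated with a minimal sketch (not the paper's mechanism): each data holder perturbs their own value with Laplace noise before the learner ever sees it, and the estimation error grows as the privacy parameter epsilon shrinks. The function names `privatize` and `private_mean` and the choice of Laplace noise are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def privatize(x, epsilon):
    """Locally privatize a scalar in [0, 1] with Laplace noise.

    The data holder perturbs the value before releasing it, so the
    learner never observes the raw datum (local privacy).
    """
    # Sensitivity of a single value in [0, 1] is 1, so scale = 1 / epsilon.
    return x + np.random.laplace(scale=1.0 / epsilon)

def private_mean(data, epsilon):
    """Estimate the mean from locally privatized reports only."""
    reports = [privatize(x, epsilon) for x in data]
    return float(np.mean(reports))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.uniform(0, 1, size=10_000)
    true_mean = data.mean()
    for eps in (0.1, 1.0, 10.0):
        est = private_mean(data, eps)
        # Smaller epsilon (stronger privacy) means noisier reports and a
        # larger estimation error: the privacy/utility tradeoff in rates.
        print(f"epsilon={eps:4.1f}  error={abs(est - true_mean):.4f}")
```

Under these assumptions, the error of the privatized mean scales roughly as 1/(epsilon * sqrt(n)), which is one concrete instance of the rate-based utility loss the abstract refers to.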