Authors: Jialin Zhang, Xiaoming Sun, Wei Chen, Zhijie Zhang
Abstract: We revisit the optimization from samples (OPS) model, which studies the problem of optimizing objective functions directly from sample data. Previous results showed that we cannot obtain a constant approximation ratio for the maximum coverage problem using polynomially many independent samples of the form $\{S_i, f(S_i)\}_{i=1}^t$ (Balkanski et al., 2017), even if coverage functions are $(1 - \epsilon)$-PMAC learnable from these samples (Badanidiyuru et al., 2012), which means that most function values can be approximately learned very well with high probability. In this work, to circumvent the impossibility result of OPS, we propose a stronger model called optimization from structured samples (OPSS) for coverage functions, where the data samples encode structural information about the functions. We show that under three general assumptions on the sample distributions, one can design efficient OPSS algorithms that achieve a constant approximation for the maximum coverage problem. We further prove a constant lower bound under these assumptions, which is tight when computational efficiency is not required. Moreover, we also show that removing any one of the three assumptions leaves OPSS with no constant approximation.
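The following is a minimal Python sketch, not the paper's construction, that contrasts the two sample formats discussed in the abstract: OPS samples that reveal only the pair $(S_i, f(S_i))$, versus structured samples that additionally expose which ground-set elements are covered (one plausible reading of "structural information"). The instance, the sampling distribution, and helper names such as `draw_ops_samples` are hypothetical; the greedy routine is the standard full-information baseline for maximum coverage, not the OPSS algorithm of the paper.

```python
import random

random.seed(0)

# Toy coverage instance: each indexed set covers a few elements of a small ground set.
# The instance and sampling scheme below are illustrative assumptions, not from the paper.
GROUND_SET = list(range(20))
SETS = {i: frozenset(random.sample(GROUND_SET, k=5)) for i in range(10)}

def coverage(chosen):
    """f(S) = number of ground-set elements covered by the chosen sets."""
    covered = set()
    for i in chosen:
        covered |= SETS[i]
    return len(covered)

def draw_ops_samples(t, p=0.3):
    """OPS-style samples {(S_i, f(S_i))}: only the numeric function value is revealed."""
    samples = []
    for _ in range(t):
        S = {i for i in SETS if random.random() < p}
        samples.append((S, coverage(S)))
    return samples

def draw_opss_samples(t, p=0.3):
    """OPSS-style samples (assumed form): each sample also reveals the covered
    element set, i.e. structural information beyond the value f(S_i)."""
    samples = []
    for _ in range(t):
        S = {i for i in SETS if random.random() < p}
        covered = set().union(*(SETS[i] for i in S)) if S else set()
        samples.append((S, covered))
    return samples

def greedy_max_coverage(k):
    """Standard greedy baseline for maximum coverage, with full oracle access to f."""
    chosen, covered = set(), set()
    for _ in range(k):
        best = max(SETS, key=lambda i: len(SETS[i] - covered))
        chosen.add(best)
        covered |= SETS[best]
    return chosen, len(covered)

if __name__ == "__main__":
    print(draw_ops_samples(3)[0])   # (S_1, f(S_1))
    print(draw_opss_samples(3)[0])  # (S_1, elements covered by S_1)
    print(greedy_max_coverage(3))   # greedy solution under full information
```

The contrast the sketch makes concrete is the one the abstract relies on: from value-only samples alone, constant-factor maximization is impossible in general, whereas samples carrying coverage structure give an algorithm more to work with.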