Authors: M. Thamban Nair, Markus Hegland, Robert S. Anderssen
DOI: 10.1090/S0025-5718-97-00811-9
Keywords: Mathematical analysis, Integral equation, Mathematics, Operator (computer programming), Rate of convergence, Numerical analysis, Tikhonov regularization, Least squares, Stability (learning theory), Convergence (routing)
Abstract: When deriving rates of convergence for the approximations generated by the application of Tikhonov regularization to ill-posed operator equations, assumptions must be made about the nature of the stabilization (i.e., the choice of the seminorm in the regularization) and the regularity of the least squares solutions which one looks for. In fact, it is clear from the works of Hegland, Engl and Neubauer, and Natterer that, in terms of the rate of convergence, there is a trade-off between stabilization and regularity. It is this matter which is examined in this paper by means of best-possible worst-error estimates. The results provide better estimates than those of Neubauer, and also include and extend the best possible rates derived by Natterer. The paper concludes with an application of these results to first-kind integral equations with smooth kernels.
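To make the setting concrete, the following is a minimal illustrative sketch (not the paper's analysis or experiments) of Tikhonov regularization applied to a discretized first-kind integral equation with a smooth kernel. The Gaussian kernel, grid size, noise level, first-difference seminorm, and regularization parameters below are arbitrary assumptions chosen for demonstration only.

```python
# Illustrative sketch: Tikhonov regularization for a discretized
# first-kind integral equation K x = y with a smooth (Gaussian) kernel.
# All problem data here are hypothetical choices, not from the paper.
import numpy as np

n = 200
t = np.linspace(0.0, 1.0, n)
h = t[1] - t[0]

# Smooth kernel -> severely ill-posed first-kind integral operator
K = h * np.exp(-(t[:, None] - t[None, :]) ** 2 / 0.02)

x_true = np.sin(np.pi * t)            # hypothetical "exact" solution
y_noisy = K @ x_true + 1e-3 * np.random.default_rng(0).standard_normal(n)

# Stabilizing seminorm ||L x||: L is a discrete first-derivative operator
L = (np.eye(n, k=1) - np.eye(n))[:-1] / h

def tikhonov(K, y, L, alpha):
    """Minimize ||K x - y||^2 + alpha * ||L x||^2 via the normal equations."""
    A = K.T @ K + alpha * (L.T @ L)
    return np.linalg.solve(A, K.T @ y)

for alpha in (1e-2, 1e-4, 1e-6):
    x_alpha = tikhonov(K, y_noisy, L, alpha)
    err = np.linalg.norm(x_alpha - x_true) / np.linalg.norm(x_true)
    print(f"alpha = {alpha:.0e}  relative error = {err:.3f}")
```

The choice of L here plays the role of the stabilizing seminorm discussed in the abstract; replacing it with a higher-order difference operator, or assuming more smoothness of x_true, changes the attainable error behavior, which is the trade-off the paper quantifies through worst-error estimates.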