Authors: B. Fischl, E.L. Schwartz
Keywords:
Abstract: In the machine vision community, multi-scale image enhancement and analysis have frequently been accomplished using a diffusion or equivalent process. Solving the linear diffusion PDE can be replaced by convolution with Gaussian kernels, as the Gaussian is the Green's function of such a system. In this paper we present a technique which obtains an approximate solution to a nonlinear diffusion process via an integral-equation analog of convolution. The kernel of the integral equation plays the same role that the Gaussian kernel does for the linear PDE, allowing the direct solution of the PDE at a specific time without requiring integration through intermediate times. We then use a learning technique to approximate the kernel for arbitrary input images. The result is an improvement in speed and noise-sensitivity, as well as providing a parallel algorithm.
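The linear case referred to in the abstract can be sketched numerically: because the Gaussian is the Green's function of the heat equation, convolving the input once with a Gaussian of variance 2t gives the same result as stepping the diffusion PDE through all intermediate times up to t. The 1-D demo below is a minimal illustration of that equivalence (the grid size, time, and box-shaped test signal are arbitrary choices, not taken from the paper):

```python
import numpy as np

# Linear heat equation u_t = u_xx on a periodic 1-D grid.
n, t = 256, 4.0
x = np.arange(n)
u0 = (np.abs(x - n / 2) < 20).astype(float)  # box-shaped test signal

# (a) Explicit Euler time-stepping through intermediate times
# (dt = 0.25 satisfies the stability bound dt <= dx^2 / 2 with dx = 1).
dt = 0.25
u = u0.copy()
for _ in range(int(t / dt)):
    u += dt * (np.roll(u, 1) - 2 * u + np.roll(u, -1))

# (b) One-shot convolution with the Gaussian Green's function,
# whose variance is sigma^2 = 2 * t.
sigma = np.sqrt(2 * t)
k = np.exp(-((x - n / 2) ** 2) / (2 * sigma ** 2))
k /= k.sum()  # unit mass, so total intensity is preserved
u_conv = np.real(np.fft.ifft(np.fft.fft(u0) * np.fft.fft(np.fft.ifftshift(k))))

# The two solutions agree up to discretization error.
print(np.max(np.abs(u - u_conv)))
```

The paper's contribution concerns the *nonlinear* case, where no such closed-form Green's function exists; there the single-convolution role of the Gaussian is taken over by a learned integral-equation kernel.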