Authors: Sundeep Vaddadi, Chong U. Lee, John H. Hong, Onur C. Hamsici
DOI:
Keywords:
Abstract: A normalization process is implemented at a difference of scale space to completely or substantially reduce the effect that illumination changes have on feature/keypoint detection in an image. An image may be processed by progressively blurring it using a smoothening function to generate a smoothened scale space for the image. A difference of scale space may be generated by taking the difference between two differently smoothened versions of the image. A normalized difference of scale space may then be generated by dividing the difference of scale space by a third smoothened version of the image, where the third smoothened version is as smooth as or smoother than the smoothest of the two differently smoothened versions. The normalized difference of scale space may then be used to detect one or more features/keypoints for the image.
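The abstract's pipeline (blur at two scales, subtract, divide by a third, smoother blur) can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name `normalized_dog`, the Gaussian kernel choice, the specific sigma values, and the small epsilon added to the divisor are all assumptions made here for the sake of a runnable example.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def normalized_dog(image, sigma1, sigma2, sigma_norm, eps=1e-8):
    """Illustrative normalized difference of scale space.

    sigma1 < sigma2 define the two smoothened versions; sigma_norm >= sigma2
    so the normalizing image is as smooth as or smoother than the smoothest
    of the two. eps (an assumption, not from the source) guards against
    division by zero in dark regions.
    """
    assert sigma1 < sigma2 <= sigma_norm
    g1 = gaussian_filter(image, sigma1)        # first smoothened version
    g2 = gaussian_filter(image, sigma2)        # second, smoother version
    dog = g1 - g2                              # difference of scale space
    norm = gaussian_filter(image, sigma_norm)  # third, smoothest version
    return dog / (norm + eps)                  # normalized difference

# A uniform illumination change scales every pixel by a constant; since both
# the difference and the divisor scale by that same constant, the normalized
# result is (nearly) unchanged, which is the invariance the abstract claims.
rng = np.random.default_rng(0)
img = rng.uniform(0.1, 1.0, (32, 32))
base = normalized_dog(img, 1.0, 2.0, 2.0)
brighter = normalized_dog(4.0 * img, 1.0, 2.0, 2.0)
print(np.allclose(base, brighter, atol=1e-4))
```

Keypoints would then be detected as extrema of this normalized response rather than of the raw difference of Gaussians, so a brightness change alone does not push responses above or below a detection threshold.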