Authors: L. Dittmar, W. Stürzl, E. Baird, N. Boeddeker, M. Egelhaaf
DOI: 10.1242/JEB.043737
Keywords:
Abstract: Visual landmarks guide humans and animals, including insects, to a goal location. Insects, with their miniature brains, have evolved a simple strategy to find their nests or profitable food sources; they approach a goal by finding a close match between the current view and a memorised retinotopic representation of the landmark constellation around the goal. Recent implementations of such a matching scheme use raw panoramic images (‘image matching’) and show that it is well suited to work on robots, even in natural environments. However, this scheme works only if the relevant landmarks can be detected by contrast and texture. Therefore, we tested how well honeybees perform in localising a goal if the landmarks can hardly be distinguished from the background by such cues. We recorded the honeybees’ flight behaviour with high-speed cameras and compared the search behaviour with computer simulations. We show that honeybees are able to use landmarks that have the same texture as the background and suggest that the bees use relative motion cues between the landmarks and the background. These cues are generated on the eyes when the bee moves in a characteristic way in the vicinity of the landmarks. This extraordinary navigation performance can be explained by a matching scheme that includes snapshots based on optic flow amplitudes (‘optic flow matching’). This new matching scheme provides a robust strategy for navigation, as it depends primarily on the depth structure of the environment.
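The contrast between the two matching schemes named in the abstract can be sketched conceptually in a few lines of code. The following Python snippet is a minimal, hypothetical illustration, not the authors' model: `image_mismatch` compares a memorised panoramic brightness snapshot with the current view (raw ‘image matching’), while `flow_mismatch` compares snapshots of optic flow amplitudes, here approximated crudely by temporal brightness change. All function names, the synthetic panoramas and the noise levels are assumptions introduced for illustration only.

```python
# Hypothetical sketch contrasting 'image matching' and 'optic flow matching'.
# Synthetic data only; this does not reproduce the paper's experiments or models.
import numpy as np

def image_mismatch(current_view: np.ndarray, goal_snapshot: np.ndarray) -> float:
    """RMS pixel difference between the current panoramic view and the
    brightness snapshot memorised at the goal (raw 'image matching')."""
    return float(np.sqrt(np.mean((current_view - goal_snapshot) ** 2)))

def flow_amplitude(prev_view: np.ndarray, curr_view: np.ndarray) -> np.ndarray:
    """Crude per-pixel proxy for optic flow amplitude: absolute temporal
    brightness change between two consecutive panoramic views. A real model
    would estimate local image motion, e.g. with gradient- or correlation-
    based motion detectors."""
    return np.abs(curr_view - prev_view)

def flow_mismatch(current_flow: np.ndarray, goal_flow_snapshot: np.ndarray) -> float:
    """RMS difference between the current optic flow amplitude map and the
    flow amplitude snapshot memorised near the goal ('optic flow matching').
    Because flow amplitude depends on object distance, this measure reflects
    the depth structure of the scene rather than its texture."""
    return float(np.sqrt(np.mean((current_flow - goal_flow_snapshot) ** 2)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pano_shape = (36, 180)          # coarse panorama: elevation x azimuth bins
    goal_view = rng.random(pano_shape)
    goal_flow = rng.random(pano_shape)

    # A view taken away from the goal: if a nearby landmark shares the same
    # texture as the background, the brightness image barely changes ...
    away_view = goal_view + 0.01 * rng.standard_normal(pano_shape)
    # ... but the landmark's proximity produces much larger image motion.
    away_flow = goal_flow + 0.5 * rng.standard_normal(pano_shape)

    print("image mismatch:", image_mismatch(away_view, goal_view))
    print("flow  mismatch:", flow_mismatch(away_flow, goal_flow))
```

In this toy setup the brightness-based mismatch stays near zero while the flow-based mismatch grows, which is one way to picture why a matching scheme built on optic flow amplitudes can localise landmarks that are camouflaged against the background.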