Authors: Jérôme Euzenat, Christian Meilicke, Heiner Stuckenschmidt, Pavel Shvaiko, Cássia Trojahn
DOI: 10.1007/978-3-642-22630-4_6
Keywords:
Abstract: In the area of semantic technologies, benchmarking and systematic evaluation is not yet as established as in other areas of computer science, e.g., information retrieval. In spite of successful attempts, more effort and experience are required in order to achieve such a level of maturity. In this paper, we report results and lessons learned from the Ontology Alignment Evaluation Initiative (OAEI), a benchmarking initiative for ontology matching. The goal of this work is twofold: on the one hand, we document the state of the art in evaluating ontology matching methods and provide potential participants with a better understanding of the design and underlying principles of the OAEI campaigns. On the other hand, we report on the experiences gained in this particular area of semantic technologies to developers of other kinds of systems. For this purpose, we describe the evaluation methodology used in the campaigns in terms of datasets, evaluation criteria and workflows, provide a global view of the results of the campaigns carried out from 2005 to 2010, and discuss upcoming trends, both specific to ontology matching and generally relevant to semantic technologies. Finally, we argue that there is a need for further automation of benchmarking to shorten the feedback cycle for tool developers.