Authors: Florian Meier, Alexander Bazo, Manuel Burghardt, Christian Wolff
DOI: 10.1007/978-3-642-39253-5_27
Keywords: Test (assessment), World Wide Web, Web application, Laboratory test, Stress (linguistics), Asynchronous communication, Human–computer interaction, Information architecture, Crowdsourcing, Computer science
Abstract: We present a web-based tool for evaluating the information architecture of websites. The tool allows the use of crowdsourcing platforms like Amazon's MTurk as a means of recruiting test persons, and makes it possible to conduct asynchronous remote navigation stress tests (cf. Instone 2000). We also report on an evaluation study which compares our tool-based crowdsourced approach with a more traditional laboratory setting. The results of this comparison indicate that although there are interesting differences between the two testing approaches, both lead to similar results.