Distributed usability evaluation

Authors: Lars Christensen, Erik Frøkjær

DOI: 10.1145/1868914.1868932

Keywords:

Abstract: We present DUE (Distributed Usability Evaluation), a technique for collecting and evaluating usability data. The infrastructure involves a client-server network. A client-based tool resides on the workstation of each user, providing screen video recording, voice commentary via microphone input, and a window for severity rating. The idea is for the user to work naturalistically, clicking a button when a problem or point of uncertainty is encountered, describing it verbally along with the illustrating screen, and rating its severity. These incidents are accumulated on a server, with access for an evaluator (a usability expert) and for product developers and managers who want to review and analyse them. DUE supports evaluation in development stages from running prototypes onwards. A case study of the use of DUE in a corporate environment is presented. It indicates that DUE is effective in terms of low bias, high efficiency, and clear communication of usability issues among users, evaluators, and developers. Further, DUE supports long-term evaluations, making empirical studies of learnability possible.
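
To make the client-server flow concrete, below is a minimal sketch of how a DUE-style incident report might be represented and submitted to the evaluation server. The field names, severity scale, and the endpoint URL (https://due.example.org/incidents) are illustrative assumptions; the paper does not specify its actual data model or transport.

```python
# Minimal sketch of a DUE-style incident report flow. The endpoint URL and
# all field names are hypothetical assumptions for illustration only.
from dataclasses import dataclass, asdict
import json
import urllib.request


@dataclass
class UsabilityIncident:
    """One user-reported incident: a problem or point of uncertainty."""
    user_id: str
    timestamp: str            # ISO 8601, when the user clicked the report button
    severity: int             # user's own rating, e.g. 1 (minor) .. 4 (critical)
    voice_note_path: str      # microphone commentary recorded at report time
    screen_video_path: str    # screen recording illustrating the incident
    window_title: str         # active window when the incident was reported


def submit_incident(incident: UsabilityIncident, server_url: str) -> int:
    """Send the incident metadata to the evaluation server, where evaluators,
    developers, and managers can later review and analyse it."""
    payload = json.dumps(asdict(incident)).encode("utf-8")
    request = urllib.request.Request(
        server_url,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status


if __name__ == "__main__":
    incident = UsabilityIncident(
        user_id="user-17",
        timestamp="2010-03-02T10:15:00Z",
        severity=3,
        voice_note_path="recordings/incident-042.wav",
        screen_video_path="recordings/incident-042.avi",
        window_title="Order Entry - Main Form",
    )
    # Hypothetical server address; a real deployment would use its own.
    print(submit_incident(incident, "https://due.example.org/incidents"))
```

The key design point the sketch reflects is that the client captures the incident in the user's own terms (voice note, screen recording, severity) at the moment it occurs, and the server only accumulates these reports for later review, keeping the user's workflow uninterrupted.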
