Evaluating the evaluation tools: methodological issues in the FAST project

Authors: Laura Hills, Chris Glover

DOI:

Keywords: Knowledge management, Science teaching, Student learning, Formative assessment, Principal (computer security), Open university, Quality (business), Project team, Engineering, The Conceptual Framework

Abstract: Assessment is now understood to be a key issue in influencing what and how students learn and, through the feedback they receive, their understanding and future learning. The Formative Assessment in Science Teaching (FAST) project is an FDTL-funded collaboration between the Open University and Sheffield Hallam University. The aims of the project are to: • investigate the impact of existing formative assessment practices on student learning behaviour; • develop, implement and evaluate new approaches to providing students with timely and useful feedback. The theoretical foundation of FAST is that there are 11 conditions under which assessment best supports learning (Gibbs & Simpson, 2004). Derived from a comprehensive literature review of theories and case studies of assessment, these conditions form the conceptual framework for the project and for the evaluation tools developed by the project team. This paper examines the usefulness of the principal tool used in the project: the Assessment Experience Questionnaire (AEQ). The AEQ, which has been used extensively in the project and increasingly by other institutions, was designed as a diagnostic instrument to help lecturers assess the extent to which students experience these conditions of assessment. It uses six scales of items, each addressing at least one of the conditions: 1. Time demands and distribution of effort 2. Assignments and learning 3. Quantity and timing of feedback 4. Quality of feedback 5. Student use of feedback 6. The examination. Drawing on interviews with lecturers and questionnaire findings gathered over three years, this paper discusses the practical application and limitations of the tool. Using comparisons, it also addresses methodological issues raised by the AEQ and suggests ways in which, used in conjunction with other methods, it can be a means of developing better assessment practices.

References (15)
Sherry Lee Linkon, "How Can Assessment Work for Us?" Academe, vol. 91, pp. 28–32 (2005). doi:10.2307/40253427
Roy Ballantyne, Jill Borthwick, Jan Packer, "Beyond Student Evaluation of Teaching: Identifying and Addressing Academic Staff Development Needs," Assessment & Evaluation in Higher Education, vol. 25, pp. 221–236 (2000). doi:10.1080/713611430
Peter Zeegers, "A Revision of the Biggs' Study Process Questionnaire (R-SPQ)," Higher Education Research & Development, vol. 21, pp. 73–92 (2002). doi:10.1080/07294360220124666
Katrien Struyven, Filip Dochy, Steven Janssens, "Students' Perceptions about Evaluation and Assessment in Higher Education: A Review," Assessment & Evaluation in Higher Education, vol. 30, pp. 325–341 (2005). doi:10.1080/02602930500099102
Kam-por Kwan, "How Fair Are Student Ratings in Assessing the Teaching Performance of University Teachers?" Assessment & Evaluation in Higher Education, vol. 24, pp. 181–195 (1999). doi:10.1080/0260293990240207
Mark Nichols, Nicky Gardner, "Evaluating Flexible Delivery across a Tertiary Institution," Open Learning: The Journal of Open and Distance Learning, vol. 17, pp. 11–22 (2002). doi:10.1080/02680510120110148
Evelyn Brown, Graham Gibbs, Chris Glover, "Evaluation Tools for Investigating the Impact of Assessment Regimes on Student Learning," Bioscience Education, vol. 2, pp. 1–7 (2003). doi:10.3108/BEEJ.2003.02000006
Patricia M. Lyon, Graham D. Hendry, "The Use of the Course Experience Questionnaire as a Monitoring Evaluation Tool in a Problem-based Medical Programme," Assessment & Evaluation in Higher Education, vol. 27, pp. 339–352 (2002). doi:10.1080/0260293022000001355