
Responding to student expectations for assessments

Conference contribution, posted on 2006-05-25, authored by Richard A. Bacon
In a previous paper (Bacon 2003) the author described some of the results obtained from a survey of the use of the SToMP testing system for coursework assessment of a first-year Data Handling course within a Physics degree programme. This paper will deal with the modifications that have been made as a result of the student feedback from that trial, with a preliminary analysis of the feedback obtained from the students using the updated tests, and with a further trial of a more sophisticated response to the feedback.

The SToMP testing system was written to be a direct implementation of the IMS-QTI v1.2 specification, but includes several extensions for handling numeric questions of the type most frequently found within science and engineering. Such questions typically require a numeric answer to be judged by its precision (e.g. the number of significant figures) as well as its accuracy (i.e. whether the value falls within a specified range), and to recognise alternative forms of the same value and precision in scientific format. These features were mapped onto an extension of the QTI specification for ease of implementation, together with other features such as alternative number bases and the randomisation of values within questions.

The questionnaire used in the earlier paper has been used again with the current cohort of students, with an updated version of the testing system and with some questions modified in the light of earlier feedback. A major objection voiced by the students in the first trial was the inability to show working, and to get credit for correct working when the final numeric answer was incorrect. This has not been directly addressed within these tests, but an associated problem has been: that of multi-part questions, and the awarding of marks for later parts whose answers are incorrect but consistent with earlier errors. The paper will discuss some of the issues relevant to this feature, and how it was perceived by students.

Results will also be presented from a second study into how correct working for incorrect answers to numeric questions can, to some extent, be credited. This system allows numeric expressions to be entered by the students as well as their final numeric value. It has been implemented in a test for second-year students on a course concerned with radiation detectors. Clearly this is not being represented as a total solution to the problem, but it can be seen as a first step towards an acceptable solution. The student responses to this development could be of particular significance, since these are the same students whose responses were reported in the earlier paper and which led to the changes reported here.
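As a concrete illustration of the kind of judging these QTI extensions describe, the sketch below checks a numeric answer for both accuracy (a tolerance around the target value) and precision (the expected number of significant figures), accepting equivalent scientific-format forms of the same value. This is a minimal sketch in Python, not SToMP's actual implementation; the function names and the relative-tolerance rule are assumptions made for illustration.

    def significant_figures(answer: str) -> int:
        """Count significant figures in a numeric string such as '3.40e-2'."""
        mantissa = answer.lower().split('e')[0]          # drop any exponent
        digits = mantissa.replace('-', '').replace('+', '').replace('.', '')
        digits = digits.lstrip('0')                      # leading zeros never count
        if '.' not in mantissa:                          # trailing zeros count only
            digits = digits.rstrip('0')                  # when a decimal point is present
        return len(digits) if digits else 1

    def judge_numeric(answer: str, target: float, rel_tol: float, sig_figs: int) -> bool:
        """Accept an answer only if it is accurate (within a relative tolerance
        of the target) and precise (the expected number of significant figures).
        Scientific notation is handled by float()."""
        try:
            value = float(answer)
        except ValueError:
            return False
        accurate = abs(value - target) <= rel_tol * abs(target)
        precise = significant_figures(answer) == sig_figs
        return accurate and precise

    # '3.40e-2' and '0.0340' are the same value and precision in different formats.
    assert judge_numeric('3.40e-2', 0.034, 0.01, 3)
    assert judge_numeric('0.0340', 0.034, 0.01, 3)
    assert not judge_numeric('0.034', 0.034, 0.01, 3)    # only 2 sig. figs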
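The error-carried-forward marking for multi-part questions could be sketched as follows: a later part is judged against the correct method applied to the student's own earlier answer, rather than against the model value alone. Again, this is a hypothetical sketch rather than the SToMP algorithm; the mark scheme, the `derive_b` rule, and the detector-efficiency numbers are invented for illustration.

    def mark_two_part(part_a: float, part_b: float,
                      correct_a: float, derive_b, tol: float = 0.01) -> int:
        """Mark a two-part question with error carried forward: part (b)
        earns its mark if it is consistent with the student's own part (a)
        answer, even when part (a) itself was wrong."""
        def close(x, y):
            return abs(x - y) <= tol * max(abs(y), 1e-12)

        marks = 0
        if close(part_a, correct_a):
            marks += 1
        # Judge part (b) against the correct rule applied to the student's (a).
        if close(part_b, derive_b(part_a)):
            marks += 1
        return marks

    # Hypothetical example: (a) a count rate R, (b) activity A = R / efficiency.
    efficiency = 0.25
    marks = mark_two_part(part_a=120.0, part_b=480.0,    # part (a) is wrong...
                          correct_a=100.0,
                          derive_b=lambda a: a / efficiency)
    print(marks)  # 1: part (b) is consistent with the erroneous part (a)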
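The second study's expression entry could likewise be sketched as below: the student's expression is evaluated safely (no eval()) and compared both with the model answer and with the student's own stated value, so that internally consistent working can attract partial credit. The half-mark rule and all names here are invented for illustration; the paper does not specify how SToMP apportions credit.

    import ast
    import operator

    # Whitelisted operators for safely evaluating a plain arithmetic expression.
    _OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
            ast.Mult: operator.mul, ast.Div: operator.truediv,
            ast.Pow: operator.pow, ast.USub: operator.neg}

    def safe_eval(expr: str) -> float:
        """Evaluate an arithmetic expression using only whitelisted nodes."""
        def walk(node):
            if isinstance(node, ast.Expression):
                return walk(node.body)
            if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
                return float(node.value)
            if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
                return _OPS[type(node.op)](walk(node.left), walk(node.right))
            if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
                return _OPS[type(node.op)](walk(node.operand))
            raise ValueError('disallowed syntax')
        return walk(ast.parse(expr, mode='eval'))

    def credit_working(expression: str, stated: float, target: float,
                       tol: float = 0.01) -> float:
        """Hypothetical rule: full credit for a correct final value; half
        credit when the expression evaluates to the student's own (incorrect)
        stated value, i.e. the working shown is internally consistent."""
        def close(x, y):
            return abs(x - y) <= tol * max(abs(y), 1e-12)
        try:
            worked = safe_eval(expression)
        except (ValueError, SyntaxError):
            return 0.0
        if close(worked, target) and close(stated, target):
            return 1.0
        return 0.5 if close(worked, stated) else 0.0

    # The arithmetic slip is in the expression, but working and answer agree.
    print(credit_working('120 / 0.25', stated=480.0, target=400.0))  # 0.5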

History

School

  • University Academic and Administrative Support

Department

  • Professional Development

Research Unit

  • CAA Conference

File size

40157 bytes

Citation

BACON, R.A., 2004. Responding to student expectations for assessments. IN: Proceedings of the 8th CAA Conference, Loughborough: Loughborough University.

Publisher

© Loughborough University

Publication date

2004

Notes

This is a conference paper.

Language

  • en
