
Development of a searchable database of veterinary MCQs with educational feedback, for independent learning

Conference contribution

Posted on 2009-05-01, authored by S. Head and C. Ogden
The OCTAVE Project aims to provide the students of the English veterinary schools with a database of Multiple Choice Questions (MCQs) to support their learning. The questions have been written by veterinary academic staff and practitioners, and contain educational feedback to aid the students’ understanding of the correct response. A recurring problem when assembling a database to serve multiple institutions is that the curriculum content and sequence are likely to differ, so it is essential that students can select the appropriate categories of questions to use. To make the database readily searchable, questions have been meta-tagged so that students from any institution can make selections in a defined subject area. The search tags for the questions are:

  • Stage of course, i.e. preclinical or clinical
  • Species
  • Body system
  • Discipline (scientific/clinical)
  • Sub-disciplines

The search occurs as each tag is chosen, and the number of questions available after each search is indicated. This allows students to decide whether they want to focus the search further or whether they are happy to be presented with all the questions in a certain area. Students can choose to attempt the selected questions in three different ways:

  • Assessment mode: only a correct/incorrect score is given
  • Assessment/Revision mode: a correct/incorrect indication and running total are given, with individual question feedback available after all questions have been attempted
  • Revision mode: instant feedback is given after attempting each question

Student activity is recorded, and students may retake a previous test or choose to review or retake only those questions which they previously answered incorrectly.

The feedback offered to the student has three components. If the chosen answer was incorrect, the feedback:

  • explains why that option is not correct
  • gives a hint towards the correct answer, but does NOT give the answer
  • provides a reference for further study

If the correct answer is chosen, the feedback:

  • confirms and reinforces that the answer IS correct
  • gives some further useful information (like icing on the cake)
  • provides a reference for further study

This form of feedback follows best educational practice in identifying deficiencies of logic, stimulating student reflection, and offering extra references and information as a “carrot” for completion.

The database may also be used by lecturers at each institution in similar modes, or to select questions for use in institutional assessments. The responses given by students to each question are recorded so that subsequent analysis can determine the effectiveness of the question, the frequency of choice of each distracter, and the level of difficulty of the question. This will allow lecturers to choose questions of known difficulty to present to students in formative examinations or to use in summative examinations.

A total of 3,072 questions have been authored on a Microsoft Word template, peer reviewed, assembled into an Excel spreadsheet, tagged and imported into the Speedwell 'WebQuest' database. Once testing is completed, the database will be made available over the web to authenticated veterinary students at each of the English veterinary schools.
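The tag-by-tag narrowing of the question bank and the later analysis of recorded responses can be illustrated roughly as follows. This is a minimal sketch in Python, not the Speedwell 'WebQuest' implementation: the MCQ record, the filter_questions and item_analysis functions, and all field names are hypothetical, introduced only to make the described behaviour concrete.

from collections import Counter
from dataclasses import dataclass, field

# Hypothetical, simplified record for a meta-tagged MCQ; the real OCTAVE
# questions live in the Speedwell 'WebQuest' database and will differ in detail.
@dataclass
class MCQ:
    qid: str
    stem: str
    options: dict                              # option letter -> option text
    correct: str                               # letter of the correct option
    tags: dict = field(default_factory=dict)   # e.g. {"stage": "clinical", "species": "equine"}

def filter_questions(questions, selections):
    """Narrow the question bank one tag at a time, reporting the number of
    questions still available after each selection, as the abstract describes."""
    remaining = list(questions)
    for tag, value in selections.items():      # dicts preserve insertion order
        remaining = [q for q in remaining if q.tags.get(tag) == value]
        print(f"{tag} = {value}: {len(remaining)} questions available")
    return remaining

def item_analysis(questions, responses):
    """Summarise recorded answers per question: facility (proportion answering
    correctly, a proxy for difficulty) and how often each distracter was chosen.
    'responses' is a list of (question id, chosen option) pairs."""
    summary = {}
    for q in questions:
        chosen = [opt for qid, opt in responses if qid == q.qid]
        if not chosen:
            continue
        counts = Counter(chosen)
        summary[q.qid] = {
            "facility": round(counts[q.correct] / len(chosen), 2),
            "distracter_counts": {opt: counts[opt] for opt in q.options if opt != q.correct},
        }
    return summary

For example, filter_questions(bank, {"stage": "clinical", "species": "equine", "body system": "cardiovascular"}) would print the running count of available questions after each tag is applied, mirroring the progressive search described above, while item_analysis supplies the facility and distracter-frequency figures a lecturer might use when choosing questions of known difficulty.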

History

School

  • University Academic and Administrative Support

Department

  • Professional Development

Research Unit

  • CAA Conference

Citation

HEAD, S. and OGDEN, C., 2006. Development of a searchable database of veterinary MCQs with educational feedback, for independent learning. IN: Danson, M. (ed.). 10th CAA International Computer Assisted Assessment Conference: Proceedings of the Conference on 4th and 5th July 2006 at Loughborough University. Loughborough: Loughborough University, pp. 213-224.

Publisher

© Loughborough University

Version

  • AM (Accepted Manuscript)

Publication date

2006

Notes

This is a conference paper.

ISBN

095395725X

Language

  • en
