
Loughborough University Institutional Repository

Please use this identifier to cite or link to this item: https://dspace.lboro.ac.uk/2134/15754

Title: The need for evidence innovation in educational technology evaluation
Authors: King, Melanie
Dawson, Ray
Batmaz, Firat
Rothberg, Steve
Issue Date: 2014
Publisher: British Computer Society
Citation: KING, M ... et al., 2014. The need for evidence innovation in educational technology evaluation. IN: Uhomoibi, J. ... et al. (eds.) Proceedings of INSPIRE XIX: Global Issues in IT Education, Southampton, UK, 15 April 2014, pp. 9 - 23.
Abstract: More complex and chaotic methods are being adopted in the development of technology to enhance learning and teaching in higher education today in order to achieve innovation in teaching practice. However, because this type of development does not conform to a linear, process-driven order, it is notoriously difficult to evaluate its success as a holistic educational initiative. It is proposed that five factors impact on effective educational technology evaluation and contribute to insubstantial evidence of positive outcomes: premature timing; inappropriate software evaluation techniques and models; lack of shared understanding of the terminology or semantics of educational technology; the growing complexity of agile and open development; and the corporatisation of higher education. This paper suggests that it is no longer helpful for policy-makers to evaluate whether educational technology project outcomes were successful or unsuccessful; instead, they should use agile evaluation strategies to understand the impact of the product, process and outcomes in a changing context. It is no longer useful to ask, ‘did the software work?’ The key is for software developers and policy-makers to ask ‘what type of software works, in which conditions and for whom?’ To answer this, the software development community needs to adopt evaluation strategies from the social sciences. For example, realist evaluation supplies context-driven, evidence-based techniques, exploring outcomes that tend towards the social rather than the technical. It centres on the ‘mechanisms’, ‘contexts’ and ‘outcomes’ associated with an intervention and is a form of theory-driven evaluation in which the theory is the reasoning of its stakeholders, rooted in practitioner wisdom.
Description: This is a conference paper. It was presented at the 19th Annual INSPIRE conference.
Version: Accepted for publication
URI: https://dspace.lboro.ac.uk/2134/15754
ISBN: 978-0-9926958-2-8
Appears in Collections:Conference Papers and Presentations (Centre for Engineering and Design Education)

Files associated with this item:

File: final_inspire_april-14.pdf
Description: Accepted version
Size: 96.86 kB
Format: Adobe PDF


