Tablet versus paper marking in assessment: feedback matters

Alan Denison (Corresponding Author), Emily Bate, Jessica Thompson

Research output: Contribution to journal › Article

11 Citations (Scopus)
3 Downloads (Pure)

Abstract

Background

The Objective Structured Clinical Examination (OSCE) is a cornerstone of healthcare assessment. Tablet devices have been proposed as a potential tool for providing learner-centred feedback on a large scale by recording OSCE marks electronically, moving away from the traditional paper-based checklist.

Methods

Examiner-recorded comments were collated from successive first-year formative and summative OSCE examinations, with paper-based checklists used in 2012 and iPad-based checklists used in 2013. There were 558 and 498 examiner-candidate interactions in the January OSCE examinations, and 1402 and 1344 in the May OSCE examinations, in 2012 and 2013 respectively. Examiner comments were analysed for quantity and quality, and a tool was developed and validated to assess the quality of the comments left by examiners for use as feedback (kappa = 0.625).
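The abstract does not state which agreement statistic underlies the quoted kappa; assuming the commonly used Cohen's kappa for agreement between two raters, the validation figure can be read against the standard formula

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of agreement between the raters and p_e is the proportion of agreement expected by chance. On the widely cited Landis and Koch benchmarks, the reported value of 0.625 would fall in the 'substantial' agreement band (0.61-0.80).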

Results

A direct comparison of paper-based and iPad-recorded examinations showed an increase in the quantity of comments left, from 41 % to 51 % (+ 10 percentage points). Furthermore, there was a larger increase in the number of comments left for students deemed ‘borderline’ (+ 22 %). The quality of comments for use as feedback also improved significantly (p < 0.001) between paper-recorded and iPad-recorded examinations.

Conclusions

iPad-marked examinations resulted in a greater quantity and higher quality of examiner comments for use as feedback, particularly for students performing less well, enabling tutors to direct further learning for these students.

Original language: English
Pages (from-to): 108-113
Number of pages: 6
Journal: Perspectives on Medical Education
Volume: 5
Issue number: 2
Early online date: 14 Mar 2016
DOIs: https://doi.org/10.1007/s40037-016-0262-8
Publication status: Published - 30 Apr 2016

Keywords

  • Assessment
  • Feedback
  • OSCE
  • Technology
  • Undergraduate medical education

ASJC Scopus subject areas

  • Education
  • Medicine (all)

Cite this

Denison, A., Bate, E., & Thompson, J. (2016). Tablet versus paper marking in assessment: feedback matters. Perspectives on Medical Education, 5(2), 108-113. https://doi.org/10.1007/s40037-016-0262-8

Note: University of Aberdeen Medical Education Summer Teaching Bursary.