How are debriefing questions used in health discrete choice experiments? An online survey

Alison M Pearce (Corresponding Author), Brendan J Mulhern, Verity Watson, Rosalie Viney

Research output: Contribution to journal › Article

Abstract

Objectives: Debriefing questions can assess whether respondents understand discrete choice experiments (DCEs) and are answering in a way consistent with theories of decision making and utility maximization. However, there is limited literature about how often debriefing questions are included or how the results are used in health economics. The aim of this study was to conduct a survey of the frequency, type, and analysis of debriefing questions in health DCEs. Methods: We conducted an online survey of authors of published health DCEs, asking about their use of debriefing questions, including frequency, type, and analysis. We descriptively analyzed the sample characteristics and responses. Free-text questions were analyzed with qualitative thematic analysis. Results: We received 70 responses (43% response rate), of which 50% reported using debriefing questions. They were most commonly designed to assess difficulty (91%), understanding (49%), and attribute nonattendance (31%) rather than learning effects (3%) or monotonicity (11%). On average, 37% of debriefing questions were analyzed (range 0%-69%), and the results were used less than 50% of the time, usually to exclude respondents or interpret overall results. Researcher experience or confidence with DCEs did not affect their use of debriefing questions. Conclusions: These results suggest that although half of researchers conducting health DCEs use debriefing questions, many do not analyze, use, or report the responses. Given the additional respondent burden, there is a need for reliable and valid debriefing questions. In the meantime, the inclusion, analysis, and reporting of debriefing questions should be carefully considered before DCE implementation.

Original language: English
Journal: Value in Health
Early online date: 27 Nov 2019
DOI: 10.1016/j.jval.2019.10.001
Publication status: E-pub ahead of print, 27 Nov 2019

Keywords

  • debriefing questions
  • discrete choice experiment
  • patient preferences
  • survey methodology
  • theories of decision making
  • understanding

ASJC Scopus subject areas

  • Health Policy
  • Public Health, Environmental and Occupational Health

Cite this

How are debriefing questions used in health discrete choice experiments? An online survey. / Pearce, Alison M (Corresponding Author); Mulhern, Brendan J; Watson, Verity; Viney, Rosalie.

In: Value in Health, 27.11.2019.


@article{86ca4d87dcec4e189354afe78c83f364,
title = "How are debriefing questions used in health discrete choice experiments? An online survey",
abstract = "Objectives: Debriefing questions can assess if respondents understand discrete choice experiments (DCEs) and are answering in a way consistent with theories of decision making and utility maximization. Nevertheless, there is limited literature about how often debriefing questions are included or how the results are used in health economics. The aim of this study was to conduct a survey of the frequency, type, and analysis of debriefing questions in health DCEs. Methods: We conducted an online survey of authors of published health DCEs, asking about their use of debriefing questions, including frequency, type, and analysis. We descriptively analyzed the sample characteristics and responses. Free-text questions were analyzed with qualitative thematic analysis. Results: We received 70 responses (43{\%} response rate), of which 50{\%} reported using debriefing questions. They were most commonly designed to assess difficulty (91{\%}), understanding (49{\%}), and attribute nonattendance (31{\%}) rather than learning effects (3{\%}) or monotonicity (11{\%}). On average, 37{\%} of debriefing questions were analyzed (range, 0{\%} to 69{\%}), and the results were used <50{\%} of the time, usually to exclude respondents or interpret overall results. Researcher experience or confidence with DCEs did not affect their use of debriefing questions. Conclusions: These results suggest that although half of researchers conducting health DCEs use debriefing questions, many do not analyze, use, or report the responses. Given the additional respondent burden, there is a need for reliable and valid debriefing questions. In the meantime, the inclusion, analysis, and reporting of debriefing questions should be carefully considered before DCE implementation.",
keywords = "debriefing questions, discrete choice experiment, patient preferences, survey methodology, theories of decision making, understanding",
author = "Pearce, {Alison M} and Mulhern, {Brendan J} and Verity Watson and Rosalie Viney",
note = "We thank the respondents who completed our survey. This work was funded by a University of Technology Sydney Faculty of Business Research Grant 2017. Alison M. Pearce was supported by a University of Technology Sydney Chancellor’s Postdoctoral Research Fellowship.",
year = "2019",
month = "11",
day = "27",
doi = "10.1016/j.jval.2019.10.001",
language = "English",
journal = "Value in Health",
issn = "1098-3015",
publisher = "ACADEMIC PRESS INC ELSEVIER SCIENCE",
}

TY  - JOUR
T1  - How are debriefing questions used in health discrete choice experiments? An online survey
AU  - Pearce, Alison M
AU  - Mulhern, Brendan J
AU  - Watson, Verity
AU  - Viney, Rosalie
N1  - We thank the respondents who completed our survey. This work was funded by a University of Technology Sydney Faculty of Business Research Grant 2017. Alison M. Pearce was supported by a University of Technology Sydney Chancellor’s Postdoctoral Research Fellowship.
PY  - 2019/11/27
Y1  - 2019/11/27
N2  - Objectives: Debriefing questions can assess if respondents understand discrete choice experiments (DCEs) and are answering in a way consistent with theories of decision making and utility maximization. Nevertheless, there is limited literature about how often debriefing questions are included or how the results are used in health economics. The aim of this study was to conduct a survey of the frequency, type, and analysis of debriefing questions in health DCEs. Methods: We conducted an online survey of authors of published health DCEs, asking about their use of debriefing questions, including frequency, type, and analysis. We descriptively analyzed the sample characteristics and responses. Free-text questions were analyzed with qualitative thematic analysis. Results: We received 70 responses (43% response rate), of which 50% reported using debriefing questions. They were most commonly designed to assess difficulty (91%), understanding (49%), and attribute nonattendance (31%) rather than learning effects (3%) or monotonicity (11%). On average, 37% of debriefing questions were analyzed (range, 0% to 69%), and the results were used <50% of the time, usually to exclude respondents or interpret overall results. Researcher experience or confidence with DCEs did not affect their use of debriefing questions. Conclusions: These results suggest that although half of researchers conducting health DCEs use debriefing questions, many do not analyze, use, or report the responses. Given the additional respondent burden, there is a need for reliable and valid debriefing questions. In the meantime, the inclusion, analysis, and reporting of debriefing questions should be carefully considered before DCE implementation.
AB  - Objectives: Debriefing questions can assess if respondents understand discrete choice experiments (DCEs) and are answering in a way consistent with theories of decision making and utility maximization. Nevertheless, there is limited literature about how often debriefing questions are included or how the results are used in health economics. The aim of this study was to conduct a survey of the frequency, type, and analysis of debriefing questions in health DCEs. Methods: We conducted an online survey of authors of published health DCEs, asking about their use of debriefing questions, including frequency, type, and analysis. We descriptively analyzed the sample characteristics and responses. Free-text questions were analyzed with qualitative thematic analysis. Results: We received 70 responses (43% response rate), of which 50% reported using debriefing questions. They were most commonly designed to assess difficulty (91%), understanding (49%), and attribute nonattendance (31%) rather than learning effects (3%) or monotonicity (11%). On average, 37% of debriefing questions were analyzed (range, 0% to 69%), and the results were used <50% of the time, usually to exclude respondents or interpret overall results. Researcher experience or confidence with DCEs did not affect their use of debriefing questions. Conclusions: These results suggest that although half of researchers conducting health DCEs use debriefing questions, many do not analyze, use, or report the responses. Given the additional respondent burden, there is a need for reliable and valid debriefing questions. In the meantime, the inclusion, analysis, and reporting of debriefing questions should be carefully considered before DCE implementation.
KW  - debriefing questions
KW  - discrete choice experiment
KW  - patient preferences
KW  - survey methodology
KW  - theories of decision making
KW  - understanding
UR  - http://www.scopus.com/inward/record.url?scp=85076569190&partnerID=8YFLogxK
U2  - 10.1016/j.jval.2019.10.001
DO  - 10.1016/j.jval.2019.10.001
M3  - Article
JO  - Value in Health
JF  - Value in Health
SN  - 1098-3015
ER  -