Respondent Understanding in Discrete Choice Experiments: A Scoping Review

Alison Pearce* (Corresponding Author), Mark Harrison, Verity Watson, Deborah J Street, Kirsten Howard, Nick Bansback, Stirling Bryan


Research output: Contribution to journal › Review article › peer-review

Abstract

INTRODUCTION: Despite the recognised importance of participant understanding for valid and reliable discrete choice experiment (DCE) results, there has been limited assessment of whether, and how, people understand DCEs, and how 'understanding' is conceptualised in DCEs applied to a health context.

OBJECTIVES: Our aim was to identify how participant understanding is conceptualised in the DCE literature in a health context. Our research questions addressed how participant understanding is defined, measured, and used.

METHODS: Searches were conducted (June 2019) in the MEDLINE, EMBASE, PsycINFO and EconLit databases, supplemented by hand searching. Search terms were based on previous DCE systematic reviews, with additional understanding-related keywords used in a proximity-based search strategy. Eligible studies were peer-reviewed journal articles in the field of health, related to DCE or best-worst scaling type 3 (BWS3) studies, and reporting some consideration or assessment of participant understanding. A descriptive analytical approach was used to chart relevant data from each study, including publication year, country, clinical area, subject group, sample size, study design, numbers of attributes, levels and choice sets, definition of understanding, how understanding was tested, results of the understanding tests, and how the information about understanding was used. Each study was categorised based on how understanding was conceptualised and used within the study.

RESULTS: Of 306 potentially eligible articles identified, 31 were excluded based on titles and abstracts, and 200 were excluded on full-text review, resulting in 75 included studies. Three categories of study were identified: applied DCEs (n = 52), pretesting studies (n = 7) and studies of understanding (n = 16). Typically, understanding was defined in relation to either the choice context, such as attribute terminology, or the concept of choosing. Very few studies considered respondents' engagement as a component of understanding. Understanding was measured primarily through qualitative pretesting, rationality or validity tests included in the survey, and participant self-report; however, reporting and use of the results of these methods was inconsistent.

CONCLUSIONS: Those conducting or using health DCEs should carefully select, justify, and report the measurement and potential impact of participant understanding in their specific choice context. There remains scope for research into the different components of participant understanding, particularly related to engagement, the impact of participant understanding on DCE validity and reliability, the best measures of understanding, and methods to maximise participant understanding.

Original language: English
Pages (from-to): 17-53
Number of pages: 37
Journal: The Patient - Patient-Centered Outcomes Research
Volume: 14
Early online date: 3 Nov 2020
DOIs
Publication status: Published - Jan 2021

Keywords

  • ASTHMA SERVICES
  • ATTRIBUTES
  • CANCER
  • CONJOINT-ANALYSIS
  • DESIGN
  • HEALTH
  • PATIENT PREFERENCES
  • RISK
  • THINK ALOUD
  • TYPE-2 DIABETES-MELLITUS

