Use of an Online Crowdsourcing Platform to Assess Patient Comprehension of Radiology Reports and Colloquialisms

Abstract

Objective: The purpose of this study was to use an online crowdsourcing platform to assess patient comprehension of five radiology reporting templates and radiology colloquialisms. 

Materials and Methods: In this cross-sectional study, participants were surveyed as patient surrogates using a crowdsourcing platform. Two tasks were completed within two 48-hour time periods. For the first crowdsourcing task, each participant was randomly assigned a set of radiology reports in a constructed reporting template and subsequently tested for comprehension. For the second crowdsourcing task, each participant was randomly assigned a radiology colloquialism and asked to indicate whether the phrase indicated a normal, abnormal, or ambivalent finding. 

Results: A total of 203 participants enrolled for the first task and 1166 for the second, each within 48 hours of task publication. Total payment to participants was $31.96. Of 812 radiology reports read, 384 (47%) were correctly interpreted by the patient surrogates. Patient surrogates had higher rates of comprehension of reports written in the patient summary format (57%, p < 0.001) and in the traditional unstructured format combined with a patient summary (51%, p = 0.004) than in the traditional unstructured format alone (40%). Most of the patient surrogates (114/203 [56%]) expressed a preference for receiving a full radiology report via an electronic patient portal. Several radiology colloquialisms with modifiers such as "low," "underdistended," and "decompressed" had low rates of comprehension.

Conclusion: The crowdsourcing platform is an expeditious, cost-effective, and customizable tool for surveying laypeople in sentiment- or task-based research. Patient summaries can help increase patient comprehension of radiology reports. Radiology colloquialisms are likely to be misunderstood by patients.