Detecting Deception and Ensuring Data Integrity in a Nationwide mHealth Randomized Controlled Trial: Factorial Design Survey Study
Main Authors: Krista M Kezbers (ORCID: 0000-0002-8387-2781), Michael C Robertson (ORCID: 0000-0002-2240-014X), Emily T Hébert (ORCID: 0000-0001-5922-164X), Audrey Montgomery (ORCID: 0000-0002-3468-6800), Michael S Businelle (ORCID: 0000-0002-9038-2238)
Format: Article
Language: English
Published: JMIR Publications, 2025-01-01
Series: Journal of Medical Internet Research
ISSN: 1438-8871
DOI: 10.2196/66384
Online Access: https://www.jmir.org/2025/1/e66384
Collection: DOAJ
Description:
Background: Social behavioral research studies have increasingly shifted to remote recruitment and enrollment procedures. This shifting landscape necessitates evolving best practices to help mitigate the negative impacts of deceptive attempts (eg, fake profiles and bots) at enrolling in behavioral research.
Objective: This study aimed to develop and implement robust deception detection procedures during the enrollment period of a remotely conducted randomized controlled trial.
Methods: A 32-group (2×2×2×2×2) factorial design study was conducted from November 2021 to September 2022 to identify mobile health (mHealth) survey design features associated with the highest completion rates of smartphone-based ecological momentary assessments (n=485). Participants were required to be at least 18 years old, live in the United States, and own an Android smartphone compatible with the Insight app used in the study. Recruitment was conducted remotely through Facebook advertisements, a 5-minute REDCap (Research Electronic Data Capture) prescreener, and a screening and enrollment phone call. The research team created and implemented a 12-step checklist (eg, address verification and texting a copy of picture identification) to identify and prevent potentially deceptive attempts to enroll in the study. Descriptive statistics were calculated to understand the prevalence of various types of deceptive attempts at study enrollment.
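The abstract does not reproduce the 12-step checklist itself, but the two most common flag types reported below (multiple prescreener submissions and invalid addresses) lend themselves to simple scripted checks. The following is a minimal sketch of that idea, assuming a hypothetical prescreener export with record_id, email, ip_address, and street_address fields; the address heuristic is a crude stand-in for a real verification service, and manual checklist steps such as texting a copy of picture identification are not represented.

```python
import re
from collections import Counter
from dataclasses import dataclass


@dataclass
class Prescreener:
    """One prescreener submission (hypothetical export columns)."""
    record_id: str
    email: str
    ip_address: str
    street_address: str


def looks_like_us_address(address: str) -> bool:
    """Rough stand-in for the study's address verification step:
    require a street number, a street name, and a 5-digit ZIP code."""
    return bool(re.search(r"\d+\s+\w+.*\b\d{5}\b", address))


def flag_prescreeners(records: list[Prescreener]) -> dict[str, list[str]]:
    """Return {record_id: [reasons]} for submissions that look deceptive."""
    email_counts = Counter(r.email.lower() for r in records)
    ip_counts = Counter(r.ip_address for r in records)
    flags: dict[str, list[str]] = {}
    for r in records:
        reasons = []
        if email_counts[r.email.lower()] > 1 or ip_counts[r.ip_address] > 1:
            reasons.append("multiple prescreeners")
        if not looks_like_us_address(r.street_address):
            reasons.append("invalid address")
        if reasons:
            flags[r.record_id] = reasons
    return flags


# Example: the second submission reuses an email/IP and has no usable address.
records = [
    Prescreener("1001", "a@example.com", "203.0.113.5", "12 Main St, Tulsa, OK 74103"),
    Prescreener("1002", "a@example.com", "203.0.113.5", "n/a"),
]
print(flag_prescreeners(records))
# {'1001': ['multiple prescreeners'], '1002': ['multiple prescreeners', 'invalid address']}
```

In the study itself, automated checks like these were only part of the procedure; the checklist also relied on human review during the screening and enrollment phone call.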
Results: Facebook advertisements resulted in 5236 initiations of the REDCap prescreener. A digital deception detection procedure was implemented for those deemed pre-eligible (n=1928). This procedure resulted in 26% (501/1928) of prescreeners being flagged as potentially deceptive. Completing multiple prescreeners (301/501, 60.1%) and providing invalid addresses (156/501, 31.1%) were the most common reasons prescreeners were flagged. An additional 1% (18/1928) of prescreeners were flagged as potentially deceptive during the subsequent screening and enrollment phone call. Reasons for exclusion at the phone call stage included having an invalid phone type (6/18, 33.3%), completing multiple prescreeners (6/18, 33.3%), and providing an invalid address (5/18, 27.7%). In total, 1409 individuals remained eligible after all deception checks were completed. Postenrollment social security number checks revealed that 3 of the 485 fully enrolled participants (0.6%) had provided erroneous social security numbers during the screening process.
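As a quick arithmetic check, the proportions reported above follow directly from the counts in the abstract; the snippet below simply recomputes them and introduces no new data.

```python
# Recompute the reported proportions from the counts given in the Results.
reported = [
    ("flagged at prescreener stage", 501, 1928),
    ("  multiple prescreeners", 301, 501),
    ("  invalid address", 156, 501),
    ("flagged at screening/enrollment call", 18, 1928),
    ("erroneous SSN among enrolled", 3, 485),
]
for label, numerator, denominator in reported:
    print(f"{label}: {numerator}/{denominator} = {numerator / denominator:.1%}")
# flagged at prescreener stage: 501/1928 = 26.0%
#   multiple prescreeners: 301/501 = 60.1%
#   invalid address: 156/501 = 31.1%
# flagged at screening/enrollment call: 18/1928 = 0.9%
# erroneous SSN among enrolled: 3/485 = 0.6%
```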
Conclusions: Implementation of a deception detection procedure in a remotely conducted randomized controlled trial resulted in a substantial proportion of cases being flagged as potential deceptive attempts at study enrollment. These results confirm the need for vigilance in remote behavioral research to maintain data integrity. Implementing systematic deception detection procedures may support study administration, data quality, and participant safety in remotely conducted behavioral research.
Trial Registration: ClinicalTrials.gov NCT05194228; https://clinicaltrials.gov/study/NCT05194228