Chatbot-assisted self-assessment (CASA): Co-designing an AI-powered behaviour change intervention for ethnic minorities.

Background: The digitalisation of healthcare has provided new ways to address disparities in sexual health outcomes that particularly affect ethnic and sexual minorities. Conversational artificial intelligence (AI) chatbots can provide personalised health education and refer users for appropriate medical consultations. We aimed to explore design principles of a chatbot-assisted, culturally sensitive self-assessment intervention based on the disclosure of health-related information.

Methods: In 2022, an online survey was conducted among an ethnically diverse UK sample (N = 1,287) to identify the level and type of health-related information disclosed to sexual health chatbots, and reactions to chatbots' risk appraisal. Follow-up interviews (N = 41) further explored perceptions of chatbot-led health assessment to identify aspects related to acceptability and utilisation. Datasets were analysed using one-way ANOVAs, linear regression, and thematic analysis.

Results: Participants had neutral-to-positive attitudes towards chatbots and were comfortable disclosing demographic and sensitive health information. Chatbot awareness, previous experience, and positive attitudes towards chatbots predicted information disclosure. Qualitatively, four main themes were identified: "Chatbot as an artificial health advisor", "Disclosing information to a chatbot", "Ways to facilitate trust and disclosure", and "Acting on self-assessment".

Conclusion: Chatbots were acceptable for health self-assessment among this sample of ethnically diverse individuals. Most users reported being comfortable disclosing sensitive and personal information, but user anonymity is key to engagement with chatbots. As this technology becomes more advanced and widely available, chatbots could become supplementary tools for health education and screening eligibility assessment. Future research is needed to establish their impact on screening uptake and access to health services among minoritised communities.
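
The abstract reports that the survey data were analysed with one-way ANOVAs and linear regression, with chatbot awareness, previous experience, and attitudes predicting information disclosure. As a rough illustration only, a minimal Python sketch of that kind of analysis appears below; the variable names and synthetic data are hypothetical stand-ins and are not the study's dataset or the authors' actual code.

# Illustrative sketch only: mirrors the analyses named in the abstract
# (one-way ANOVA and linear regression predicting disclosure).
# All column names and values below are hypothetical, not the study's data.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "disclosure": rng.normal(3.5, 0.8, n),           # willingness-to-disclose score
    "attitude": rng.normal(3.2, 0.9, n),             # attitude towards chatbots
    "awareness": rng.integers(0, 2, n),              # prior chatbot awareness (0/1)
    "experience": rng.integers(0, 2, n),             # prior chatbot use (0/1)
    "ethnic_group": rng.choice(["A", "B", "C"], n),  # placeholder group labels
})

# One-way ANOVA: does mean disclosure differ across the placeholder groups?
groups = [g["disclosure"].to_numpy() for _, g in df.groupby("ethnic_group")]
f_stat, p_val = stats.f_oneway(*groups)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_val:.3f}")

# Linear regression: awareness, experience, and attitude as predictors of disclosure
model = smf.ols("disclosure ~ awareness + experience + attitude", data=df).fit()
print(model.summary())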

Bibliographic Details
Main Authors: Tom Nadarzynski, Nicky Knights, Deborah Husbands, Cynthia Graham, Carrie D Llewellyn, Tom Buchanan, Ian Montgomery, Alejandra Soruco Rodriguez, Chimeremumma Ogueri, Nidhi Singh, Evan Rouse, Olabisi Oyebode, Ankit Das, Grace Paydon, Gurpreet Lall, Anathoth Bulukungu, Nur Yanyali, Alexandra Stefan, Damien Ridge
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2025-02-01
Series: PLOS Digital Health
ISSN: 2767-3170
Online Access: https://doi.org/10.1371/journal.pdig.0000724