Evaluating an AI Chatbot “Prostate Cancer Info” for Providing Quality Prostate Cancer Screening Information: Cross-Sectional Study
| Main Authors: | Otis L Owens, Michael S Leonard |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | JMIR Publications, 2025-05-01 |
| Series: | JMIR Cancer |
| Online Access: | https://cancer.jmir.org/2025/1/e72522 |
| _version_ | 1849763327154061312 |
|---|---|
| author | Otis L Owens Michael S Leonard |
| author_facet | Otis L Owens Michael S Leonard |
| author_sort | Otis L Owens |
| collection | DOAJ |
| description |
Abstract
Background: Generative artificial intelligence (AI) chatbots may be useful tools for supporting shared prostate cancer (PrCA) screening decisions, but the information produced by these tools sometimes lacks quality or credibility. “Prostate Cancer Info” is a custom GPT chatbot developed to provide plain-language PrCA information drawn only from the websites of key cancer authorities and from peer-reviewed literature.
Objective: The objective of this paper was to evaluate the accuracy, completeness, and readability of Prostate Cancer Info’s responses to frequently asked PrCA screening questions.
Methods: A total of 23 frequently asked PrCA questions were individually input into Prostate Cancer Info. Responses were recorded in Microsoft Word and reviewed by 2 raters for accuracy and completeness. Readability was determined by pasting each response into a web-based Flesch-Kincaid Reading Ease score calculator.
Results: Responses to all questions were accurate and culturally appropriate. In total, 17 of the 23 questions (74%) received complete responses. The mean readability score of responses was 64.5 (SD 8.7), corresponding to an 8th-grade reading level.
Conclusions: Generative AI chatbots such as Prostate Cancer Info are useful starting points for learning about PrCA screening and for preparing men to engage in shared decision-making, but they should not be used as independent sources of PrCA information because key information may be omitted. Men are encouraged to use these tools to complement information received from a health care provider. |
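The readability metric used in the Methods, the Flesch Reading Ease score, follows a standard published formula based on sentence length and syllable counts. The sketch below is an illustrative Python approximation of that formula; it is not the web-based calculator the authors used, and the syllable counter is a rough vowel-group heuristic (real calculators typically use pronunciation dictionaries).

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count groups of consecutive vowels, then
    # drop one for a trailing silent "e". Real tools use dictionaries.
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_reading_ease(text: str) -> float:
    # Standard formula: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words).
    # Scores of 60-70 correspond roughly to an 8th- to 9th-grade reading level.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

A simple sentence such as "The cat sat on the mat." scores well above 100, while dense clinical prose scores far lower, which is why the chatbot's mean score of 64.5 indicates plain, 8th-grade-level language.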
| format | Article |
| id | doaj-art-4f47a51a5ccf496ea773a730f8b5f903 |
| institution | DOAJ |
| issn | 2369-1999 |
| language | English |
| publishDate | 2025-05-01 |
| publisher | JMIR Publications |
| record_format | Article |
| series | JMIR Cancer |
| spelling | doaj-art-4f47a51a5ccf496ea773a730f8b5f903 2025-08-20T03:05:26Z eng JMIR Publications JMIR Cancer 2369-1999 2025-05-01 11 e72522 10.2196/72522 https://cancer.jmir.org/2025/1/e72522 |
| spellingShingle | Otis L Owens Michael S Leonard Evaluating an AI Chatbot “Prostate Cancer Info” for Providing Quality Prostate Cancer Screening Information: Cross-Sectional Study JMIR Cancer |
| title | Evaluating an AI Chatbot “Prostate Cancer Info” for Providing Quality Prostate Cancer Screening Information: Cross-Sectional Study |
| title_full | Evaluating an AI Chatbot “Prostate Cancer Info” for Providing Quality Prostate Cancer Screening Information: Cross-Sectional Study |
| title_fullStr | Evaluating an AI Chatbot “Prostate Cancer Info” for Providing Quality Prostate Cancer Screening Information: Cross-Sectional Study |
| title_full_unstemmed | Evaluating an AI Chatbot “Prostate Cancer Info” for Providing Quality Prostate Cancer Screening Information: Cross-Sectional Study |
| title_short | Evaluating an AI Chatbot “Prostate Cancer Info” for Providing Quality Prostate Cancer Screening Information: Cross-Sectional Study |
| title_sort | evaluating an ai chatbot prostate cancer info for providing quality prostate cancer screening information cross sectional study |
| url | https://cancer.jmir.org/2025/1/e72522 |
| work_keys_str_mv | AT otislowens evaluatinganaichatbotprostatecancerinfoforprovidingqualityprostatecancerscreeninginformationcrosssectionalstudy AT michaelsleonard evaluatinganaichatbotprostatecancerinfoforprovidingqualityprostatecancerscreeninginformationcrosssectionalstudy |