Exploring the potential of artificial intelligence chatbots in prosthodontics education
Abstract Background The purpose of this study was to evaluate the performance of widely used artificial intelligence (AI) chatbots in answering prosthodontics questions from the Dentistry Specialization Residency Examination (DSRE). Methods A total of 126 DSRE prosthodontics questions were divided into seven subtopics (dental morphology, materials science, fixed dentures, removable partial dentures, complete dentures, occlusion/temporomandibular joint, and dental implantology). Questions were translated into English by the authors, and this version was presented to five chatbots (ChatGPT-3.5, Gemini Advanced, Claude Pro, Microsoft Copilot, and Perplexity) within a 7-day period. Statistical analyses, including chi-square and z-tests, were performed to compare accuracy rates across chatbots and subtopics at a significance level of 0.05. Results The overall accuracy rates for the chatbots were as follows: Copilot (73%), Gemini (63.5%), ChatGPT-3.5 (61.1%), Claude Pro (57.9%), and Perplexity (54.8%). Copilot significantly outperformed Perplexity (P = 0.035). However, no significant differences in accuracy were found across subtopics among chatbots. Questions on dental implantology had the highest accuracy rate (75%), while questions on removable partial dentures had the lowest (50.8%). Conclusion Copilot showed the highest accuracy rate (73%), significantly outperforming Perplexity (54.8%). AI models demonstrate potential as educational support tools but currently face limitations in serving as reliable educational tools across all areas of prosthodontics. Future advancements in AI may lead to better integration and more effective use in dental education.
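The pairwise comparison reported in the abstract (Copilot 73% vs. Perplexity 54.8% on the same 126 questions) corresponds to a standard two-proportion z-test. The sketch below is illustrative only: it assumes 92/126 and 69/126 correct answers (rounded from the reported percentages) and a pooled standard error, which need not match the authors' exact procedure — the paper reports P = 0.035, which may reflect a different test variant or multiplicity handling.

```python
import math

def two_proportion_ztest(k1, n1, k2, n2):
    """Two-sided z-test for equality of two proportions, pooled standard error."""
    p1, p2 = k1 / n1, k2 / n2
    pooled = (k1 + k2) / (n1 + n2)                      # pooled success rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided normal tail
    return z, p_value

# Assumed counts, rounded from the reported accuracy rates:
# Copilot 73% of 126 ≈ 92 correct; Perplexity 54.8% of 126 ≈ 69 correct.
z, p = two_proportion_ztest(92, 126, 69, 126)
```

With these assumed counts the unadjusted test also finds the difference significant at the 0.05 level, consistent with the paper's conclusion.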
| Main Authors: | Ravza Eraslan, Mustafa Ayata, Filiz Yagci, Haydar Albayrak |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | BMC, 2025-02-01 |
| Series: | BMC Medical Education |
| Subjects: | Prosthodontics education; Artificial intelligence applications; Dentistry specialization; AI chatbot evaluation; Clinical decision-support systems |
| Online Access: | https://doi.org/10.1186/s12909-025-06849-w |
| _version_ | 1849766780932718592 |
|---|---|
| author | Ravza Eraslan; Mustafa Ayata; Filiz Yagci; Haydar Albayrak |
| author_sort | Ravza Eraslan |
| collection | DOAJ |
| description | Abstract Background The purpose of this study was to evaluate the performance of widely used artificial intelligence (AI) chatbots in answering prosthodontics questions from the Dentistry Specialization Residency Examination (DSRE). Methods A total of 126 DSRE prosthodontics questions were divided into seven subtopics (dental morphology, materials science, fixed dentures, removable partial dentures, complete dentures, occlusion/temporomandibular joint, and dental implantology). Questions were translated into English by the authors, and this version was presented to five chatbots (ChatGPT-3.5, Gemini Advanced, Claude Pro, Microsoft Copilot, and Perplexity) within a 7-day period. Statistical analyses, including chi-square and z-tests, were performed to compare accuracy rates across chatbots and subtopics at a significance level of 0.05. Results The overall accuracy rates for the chatbots were as follows: Copilot (73%), Gemini (63.5%), ChatGPT-3.5 (61.1%), Claude Pro (57.9%), and Perplexity (54.8%). Copilot significantly outperformed Perplexity (P = 0.035). However, no significant differences in accuracy were found across subtopics among chatbots. Questions on dental implantology had the highest accuracy rate (75%), while questions on removable partial dentures had the lowest (50.8%). Conclusion Copilot showed the highest accuracy rate (73%), significantly outperforming Perplexity (54.8%). AI models demonstrate potential as educational support tools but currently face limitations in serving as reliable educational tools across all areas of prosthodontics. Future advancements in AI may lead to better integration and more effective use in dental education. |
| format | Article |
| id | doaj-art-bede5aba45e04db0888b1deb3c847953 |
| institution | DOAJ |
| issn | 1472-6920 |
| language | English |
| publishDate | 2025-02-01 |
| publisher | BMC |
| record_format | Article |
| series | BMC Medical Education |
| spelling | Ravza Eraslan (Department of Prosthodontics, Faculty of Dentistry, Erciyes University); Mustafa Ayata (Private Practice, Ortoperio Oral and Dental Health Polyclinic); Filiz Yagci (Department of Prosthodontics, Faculty of Dentistry, Erciyes University); Haydar Albayrak (Department of Prosthodontics, Faculty of Dentistry, Erciyes University). Exploring the potential of artificial intelligence chatbots in prosthodontics education. BMC Medical Education, 2025-02-01. https://doi.org/10.1186/s12909-025-06849-w |
| title | Exploring the potential of artificial intelligence chatbots in prosthodontics education |
| topic | Prosthodontics education; Artificial intelligence applications; Dentistry specialization; AI chatbot evaluation; Clinical decision-support systems |
| url | https://doi.org/10.1186/s12909-025-06849-w |