Comparison of Answers between ChatGPT and Human Dieticians to Common Nutrition Questions

Background. More people than ever seek nutrition information from online sources. The chatbot ChatGPT has seen staggering popularity since its inception and may become a resource for nutrition information. However, the adequacy of ChatGPT in answering questions in the field of nutrition has not been investigated. Thus, the aim of this research was to investigate the competency of ChatGPT in answering common nutrition questions. Methods. Dieticians were asked to provide the nutrition questions they are most commonly asked, along with their own answers to them. We then posed the same questions to ChatGPT and sent both sets of answers to other dieticians (N = 18) or to nutritionists and experts in the domain of each question (N = 9), who graded them on scientific correctness, actionability, and comprehensibility. The grades were also averaged to give an overall score, and group means of the answers to each question were compared using permutation tests. Results. ChatGPT's grades were higher than the dieticians' for the overall score in five of the eight questions received. ChatGPT also scored higher on five occasions for scientific correctness, four for actionability, and five for comprehensibility. In contrast, none of the dieticians' answers had a higher average score than ChatGPT's for any of the questions, either overall or for any of the grading components. Conclusions. Our results suggest that ChatGPT can be used to answer nutrition questions frequently asked of dieticians, providing encouraging support for the role of chatbots in offering nutrition support.

Bibliographic Details
Main Authors: Daniel Kirk, Elise van Eijnatten, Guido Camps
Format: Article
Language:English
Published: Wiley 2023-01-01
Series:Journal of Nutrition and Metabolism
Online Access:http://dx.doi.org/10.1155/2023/5548684
Collection: DOAJ
Institution: Kabale University
ISSN: 2090-0732
Record ID: doaj-art-e9d6ca01e8d44eca98eb3a3921b1c20e
Author Affiliation: Division of Human Nutrition and Health (all authors)