According to Gram Research analysis, ChatGPT helped medical students create the most complete and well-organized counseling plans for childhood obesity, scoring highest on a standardized rubric; Google Search and instructional videos were moderately effective, and scientific papers were the least helpful. A 2026 study of 359 medical students found that all four information sources successfully taught students about childhood obesity and the Planetary Health Diet, but AI tools organized information most effectively when combined with teacher guidance on critical evaluation.
Researchers studied how 359 medical students learned about childhood obesity and sustainable eating using different information sources—ChatGPT, Google Search, scientific papers, and instructional videos. Students worked in small groups to develop counseling approaches for families dealing with childhood obesity. The study found that all sources helped students learn, but ChatGPT provided the most complete and organized information, while scientific papers were the least helpful. The research shows that teaching communication skills alongside nutrition knowledge helps prepare future doctors to have meaningful conversations with families about weight and health.
Key Statistics
- A 2026 proof-of-concept study of 359 second-year medical students found that ChatGPT achieved the highest conformity scores with ideal counseling solutions for childhood obesity, followed by Google Search and instructional videos, while scientific papers achieved the lowest scores.
- According to research reviewed by Gram, before taking a communication-based course, most of the 359 medical students reported limited knowledge of the Planetary Health Diet and little practical experience counseling children with obesity and their families.
- A 2026 study of 359 medical students found that all four information sources—ChatGPT, Google Search, scientific papers, and instructional videos—successfully enabled students to acquire relevant knowledge on childhood obesity and the Planetary Health Diet, though quality and depth varied significantly.
- Research from 2026 involving 359 medical students showed that ChatGPT not only achieved the highest scores but also provided the most additional relevant information beyond basic requirements compared to other information sources.
The Quick Take
- What they studied: How different ways of finding information (ChatGPT, Google, research papers, and videos) help medical students learn about childhood obesity and the Planetary Health Diet, which connects personal health with environmental sustainability.
- Who participated: 359 second-year medical students at a university during the 2023-2024 school year. Students were divided into small groups, with each group using only one assigned information source.
- Key finding: ChatGPT helped students create the most complete and well-organized counseling plans, followed by Google Search and videos. Scientific papers were the least effective tool. However, all sources successfully taught students about childhood obesity and sustainable eating.
- What it means for you: Medical schools may benefit from using AI tools and digital resources to teach complex topics, but teachers need to guide students on how to evaluate information critically and understand that real patient situations are more complicated than any single source suggests.
The Research Details
This was a proof-of-concept study, which means researchers tested a new teaching approach to see if it worked before using it more widely. The study took place during a mandatory communication class for second-year medical students. Before the class started, students answered questions about what they already knew about childhood obesity and the Planetary Health Diet. Then they received a brief lesson on the topic. Next, students were divided into small groups, and each group was assigned to use only one information source: ChatGPT (an AI chatbot), Google Search, scientific research papers, or instructional videos. Using their assigned source, groups prepared a counseling conversation they would have with a family dealing with childhood obesity. Researchers then scored each group’s work using a detailed rubric based on an ideal answer, and they also analyzed the content of what students wrote to identify themes and patterns.
This research approach is important because it directly compares how different information sources affect what medical students learn and how they apply that knowledge. By having students work on a realistic scenario (counseling a family), researchers could see not just whether students learned facts, but whether they could actually use that knowledge in a practical situation. This matters because future doctors need both knowledge and communication skills to help families with childhood obesity.
This study has several strengths: it included a large number of students (359), used a standardized scenario so all groups faced the same challenge, and combined two types of analysis (scoring and content analysis) to understand results. However, this was a proof-of-concept study, meaning it was designed to test whether the teaching approach works, not to provide final answers. The study was conducted at one university during one academic year, so results may not apply everywhere. Students were all second-year medical students, so findings may differ for students at other training levels. The study didn’t follow students long-term to see if they remembered what they learned or if it changed how they practice medicine.
What the Results Show
All four information sources—ChatGPT, Google Search, scientific papers, and instructional videos—successfully helped students learn about childhood obesity and the Planetary Health Diet. However, the quality and completeness of what students learned varied significantly depending on which source they used. The ChatGPT group achieved the highest scores on the standardized rubric, meaning their counseling plans were most similar to the ideal answer. These students also provided the most additional relevant information beyond the basic requirements. The Google Search group came in second, followed by the video group. The scientific papers group scored the lowest, suggesting that research papers alone were the least effective tool for this learning task. Despite these differences, all groups demonstrated that they had acquired meaningful knowledge about the topic and could apply it to a realistic counseling scenario.
Before taking the course, most students reported having limited knowledge about the Planetary Health Diet, which connects eating choices to environmental sustainability. Students also said they had little practical experience counseling children with obesity and their families. This suggests that communication-based teaching—where students practice real conversations—fills an important gap in medical education. The study also found that students using different sources structured their information differently. ChatGPT users tended to organize information in a more logical, step-by-step way that matched the ideal solution. This suggests that AI tools may help students not just find information, but organize it in ways that are useful for patient care.
This study builds on existing research showing that communication skills are crucial for doctors treating chronic conditions like obesity. Previous studies have shown that patients are more likely to follow health advice when doctors communicate with sensitivity and understanding. This research adds to that body of knowledge by showing that medical students can learn both the facts about nutrition and the communication skills needed to discuss weight with families. The study also reflects a growing trend in medical education: using AI tools and digital resources alongside traditional teaching methods. However, this appears to be one of the first studies directly comparing how ChatGPT, Google Search, papers, and videos affect medical student learning about a specific topic.
This study has several important limitations. First, it only measured what students learned during one class session, not whether they remembered the information weeks or months later. Second, the study didn’t follow students into their medical careers to see if this teaching approach actually changed how they treat patients. Third, all students were second-year medical students at one university, so results may not apply to students at other schools or at different training levels. Fourth, the study didn’t examine whether students actually understood the limitations of each information source or whether they could critically evaluate the quality of information they found. Finally, the study was conducted in 2023-2024, and AI tools like ChatGPT are changing rapidly, so results may not apply to newer versions of these tools.
The Bottom Line
Medical schools should consider using communication-based teaching to help students learn about complex topics like childhood obesity and sustainable nutrition. When using digital tools like ChatGPT and Google Search, teachers should provide explicit guidance on how to evaluate information quality and should encourage students to think critically about what they find. Teachers should also help students understand that real patient situations are more complicated than any single information source suggests. While AI tools like ChatGPT can be helpful for organizing information, they should not replace human teaching or critical thinking. (Confidence level: Moderate—this is a proof-of-concept study, so more research is needed before making major changes to medical education.)
Medical schools and educators should care about these findings because they suggest new ways to teach complex topics. Medical students should care because they may benefit from using multiple information sources and understanding how different tools can help them learn. Future patients should care because better-trained doctors who can communicate sensitively about weight and nutrition may provide better care. Healthcare administrators should care because this research suggests that digital tools can make medical education more effective. However, these findings are most relevant to medical education in developed countries with access to these digital tools.
Students showed improved knowledge and skills after just one class session, suggesting that communication-based teaching can produce relatively quick results. However, the study didn’t measure long-term retention, so it’s unclear how long students will remember this information. To see lasting changes in how doctors communicate with patients about obesity, follow-up studies tracking students into their medical careers would be needed. Realistically, medical schools should expect that one class session introduces students to these topics, but ongoing practice and reinforcement throughout medical training would be necessary for lasting skill development.
Frequently Asked Questions
Is ChatGPT good for learning about medical topics like childhood obesity?
A 2026 study of 359 medical students found that ChatGPT helped them create more complete and well-organized information about childhood obesity compared to Google Search, videos, or scientific papers. However, experts recommend using ChatGPT alongside other sources and teacher guidance to ensure information is accurate and applicable to real patients.
What’s the best way to teach medical students about obesity and nutrition?
Research suggests that communication-based teaching—where students practice real counseling conversations—effectively helps medical students learn about obesity and sustainable nutrition. Combining this with access to multiple information sources (AI tools, search engines, videos, and papers) and explicit guidance on evaluating information quality may produce the best learning outcomes.
Why do medical students need to learn about the Planetary Health Diet?
The Planetary Health Diet connects individual health with environmental sustainability, making it relevant for modern medicine. A 2026 study found that most medical students had limited knowledge of this approach before training, yet it’s increasingly important for doctors to understand how food choices affect both personal health and the environment.
Can AI tools replace traditional medical education?
A 2026 study of 359 medical students suggests that AI tools like ChatGPT can help organize and present information effectively, but they work best when combined with teacher guidance, critical evaluation of sources, and practice with realistic patient scenarios. AI tools should supplement, not replace, human teaching and critical thinking.
How much do medical students know about counseling families with obesity?
A 2026 study of 359 second-year medical students found that before taking a communication course, most reported little practical experience counseling children with obesity and their families, highlighting the need for early training in both nutrition knowledge and communication skills.
Want to Apply This Research?
- Track which information sources you use when learning about health topics and rate how helpful each source was (1-10 scale). Note whether you felt the information was well-organized, complete, and practical for real-world situations.
- When researching a health topic, use at least two different information sources (such as AI tools, search engines, and scientific articles) and compare what you learn. Discuss what you find with someone else to check if the information makes sense and is practical.
- Over one month, keep a log of health topics you research, which sources you used, and how confident you felt in the information afterward. Identify which combination of sources helped you feel most prepared to discuss the topic with others or apply it to real situations.
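If you prefer structured notes, the log described above can be sketched in a few lines of Python. This is a hypothetical illustration only—the study did not prescribe any tracking tool, and the entry fields and sample ratings below are invented for the example:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class LogEntry:
    topic: str        # health topic you researched
    source: str       # e.g. "AI tool", "search engine", "paper", "video"
    helpfulness: int  # your self-rating on the 1-10 scale suggested above

def average_by_source(entries):
    """Return the mean helpfulness rating for each source type."""
    ratings = defaultdict(list)
    for entry in entries:
        ratings[entry.source].append(entry.helpfulness)
    return {source: sum(r) / len(r) for source, r in ratings.items()}

# A month of example entries (illustrative values only)
log = [
    LogEntry("childhood obesity", "AI tool", 8),
    LogEntry("childhood obesity", "paper", 5),
    LogEntry("Planetary Health Diet", "AI tool", 7),
    LogEntry("Planetary Health Diet", "video", 6),
]

print(average_by_source(log))
# e.g. {'AI tool': 7.5, 'paper': 5.0, 'video': 6.0}
```

Reviewing the per-source averages at the end of the month shows which combination of sources left you feeling most prepared—mirroring, informally, how the study compared sources against a rubric.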
This research describes how medical students learned about childhood obesity in an educational setting and does not provide medical advice for treating obesity. If you or a child in your care is struggling with weight or nutrition, consult with a qualified healthcare provider who can assess individual circumstances and provide personalized recommendations. The findings about information sources reflect one study with 359 students and should not be interpreted as definitive guidance for all medical education or patient care. AI tools like ChatGPT can provide helpful information but should not replace professional medical advice or clinical judgment.
This research translation is published by Gram Research, the science division of Gram, an AI-powered nutrition tracking app.
