ChatGPT (7) Does ChatGPT provide reliable healthcare information? (3)

5 April, 2025

Following discussions on ChatGPT, and my own questions about how the AI tool avoids misinformation, I decided to put the question to ChatGPT itself. Here is its response (using the free version):

Prompt:

How does ChatGPT avoid misinformation, especially in the field of health?

ChatGPT said:

"ChatGPT avoids misinformation, particularly in sensitive fields like health, through several key strategies:

1. Training on Trusted Sources:

ChatGPT is trained on a large and diverse set of texts, including reputable sources such as books, peer-reviewed journals, medical textbooks, and other reliable databases. These sources are typically curated to ensure quality and accuracy, though ChatGPT doesn't directly access or browse the web for real-time information unless specifically enabled (e.g., for up-to-date medical research).

2. Content Filters and Guidelines:

The model is guided by content safety rules and ethical guidelines that prioritize accuracy and responsibility, especially when it comes to health information. It is designed to avoid providing specific medical advice, diagnoses, or treatment plans unless it's very general information based on established knowledge.

3. Generalization and Caution:

For complex or controversial health topics, ChatGPT generally provides information in a way that highlights uncertainty when needed. It tends to avoid presenting unverified claims as facts and offers balanced views based on the knowledge available at the time of training.

4. Focus on Educational Content:

When discussing health topics, ChatGPT aims to provide information for educational purposes rather than making specific recommendations. It can explain general health concepts, the importance of evidence-based practices, and direct users to professionals for more tailored advice.

5. Regular Updates and Improvements:

OpenAI works to update and improve the model to reduce biases and improve its responses, incorporating feedback and lessons learned from past conversations. If new tools or updated data sources are available, ChatGPT can access those to provide more accurate information (for example, accessing the web for real-time data in some plans).

6. Transparency and Limitations:

ChatGPT clearly indicates when it does not have access to real-time data and when it cannot give definitive advice on medical conditions. It encourages users to consult qualified health professionals for personal medical concerns.

7. Verification and Source Reliability:

While ChatGPT can provide information based on what it has been trained on, it does not independently verify or access the latest research or expert opinions in real-time. Users are encouraged to cross-check critical information with authoritative, up-to-date sources like peer-reviewed journals, official health organizations (like the WHO, CDC), and certified medical professionals.

In summary, while ChatGPT aims to provide accurate and responsible information, it always emphasizes the importance of consulting a medical professional for personal advice or diagnoses, especially in complex or serious health matters."

Best wishes

Julie

HIFA profile: Julie N Reza is a UK-based specialist in communications for biosciences, global health & international development (www.globalbiomedia.co.uk). She predominantly works with NGOs and not-for-profit organisations. Previously she was the senior science editor at TDR, based at the World Health Organization (WHO) in Geneva; prior to this she worked at the Wellcome Trust, UK, leading educational projects on international health topics including trypanosomiasis and trachoma. She has a PhD in immunology and a specialist degree in science communication. She also has several years' research and postgraduate teaching experience. She is a member of the HIFA Steering Group and HIFA Social Media Working Group. https://www.hifa.org/support/members/julie-n She is coordinator of HIFA's LinkedIn channel: https://www.linkedin.com/company/healthcare-information-for-all-hifa/pos... Email: naimareza AT hotmail.com