BMJ news: 'Taking down online scientific misinformation isn’t necessary, as most people don’t believe it, says Royal Society' (2)

25 January, 2022

Note that the Royal Society report Neil mentions is the same one I summarized on HIFA a week ago ("Royal Society issues report on "The online information environment""). Since this seems to have slipped under the radar, I include the main points at the end of this message.

I would in addition like to address Neil’s comment on health literacy. He wrote: “I would caution that although health literacy is important, we should not rely on building personal health literacy as a solution to online scientific misinformation, any more than removal or mitigation. The key issue is about the ability to differentiate reliable information from misinformation.”

This is a puzzling distinction. Surely health literacy is exactly about developing “the ability to differentiate reliable information from misinformation”? Someone with a high level of health literacy will certainly have a better chance of differentiating between the two – although it must be said that even the most health-literate people do occasionally make mistakes in this area.

And as to Dr Okan’s quote (“When we think of #healthliteracy we should think not only of #personalhealthliteracy but also: signposting, standards, regulation, surveillance, policy, #organizationalhealthliteracy and #personalhealthliteracy. A 'comprehensive approach' is needed.”) - which Neil cites with approval (and with which I entirely agree) - doesn't this underline the value of health literacy? I don’t think Dr Okan (or the Royal Society) would agree that “we should not rely on building personal health literacy as a solution to online scientific misinformation”!

What do HIFA readers think?



HIFA profile: Chris Zielinski: As a Visiting Fellow in the Centre for Global Health, Chris leads the Partnerships in Health Information (Phi) programme at the University of Winchester. Formerly an NGO, Phi supports knowledge development and brokers healthcare information exchanges of all kinds. Chris has held senior positions in publishing and knowledge management with WHO in Brazzaville, Geneva, Cairo and New Delhi, with FAO in Rome, ILO in Geneva, and UNIDO in Vienna. Chris also spent three years in London as Chief Executive of the Authors Licensing and Collecting Society. He was the founder of the ExtraMED project (Third World biomedical journals on CD-ROM), and managed the Gates Foundation-supported Health Information Resource Centres project. He served on WHO’s Ethical Review Committee, and was an originator of the African Health Observatory. Chris has been a director of the World Association of Medical Editors, UK Copyright Licensing Agency, Educational Recording Agency, and International Association of Audiovisual Writers and Directors. He has served on the boards of several NGOs and ethics groupings (information and computer ethics and bioethics). UK-based, he is also building houses in Zambia. chris AT

His publications are at and and his blogs are and

Sent to HIFA on 20/1:

The UK Royal Society has just brought out its report on “The online information environment: Understanding how the internet shapes people’s engagement with scientific information”.

Alongside the publication of this report, the Society is launching a blog series of weekly perspective pieces offering personal takes from leading figures on specific aspects of this topic, from potential regulatory approaches, to what the media is doing to combat fake news, to the role of knowledge institutions.

The report is a testimony to the Royal Society’s engagement with a topic we have often discussed on HIFA. It concludes that we need to focus on building resilience against harmful misinformation across the population and on promoting a “healthy” online information environment. As steps towards this ambition, it makes the following specific recommendations:

1. As part of its online harms strategy, the UK Government must combat misinformation which risks societal harm as well as personalised harm, especially when it comes to a healthy environment for scientific communication.

2. Governments and social media platforms should not rely on content removal as a solution to online scientific misinformation.

3. To support the UK’s nascent fact-checking sector, programmes which foster independence and financial sustainability are necessary. To help address complex scientific misinformation content and ‘information deserts’, fact checkers could highlight areas of growing scepticism or dispute, for deeper consideration by organisations with strong records in carrying out evidence reviews, such as the UK’s national academies and learned societies.

4. Ofcom must consider interventions for countering misinformation beyond high-risk, high-reach social media platforms.

5. Online platforms and scientific authorities should consider designing interventions for countering misinformation on private messaging platforms.

6. Social media platforms should establish ways to allow independent researchers access to data in a privacy compliant and secure manner.

7. Focusing solely on the needs of current online platforms risks a repetition of existing problems, as new, underprepared, platforms emerge and gain popularity. To promote standards and guide startups, interested parties need to collaborate to develop examples of best practice for countering misinformation as well as datasets, tools, software libraries, and standardised benchmarks.

8. Governments and online platforms should implement policies that support healthy and sustainable media plurality.

9. The UK Government should invest in lifelong, nationwide, information literacy initiatives.

10. Academic journals and institutions should continue to work together to enable open access publishing of academic research.

11. The frameworks governing electronic legal deposit should be reviewed and reformed to allow better access to archived digital content.

Chris Zielinski

Blogs: and

Research publications: