Meeting on the revision of the Declaration of Taipei, developed by WHO EPI-WIN.

6 December, 2025

[Note from HIFA moderator (Neil): Thank you Richard. This was submitted on 4th but I regret I did not see/approve it until today (6th). Thank you also for your detailed notes. I encourage others to share notes of meetings they attend as and when they are relevant to our remit.]

Live streaming of the first day of the first open expert meeting on the revision of the Declaration of Taipei, 4 December 2025. This conference was developed by WHO EPI-WIN. I don't know whether this is of interest. The second day will be on 5 December.

The Declaration of Taipei requires the ethical use of personal health data and biomaterials and was first forged in 2016. The declaration now requires enhancement in response to developments in artificial intelligence and in medical human rights. In 1998 the government of Iceland made it legal for a private company to construct an electronic database of the country’s health records and for the company to combine and analyse these with genetic and genealogical data, both retroactively and proactively. The Icelandic Medical Association took its concern about this issue to the Nordic Medical Association in 1998, and to Chile in 1999. A WMA statement made in Washington in 2002 covered the processing of personal health data but not biomaterials.

The WMA General Assembly made a declaration on the issue in Fortaleza, Brazil, in 2013, and this was adopted by the WMA in Taipei in 2016. WMA Declaration of Helsinki – Ethical Principles for Medical Research Involving Human Participants – WMA – The World Medical Association: https://www.wma.net/policies-post/wma-declaration-of-helsinki/ The declaration received some attention, but not as much as was expected. The Declaration of Taipei was aligned with the Declaration of Helsinki, reinforcing the ethical need to maintain patients’ dignity and autonomy when collecting, storing, or using donated personal health data or biomaterials. In more recent years it has become apparent that the identity of the donor of almost all “anonymised” data and biomaterials can be extracted from the anonymised materials. Data protection authorities and research ethics committees oversee the “secondary use” of data, and the data can often be used indefinitely and for multiple purposes.

Generally, broad descriptions of research areas such as heart disease, cancer, and infectious diseases are used for obtaining consent at the beginning of data access for research, and a dynamic consent process has so far been unmanageable. Children are sent an invitation to opt out of their data or biomaterials being shared when they are 16, 18, or 21, according to the laws of the country in which they live.

The WMA’s ethical declaration is recognized by, and of use to, commercial organizations. Data gathered for research from social media is not yet regulated. AI and big data have high volume, high velocity, and high variety, together with issues of veracity, introducing a lack of explicability and regulatory challenges which might benefit from the early introduction of ethical guidelines. “Forum shopping” seeks data research in countries with lower levels of legislation, and duplications of research may be defined geographically.

Donations of biomaterials or of personal health data are transfers of ownership, with a variable balance of benefits and profits to the donors, the state, and commercial data users. AI companies and designers are developing intellectual property rights and resisting the regulation of AI during the early years of its development. Commercial companies publish some results of research on donated biomaterials, whole genome sequencing, or personal health data, but do not wish to disclose information that affects their market position, since the market pays for much of their research. Commercial repositories are very large. A “person” owns the data; the results of analysis of the data are owned by the researcher.

Policies about incidental patient findings (return of results) vary, but they affect the partnership of trust between the patient and the collector of personal data or biomaterials. There are 140 genetic diseases whose course can be affected by intervention. The BRCA gene offers a useful and illustrative example of return of results. The discovery of the BRCA gene in a patient predicts a 45% to 80% chance of developing breast cancer. 70% of donors want this test to be done and 30% do not. A dynamic form of consent is required to support patients’ dignity and autonomy. Both an ethics committee and a mechanism for patient autonomy are necessary to manage the return of results.

Governance of the use of the materials or personal data requires patient and public involvement, research ethics committees, researchers, institutions, and custodians. Patients do not own the data or materials. Methods of managing old materials, the closure of a data bank, or the materials of patients who have died are still developing as data and biomaterial research progresses. Storage of biomaterials is expensive. Continued public and patient involvement is necessary for the implementation of research on donated data and materials, and stakeholders have an influence.

A young intern at the WMA headquarters used Google’s NotebookLM to examine the wording, science, soft law, and legal structure of fifty ethical data sharing documents from 2000 onwards. Most of the documents were published after 2015. Ethical problems that were discovered included commercial interests overriding public health goals, commerce reshaping research agendas, reduced public trust, lack of protection for vulnerable groups, and the management of incidental findings (return of results). Problems with return of results depended on the nature of the prompts to return results and on deciding which subjects were most beneficial and relevant for return.

There are different national timelines for the development of national data and biomaterial banks. Patients’ data and biomaterial donation contracts may be thirty pages long and are of little practical use to patients. South East Asian populations are not well represented, and we were told that many of the 110 communities do not trust their governments. We heard that different cultures have different sensitivities, senses of shame, and stigmas about personal health data. A different method of obtaining consent would be beneficial (“Ingredients have to be fresh and they have to be local”).

An ongoing dynamic consent process might be useful for some research projects, and might be better at explaining contractual arrangements, safeguards, and conditions.

A member of Amazon Web Services (AWS) explained that AWS is the leading global digital health service supplier. They have developed data governance with trusted research environments. UK Biobank will hold 300 petabytes of data by 2030. The time taken to analyse and interpret data from data banks is reduced by 99% by AI. The customer that commissions the use of AWS services is responsible for the governance of the personal health data; the AWS service is responsible “for the plumbing”. Between them they balance fairness, explainability, robustness, privacy, security, governance, and transparency.

There are three national AI centres in Taiwan, and five clinical research centres. One centre is for responsible AI in healthcare, one performs external validation of health AI, and one performs clinical AI impact and health technology assessments to decide whether the AI systems improve patient outcomes. Periodic testing of AI solutions is necessary because the data to which the AI is applied changes with time. The health AI commissioners need proof of the utility of AI solutions as much as of their accuracy.

An AI solution which has no positive effect on health outcomes is not suitable for funding by the state. All AI services must be supported by pre-implementation research, and the AI centres are creating federated AI development platforms on which developers and users may share experiences and learn together.

Taiwan utilises AI data sheets, similar to drug and surgical intervention data sheets, to explain and accompany AI solutions. The AI health solutions aim to have pop-out information on each screen to remind users of the nine AI designs and functions. The EU AI Act is the European Union’s landmark law to regulate artificial intelligence and the world’s first comprehensive AI legislation. It creates a legal framework to address AI’s risks while encouraging innovation. All AI settings should cause no harm and respect autonomy.

On Tuesday the Taiwan Legislative Yuan passed a law that regulates the use of National Health Insurance (NHI) data. The law grants individuals the right to opt out of having their records used for non-medical purposes and imposes fines for unauthorized access.

The law, which establishes a framework for the management of NHI data, aims to strengthen privacy protections while enabling continued use of the data to support healthcare quality, public health, social welfare, academic research and government operations.

The Taiwanese NHI database includes the insurance and medical records of approximately 99.9 percent of insured individuals. It may be accessed by government agencies or research institutions, provided that the data analysed is anonymized.

Author: 
Richard Fitton