Oxford professor calls for European ethical codes on patient data

Floridi: ‘The patient has to be informed and willing to share the information that researchers are collecting.’ Photograph: David Sillitoe for the Guardian

Prof Luciano Floridi proposes codes for the reuse of medical data and data donation

The principle of patient confidentiality has centuries-old cultural roots.

Hippocrates, the father of medical ethics, counselled physicians in 400BC not to divulge what they “see or hear, in the life of men, which ought not to be spoken of abroad … as reckoning that all such should be kept secret”.

But should a patient’s data be any different to their blood, bone marrow or organs – material that can be donated freely for the benefit of others or the advance of science?

Taken in isolation, one patient’s medical record distils their life to numbers, symptoms, diagnoses and treatments.

But combine that data with other people’s records and another opportunity emerges. There’s a chance to see patterns, correlations and exceptions – a seed of innovation in our understanding of diseases and, potentially, how to treat them.

Prof Luciano Floridi, director of research at Oxford University’s Internet Institute, believes the time has come for new European ethical codes to govern “data donation” and its use for medical research.

He says debate in Europe over individual privacy versus societal benefits of shared data has been “swinging like a pendulum between two extremes”. Medical research with big data should be part of the future of Europe, according to Floridi, “not something we need to export to other countries because it is not do-able here”.

“The patient has to be informed and willing to share the information that researchers are collecting – for the benefit of the patient and anyone else affected by the same problems,” said Floridi, who is also chair of the Ethics Advisory Board of the European Medical Information Framework, the largest EU project on the unification of biomedical databases.

“This consent is foundational and shouldn’t be underestimated in any way.

“At the same time, we are collecting immense quantities of data, of huge variety, and at an amazing speed. Sometimes we do not know what these data will be useful for.

“This is the so-called re-purposable nature of data. You collect data about Alzheimer’s, for example, and suddenly find that it is useful for other diseases. This tension between what you can give consent about is the current debate at the European level.”

Floridi, who has advised Google on the ethics of information and the right to be forgotten, proposes the creation of two new ethical codes. The first would govern the use and re-use of biomedical data in Europe – an ethical code from the practitioners’ perspective. The second would relate to “data donation” and the informed choice of an individual to share personal information for research.

“It is incredible that these days you can donate blood or your organs – it does not get any more personal – but you cannot donate your data as easily,” Floridi says.

However, some have already made the choice between privacy and openness.

Ten thousand people offered to share their medical data within two weeks of the launch of the UK Personal Genome Project (PGP-UK) in November 2013.

PGP-UK was the first UK research project to work on the basis of “open consent” – meaning that medical information attached to a person’s record would be freely available online.

So far, 10 genomes have been received, a further 50 genomes are now being collected, and 50 more are at the final administrative phase before participation.

But the parent PGP project – founded in the US in 2005 by George Church, professor of genetics at Harvard Medical School – remains a point of debate over open data versus individual privacy.

One US academic used the example of the PGP to argue earlier this year that informed consent by participants needed to be matched by usage agreements by anyone accessing the data.

Prof Sharona Hoffman, co-director of the Case Western Reserve’s Law-Medicine Center, proposed that the agreements would state how “the data will be used, who will see the data, and how it will be discarded after use as well as to commit to safeguarding data security and refraining from attempts to re-identify data.”

Has open genomic data been misused in practice?

Pre-emptive legislation passed by the US Congress in 2008, the Genetic Information Nondiscrimination Act, outlawed the use of genetic information by health insurers and employers to discriminate against individuals in the US.

“In the 10 years the PGP project has been running, there is no case that I can recall where open access genetic information has been misused,” said Stephan Beck, director of PGP-UK and professor of medical genomics at the University College London Cancer Institute.

“There are, of course, risks attached to it being open.

“But this information is no different to people putting a picture of themselves on a website which shows that they might be overweight or smoking. That might be more telling about your health than the genome data might tell you.

“There is a lot of personal information that is openly disclosed which we are not fearful of disclosing. But we seem to be extremely fearful of disclosing any genetic information.”

Beck argues that managed access data, which requires researchers to apply for permission to access a database, has two common problems.

First, it takes time to secure permission; second, managed access databases are often not curated as rigorously as open data. “It’s reached the point where a lot of data is being generated – useful, really exciting data – that is lost to research or, at least, extremely difficult to access,” said Beck. “Many people in the field think the balance between open and managed access needs to be looked at again.”

PGP participants are enrolled after passing an entrance “exam” which tests their understanding of privacy issues. Once accepted, participants have the right to delete their profile at any time and report quarterly on any negative consequences.

Annually, the project is reviewed by the UCL Research Ethics Committee in the UK and its counterpart Institutional Review Board in the US. But the central dilemma in any attempt to trade off risk and reward is the calculation: is it worth it?

Is it a likelihood or a leap of faith that big data for medical research will yield innovative diagnosis or treatments in the future? Similarly, how much of an individual’s privacy is in jeopardy by sharing medical data?


(This story first appeared on The Guardian)