Health data sharing: scandals and solutions to improve health research data practices


29 Oct 2020

Living with Data’s Itzelle Medina Perea recently presented her work at the Association of Internet Researchers (AoIR) 2020 virtual conference. Itzelle’s research reveals the complex “data frictions” experienced by UK university-based researchers as they navigate access to NHS patient data to conduct research on key health issues, including antibiotic prescribing, dementia prevention, and cancer treatment.

Her paper presented at AoIR 2020 explores perceptions of controversial uses of patient data and the impact this has had on university-led health and medical research. It also examines the role that patients and the public play in academic research projects that involve the use of patient data.

You can read more about the research in the article written by Itzelle below. You can also watch her 3-minute lightning talk presented at AoIR.

Health data sharing: scandals and solutions to improve health research data practices

In the UK healthcare sector, patient data is increasingly used in new ways beyond medical care. It is now also used outside the healthcare sector, for policy development and for research in academic institutions. This type of data has also been used for commercial purposes and even for immigration enforcement. These expanded uses raise significant privacy and power issues.

A number of data-sharing initiatives and projects between the UK National Health Service (NHS) and external parties have proved particularly controversial in recent years. These include:

  • 2014 – The Care.data initiative (since cancelled), which proposed to share patient data collected in primary care with organisations outside the healthcare sector without informing patients what data would be shared and with whom.
  • 2015 – Health records of NHS patients held by the Royal Free London Trust were shared, without patients’ explicit consent, with DeepMind, an artificial intelligence company owned by Google.
  • 2015 – Personal data of NHS patients was shared with the Home Office to trace individuals flagged as ‘potential immigration offenders’.
  • 2019 – International pharmaceutical companies such as Bristol Myers Squibb and Eli Lilly obtained access to NHS patient data.

Partnerships between the NHS and external parties sometimes allow patient data to flow from the healthcare sector to technology and pharmaceutical companies, and there are indications that some of these arrangements have not taken patient privacy and agency into account. The opacity of the agreements, and shortcomings in communicating to patients how their data is being used, deepen these concerns; some arrangements have only come to light through journalistic investigations.

Illustration of data journeys in health research data practices  © Itzelle Medina-Perea

The research aims

In the context of increasing concern about privacy and power around health data, my research is an in-depth qualitative study of health “data journeys”. I am examining how data ‘flows’ from the NHS to various academic institutions to be reused for research purposes: what enables these data flows, and what ‘frictions’ hinder or block data flowing from the NHS.

Using the ‘data journeys’ approach, I am able to illuminate the socio-material factors that shape the movement of data between different sites of data practice. By this I mean: 1) data-sharing infrastructures and management; 2) socio-cultural factors; and 3) regulatory frameworks. Data travels, for example, from hospitals and general practices, through organisations such as NHS Digital and the Clinical Practice Research Datalink, to sites of data reuse such as universities.

I analyse how and why these socio-material factors shape patient data flows within the healthcare sector, and what this means for patient data and its changing relation to other social actors. These insights will be used to develop recommendations for “just” data-sharing practices, making patient data flows within the health sector clearer and more transparent to the public.

The research method

I have undertaken document analysis of policy, legislation and other relevant documents. I have also conducted 23 interviews with key stakeholders at different sites of data practice.

Headline findings:

  • University-based researchers consider that the NHS patient data controversies have negatively affected the flow of patient data to academic institutions. They also believe that public and patient concerns about the use of patient data outside the healthcare sector have increased since the Care.data scandal became public knowledge.
  • Following the Care.data controversy, around 1.5 million people opted out of their data being used outside of direct care, and these opt-outs persisted even after Care.data ceased. Interviewees stated that the opt-outs still affect health data flows, which they perceived as ‘tragic’:
    all the public confidence was lost and the system just frozen
    the data flow to research just stopped, data was not going anywhere [they] could not analyse anything, all fell to pieces
  • Researchers felt they have had to gradually rebuild public trust in the use of patient data for purposes beyond direct care; however, the succession of controversial NHS data-sharing initiatives has undermined these efforts.

What’s the solution?

University-based researchers believe that transparency is key. They stated that all initiatives using patient data for secondary purposes should be clearer about their intentions, recognising that people are concerned when they do not know what is happening with their healthcare data.

It is important for university-based researchers and data practitioners to inform people about all the work they do with data: where it comes from, what happens to it, what the research aims to do with it, where it is stored, and what safeguards are in place.

They have adopted strategies to engage patients and citizens and to build public trust and confidence. For example, one team participating in the study has set up citizens’ juries to discuss the acceptability of intended health data uses, while other teams have integrated patients or members of the public into their steering committees.

  • Involving the public is key to legitimacy.
  • Reassuring patients their data is safe and being managed responsibly is an obligation.

Conclusion

University-based researchers and data practitioners believe that privacy concerns among patients and the general public have had a damaging effect on legitimate academic research. In response, researchers have adopted strategies such as citizens’ juries to listen to people and patient groups and give them a voice, hoping this will restore confidence. However, these mechanisms do not guarantee a positive effect: they can be tokenistic and unrepresentative of all whose data is at stake, so constant monitoring and evaluation of these practices is also necessary.

It could also be argued that while involving the public and patients in projects could contribute to more ‘just’ uses of patient data, further action is needed. For example, it is essential to guarantee responsible data governance, to avoid uses of patient data that could exacerbate existing health inequalities or enable the systematic surveillance and privacy invasion of vulnerable groups (Dencik et al., 2019). Stronger legal protections, reviewed and updated to reflect changing practices, may also be required.

References

Dencik, L., Hintz, A., & Cable, J. (2016). Towards data justice? The ambiguity of anti-surveillance resistance in political activism. Big Data & Society, 3(2), 2053951716679678. https://doi.org/10.1177/2053951716679678

Dencik, L., Hintz, A., Redden, J., & Treré, E. (2019). Exploring Data Justice: Conceptions, Applications and Directions. Information, Communication & Society, 22(7), 873–881. https://doi.org/10.1080/1369118X.2019.1606268

Harwich, E., & Laycock, K. (2018). Thinking on its own: AI in the NHS. Reform. http://www.reform.uk/publication/thinking-on-its-own-ai-in-the-nhs/

Health Information and Quality Authority. (2012). International Review of Secondary Use of Personal Health Information. Health Information and Quality Authority. https://www.hiqa.ie/system/files/Review-Secondary-Use-Health-Info.pdf

Iacobucci, G. (2018). NHS must stop sharing confidential patient data with Home Office, say MPs. BMJ, 360, k512. https://doi.org/10.1136/bmj.k512

Kitchin, R. (2014). The data revolution: Big data, open data, data infrastructures and their consequences. SAGE.

NHS Choices. (2016, June 22). Your personal information. NHS Choices: Your Health Your Choices. https://www.nhs.uk/NHSEngland/thenhs/records/healthrecords/Pages/sharing-your-records.aspx

NHS Digital. (n.d.). What we collect. NHS Digital. http://content.digital.nhs.uk/article/4963/%20What%20we%20collect

NHS Digital. (2018). Why information is needed. http://content.digital.nhs.uk/article/3385/Why-information-is-needed

Powles, J., & Hodson, H. (2017). Google DeepMind and healthcare in an age of algorithms. Health and Technology, 7(4), 351–367. https://doi.org/10.1007/s12553-017-0179-1

Taylor, L. (2017). What is data justice? The case for connecting digital rights and freedoms globally. Big Data & Society, 4(2), 2053951717736335. https://doi.org/10.1177/2053951717736335