August 6, 2018 at 05:07AM
CANBERRA — In Australia, a three-month period has begun for opting out of a government-run electronic health record scheme, allowing all adults to decide whether they want data on their own and their children’s engagement with medical services to be collected and stored.
The benefit, according to the government, is to allow health records to travel between practitioners for better treatment. But despite its claim that the systems built are “military grade,” many Australians are not having it — the first day of the opt-out period saw online and phone systems struggle as Australians chose not to risk potential breaches of privacy.
Consent to collect, maintain, and use personal data is an important part of this debate, and Australians are not the only ones up in arms about it. The concern led the European Union to introduce new data protection legislation, the General Data Protection Regulation, in May, of which affirmative consent is a key part — “silence is no consent,” the guidance reads.
The Data Guardians series
The collection of data on digital platforms has become ever more central to aid work, as organizations strive to ensure their interventions are as efficient, effective, and targeted as possible. But while data can be a transformative tool, it also comes with risks. As scandals over data protection push these concerns further up the agenda, Devex’s Data Guardians series explores the issues affecting aid organizations as they work to protect their beneficiaries’ data, and the debates and practicalities around what more can be done.
But among the most vulnerable populations in the world, the idea of consent can be almost nonexistent when it comes to data collection. With a growing need for humanitarian and development support for record numbers of displaced persons, the decision is being made for this vulnerable population that data and digital systems are the best means to monitor and support them.
“Consent plays no role because [currently] there is no meaningful choice not to participate,” Linnet Taylor, assistant professor in data ethics, law and policy at Tilburg University, explained. “Consent is not consent unless it is freely given. By definition, if people’s access to essential services and to human security is dependent on allowing collection of their data, they are not giving consent.”
While some aid experts say the delivery of better and more efficient services needs to be prioritized above data rights, especially in humanitarian contexts, others argue the issue is creating a growing human rights divide between citizens of higher- and lower-income nations.
Data and digital in the humanitarian context
A rising number of projects are using digital data collection to better support displaced persons and refugees. In Bangladesh, for example, the United Nations High Commissioner for Refugees is collecting digital data on families, including their physical location and photographs, to improve the delivery of services. Late last year, an estimated 105,000 households were surveyed, covering 525,000 people, with UNHCR describing the data as “vital” for distributing assistance.
“If you don’t know who people are, their family, how many they are, and where they are, it is not possible to [provide services] in an organized, efficient way,” Laura Giammarinaro, a senior UNHCR registration officer, explained.
Alongside the World Food Programme, UNHCR has also been experimenting with the use of biometric scans to make food payments in Jordan. It is collaborating more widely on data collection and digital solutions, with the ID2020 alliance providing a coordinated effort between government, NGOs, and the private sector to create a legal digital identity for individuals. This would in turn support their access to services, education, employment, and land.
But the information collected on displaced persons — including ethnicity, religion, sex, and other demographic information — can also put them at risk, making “affirmative consent” a key concept.
With ID2020, the data is in the hands of the user, who decides who can gain access. But the data is still stored and maintained in an environment that could never guarantee total security, according to experts. Coupled with the fact that a large number of organizations and jurisdictions will potentially be accessing the data to provide services, the risk of breaches grows.
Globally, the value of data is well understood — by private companies, government, and hackers who can collect and sell personal information. Health data, in particular, is increasingly being targeted.
According to research by Caribou Digital, refugees are aware that their data is unlikely to be secure. Recent concerns surrounding the RedRose digital payment system are one such example in the aid space.
For many, there is often no choice over whether they do or don’t supply personal information, and the dangers of data breaches for this vulnerable group can be of the highest consequence.
The debate surrounding consent
The debate on consent to collect and use personal data is not new — it has run for decades, and the concerns raised 20 years ago are not significantly different from those voiced today.
At a 2001 workshop on research ethics in complex humanitarian emergencies, participants were asked to discuss what “kind of consent is necessary and feasible for different types of emergencies, at different stages and for different types of data collection or research.”
Richard Black, head of the college of social sciences at the University of Birmingham, was at the meeting and argued that the complexity of gaining consent in these scenarios should not mean it is overlooked.
Today, he is arguing the same point. “It is still an issue,” he told Devex. “Since 2002, capacity to collect data has increased dramatically and now encompasses biometric data as well as remote data, collected [for example by] using drones. This data is not always collected with consent.”
“There is no in-principle reason why refugees and displaced persons should be treated differently in matters of consent than any other human beings.”
But Black believes that, to date, the discussion has been sidelined as being too hard to effectively respond to in the humanitarian context.
“Stressing ‘exceptionalism’ cannot be used to justify poor practice,” he added.
The issue was highlighted earlier this year at the Data for Development Festival in Bristol, United Kingdom, where participants debated the pros and cons of data privacy in the humanitarian and development context.
Some participants suggested that privacy concerns are obstructing the sector’s ability to save lives. For example, Sudha Balakrishnan, a health specialist with UNICEF working on issues such as HIV, argued that the most vulnerable can fall through the cracks of traditional data collection methods, and that by not being able to gain access to personal and unit-level data on those at risk, lives are being lost. Natalia Baal, coordinator of the Joint IDP Profiling Service, agreed, arguing that fears of data falling into the wrong hands are sometimes used as an excuse not to collect it.
But others pointed out that failing to heed privacy concerns can itself put lives at risk. Charlie Harrison, technical lead for big data and social good with GSMA, argued that the aid sector — which deliberately collects data on people who are vulnerable and marginalized — needs to take extra care. He cited an example of a charity in the U.K. that collated data on London’s homeless population. That data was later accessed by the government through freedom of information laws and used to target people for deportation.
And the chair of Sierra Leone’s open data council, Yeama Thompson, emphasized the importance of giving people in lower-income countries choice about how their information is collected and used.
“Especially from our experience in Africa and Sierra Leone … empowering people to have control over their possible data, I see that as very, very key,” she said, adding that it is an important step in providing data education, as well as enabling people to determine their future.
Reducing the risks
There are ways of reducing the risk to individuals. Data collection can be minimized to what is critical, for example. For Black, a range of data currently being collected can rarely be justified.
“For example, is it really necessary to retain biometric data on recipients of humanitarian assistance?” he asked. “This is sometimes justified on the grounds that otherwise people will claim more than once. A measure of reasonableness would be whether governments require the same data to be collected when distributing social services.”
Harrison suggested that working with aggregated data can generate insights for policies and programs without the need to keep personal data. In the context of a dataset on vulnerable youths living with HIV, for example, this would mean insights to support the response could be presented across time, geography, or demography, rather than maintaining a database of personal information that could itself put lives at risk.
But it is critical that data collection is combined with clear and informed consent, with the risks and vulnerabilities explained, according to Brandie Nonnecke, research and development manager at the CITRIS and the Banatao Institute.
“If a practitioner is establishing a biometric-based digital ID, especially a digital ID that is tied to other data points such as age, health information, and address, it is imperative to clearly define what data are being collected,” she said. This includes how data will be stored and used, the risk of breaches, and what cyber security safeguards are in place.
“This information should be presented to individuals in their local language in a way that is understandable for all education levels,” Nonnecke said, with that information informing their decision on whether or not to enroll.
At issue in this debate is the right to choose. For Taylor, the balance of providing services versus the need to protect privacy has today rendered this right nonexistent in the humanitarian space.
“Currently, there is no balance,” she said. “It is assumed that refugees and displaced persons consent to all uses of their data. They are being used as a test case for just about everything, from biometric databases and new forms of tracking to the use of blockchain for processing population records.”
For Nonnecke, the notion that access to services requires the sharing of personal data needs to be removed, with the priority placed on the “safety and well-being of displaced persons and refugees above the increased efficiency gains they would receive from use of digital IDs.”
“It is imperative that goods and services not be withheld if an individual does not want to enroll in a digital ID program,” she said. “It should be clearly communicated that enrollment is voluntary.”
As this would lead to increased complexity in responses, however, Taylor worried it may be too late to reverse the trend.