The role of data protection in the digital economy
Last reviewed: June 2022. This resource is available for download (PDF) in English and French.
Governments, organizations and individuals increasingly generate, collect and process personal data. A strong data protection framework helps foster consumer trust and increased use of digital tools, which in turn can incentivize investment, competition and innovation in the digital economy.
This brief, written in close collaboration with Macmillan Keck, seeks to identify specific attributes of a data protection framework that can help policymakers and regulators build a digital economy that includes — and serves — everyone.
+ 1. Summary
Governments, organizations and individuals increasingly generate, collect and process personal data.
Data protection seeks to balance the benefits and the risks of personal data processing¹ so that individuals have confidence that their data is collected and stored safely and used solely for legitimate purposes.
Data protection laws typically require personal data processing to be lawful, limited, transparent, accurate and secure. They often seek to protect individuals’ privacy² and grant individuals some control over how personal data about them is processed. They also typically establish institutions with powers to conduct investigations and enforce obligations.
A strong data protection framework provides certainty, which may encourage investment, competition and innovation in the digital economy, as well as uptake of digital government and private sector services.
+ 2. Considerations while reading this brief
- Which challenges related to data protection and the digital economy are most prominent in your market, both in general and for marginalized groups such as women and low-income people?
- Do data protection regulations in your country address:
- Digitization: The application of data protection regulation to the digital economy?
- Inclusivity: The specific data protection challenges faced by women, low-income people, and/or other marginalized groups?
- Which entities are responsible for regulation of data protection? Are responsibilities clear, and are mechanisms in place to avoid regulatory arbitrage? If not, how could this be improved?
+ 3. Why we need data protection
Data for development
Digital technologies and data are potential enablers of development in health, education, agriculture, food security, financial services, manufacturing, trade and infrastructure, and the digital economy itself. They can transform public and private services, inform policy decisions, and improve the monitoring of progress and impact.
For example:
- An online e-participation platform in Morocco allows citizens to submit and vote on ideas and provide feedback on proposed legislation to improve public services;
- Digital financial service providers analyze data about potential customers to market digital payment services to them, profile their risk levels for credit, manage identity and detect suspicious transactions;
- Digital identification systems collect and exchange personal data to authenticate people, reducing fraud and barriers to accessing services;
- Data about individuals’ use of financial services worldwide is distilled to produce the Global Findex, which enables countries and other stakeholders to track progress on financial inclusion and develop policy accordingly; and
- Gender-disaggregated data is essential to bridging the financial inclusion gender gap.
While more effective collection, organization, analysis, storage, and transfer of data (the lifecycle that comprises data processing) may improve its productive use, measures should be taken to ensure consumer data protection and privacy. Governments, organizations, and individuals increasingly generate, collect, and depend on data about people. In 2020 alone, 64.2 zettabytes (64.2 trillion gigabytes) of data were created or replicated globally, and this amount is estimated to grow at a compound annual growth rate of 23 percent through 2025. Much of this data is personal data, meaning it relates to or can be used to identify individual persons, referred to as data subjects.
Risk and trust
The generation and processing of vast amounts of personal data involves risks. Personal data can be lost, stolen, disclosed without consent, or misused. This can result in identity theft, unwanted or embarrassing disclosures, loss of important information, or unwelcome marketing or solicitation. Personal data can also be used for government or corporate surveillance, as well as discriminatory treatment of vulnerable individuals and communities.
Individuals may be unaware of how data about them may be used or to which entities such data may be transferred, and their trust should not be taken for granted. As individuals become more aware of risks relating to their personal data, they may avoid or limit using digital services, potentially impeding efforts for economic development and inclusion.
Recent studies show that consumers in both higher- and lower-income countries value protection of their personal data. A majority of low-income customers in Kenya were willing to pay a premium for greater protection of their personal data in digital loan services, and customers in India were likely to decline remittance discounts offered in return for sharing personal data. Similarly, a global survey of more than 5,000 consumers found that one in ten “expected their overall engagement with technology to decrease in the next six months” due to concerns over data breaches and privacy.
Women can have different data privacy concerns and be more privacy conscious as a result of their vulnerability to reputational harm. Recent research suggests women’s concerns parallel the challenges and threats they encounter in their physical lives, such as location tracking and sexual harassment. Relatedly, an important deterrent to women’s use of digital financial services (DFS) is that they must share personal information, such as mobile numbers, with agents who might misuse it. Concerns about their data and security can lead women to curtail their use of different services and self-censor their behavior. Women might also lack knowledge of how to safeguard their personal data and rely on male family members and more educated people for advice on how to protect their photos, social media messages, and other content. Policymakers should take these concerns, which are unique to women, into account when drafting a data protection and privacy framework.
International trends
Data protection is increasingly mandated in national laws and in regional laws and agreements across higher- and lower-income countries. As of April 2020, 66 percent of countries had adopted data protection and privacy legislation. A widely cited example is Europe’s General Data Protection Regulation 2016 (GDPR). Such laws typically seek to balance the benefits and the risks of personal data processing so that individuals have confidence that personal data relating to them are collected and stored safely and used solely for legitimate purposes.
+ 4. Who must comply with data protection
Data protection frameworks designed in the GDPR tradition impose obligations on two principal actors:
Controllers are those persons or entities that determine the purpose of and means for processing personal information. For example, a bank collecting personal information about account holders would be a controller.
Processors are those persons or entities that carry out processing of personal data at the direction of or on behalf of a controller. For example, the entity that operates the software that the bank uses to access and store its records would be the processor.
The European Commission’s examples of controllers and processors provide some further context for this distinction.
It is worth noting that controllers may carry out processing themselves, but processors always act on behalf of a controller.
+ 5. The key elements of data protection
Lawfulness of processing
Data protection frameworks typically require that processing of personal data be carried out lawfully, meaning that the basis for processing is expressly authorized by law.
The consent of the data subject is frequently relied upon as a lawful basis. Consent should be voluntary, freely given, and evidenced by an affirmative action of the data subject, so pre-checked boxes or default settings should not suffice. Consent is also typically construed narrowly. For example, consent given for collection and storage of personal medical records cannot be viewed as consent to generation and receipt of unrelated marketing emails.
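As an illustration, a purpose-specific consent record that insists on an affirmative action could be captured along the following lines. This is a minimal sketch in Python; the names and purpose label are hypothetical, not drawn from any particular framework:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str           # consent is purpose-specific, e.g. "store-medical-records"
    granted_at: datetime   # evidence of when the affirmative action occurred

def capture_consent(subject_id: str, purpose: str, box_ticked_by_user: bool) -> ConsentRecord:
    """Record consent only when the data subject has taken an affirmative action.

    A pre-checked box or default setting never counts: the caller must pass
    the state the user actively set, not the form's initial state.
    """
    if not box_ticked_by_user:
        raise ValueError("no affirmative action by the data subject; consent not obtained")
    return ConsentRecord(subject_id, purpose, datetime.now(timezone.utc))
```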
Consent has some weaknesses as a means of legitimizing data processing. Individuals cannot realistically read all of the disclosures made by controllers, and they may not understand the implications of consenting to personal data processing. People may also give consent because the only alternative is to forego the service, meaning they may not have any real choice. Nevertheless, consent remains the only basis for processing in which the data subject has some control over processing of his or her personal data, and efforts are being made to enable consent to be more meaningful, so it remains an important feature of a data protection framework.
Another commonly used lawful basis for processing personal data is the legitimate interests of the controller or a third party. This is the most flexible of the lawful bases and can apply to virtually any type of processing for any reasonable purpose. However, it requires the controller to weigh these interests against the interests and fundamental rights and freedoms of the data subject using a three-part test (a minimal sketch in code follows the list):
- Purpose: Is there a legitimate interest behind the processing?
- Necessity: If so, is the proposed processing necessary for that purpose?
- Balancing: Is this legitimate interest overridden by the data subject’s interests or fundamental rights and freedoms?
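Because the test is sequential, it can be pictured as a short decision procedure. In the sketch below, each boolean stands in for a human legal judgment rather than anything computable from the data itself:

```python
def legitimate_interest_permits(has_legitimate_purpose: bool,
                                processing_is_necessary: bool,
                                subject_rights_override: bool) -> bool:
    """Apply the three limbs of the legitimate-interest test in order."""
    if not has_legitimate_purpose:
        return False                    # fails the purpose limb
    if not processing_is_necessary:
        return False                    # fails the necessity limb
    return not subject_rights_override  # balancing limb: subject's rights prevail if they override
```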
Other lawful bases on which controllers rely include (among others) when processing is:
- Necessary to perform a contract with the data subject (for example, to provide a product suited to his or her needs);
- Necessary to satisfy a legal obligation of the controller (for example, if a telecommunications company is required to keep records of customers’ services for billing purposes);
- Necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller (for example, the administration of justice); or
- Essential to the life of the data subject or a third party (for example, to inform a doctor of a medical condition in an emergency).
Often processing is lawful because it is expressly authorized in a particular law separate from the data protection framework, such as the collection of personal data by a national ID system under a national ID law.
Data minimization
The theme of data minimization runs through many elements of data protection. It involves limiting the processing of personal data to only that which is necessary in relation to the purposes for which it is processed.
A data protection framework will typically require that any collection of personal data be carried out for a specific and express purpose which must be “lawful” or “legitimate.” This purpose specification requirement limits further processing of data beyond this specified purpose. This limitation guards against “function creep,” where personal data originally collected for one purpose is used for other purposes. For example, a superstore with a pharmacy should not use data about the prescription medicines of customers to market unrelated sporting goods products to those customers.
Once a purpose has been specified, many frameworks require that processing of personal data be limited only to what is necessary to achieve the specified purpose, sometimes referred to as the proportionality principle. However, some frameworks do not go quite this far, requiring only that processing not be “excessive” or merely that it be “relevant” to the specified purpose. For example, an employer might need to retain detailed medical information on employees engaged in hazardous factory work in case of an accident, but might not require such data from its administrative staff in an office situated elsewhere.
Data retention limitations also minimize processing by requiring controllers and processors to retain personal data only for as long as is needed to fulfill the specified purpose. This reduces the risks of data breaches and unauthorized onward sharing that arise from unnecessary storage. For example, the EU requires that banks store customer data for five years and allows member states to extend this to up to 10 years. By contrast, an employment agency should not retain CVs of persons seeking employment for decades, as this is not proportionate to the purpose of finding employment for these persons in the short and medium term.
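In practice, retention limits are often operationalized as a per-purpose schedule checked by a periodic clean-up job. A minimal sketch, assuming hypothetical retention periods set by internal policy rather than taken from any particular law:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-purpose retention schedule (a policy input, not the law itself).
RETENTION_PERIODS = {
    "billing": timedelta(days=5 * 365),   # e.g. a statutory record-keeping period
    "recruitment": timedelta(days=180),   # CVs kept only for an active search
}

def retention_expired(purpose: str, collected_at: datetime) -> bool:
    """True when data held for `purpose` is due for deletion or anonymization."""
    return datetime.now(timezone.utc) - collected_at > RETENTION_PERIODS[purpose]
```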
Some frameworks require that data processing systems incorporate privacy by design or privacy by default. These terms refer to the implementation of administrative and technical measures that apply data minimization principles in the architecture and processes of the data system. For example, a hospital’s patient records can be pseudonymized when stored or otherwise processed in order to reduce risk of disclosure in case of a data breach.
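A minimal sketch of how such pseudonymization might be implemented, using a keyed hash so that the identifier-to-pseudonym mapping depends on a key stored separately from the records (the identifier format and key handling are illustrative):

```python
import hashlib
import hmac

def pseudonymize(patient_id: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256).

    Anyone holding `secret_key` can re-derive the pseudonym from a known
    identifier and so re-link records when needed, but a breach of the
    stored records alone reveals no identities. Because that linkage
    survives, this is pseudonymization, not anonymization.
    """
    return hmac.new(secret_key, patient_id.encode(), hashlib.sha256).hexdigest()

# Store the pseudonym with the clinical record; keep the key elsewhere.
record_key = pseudonymize("patient-12345", secret_key=b"held-in-a-separate-key-store")
```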
Transparency
Data protection frameworks typically require that data processing be fair and transparent, mandating disclosures to data subjects when data about them is collected, regardless of the legal basis for such collection and processing. Some frameworks require such disclosures to be made to the data subject even if the personal data is obtained from a third party or from publicly available sources. These disclosures must typically include the identity of the controller, the purpose of collecting the personal data, any third parties to whom it may be disclosed, and the individual rights available to the data subject. Fair and transparent notification requirements are closely related to consumer protection, as discussed in our brief on Consumer Protection.
Data quality
Data protection frameworks also typically require that controllers actively maintain the quality of the personal data they are processing. This may create an affirmative obligation to ensure that personal data is and remains accurate, complete, and up-to-date.
Direct marketing
Data protection frameworks often incorporate limitations on direct marketing activities targeting data subjects by controllers. Some frameworks prohibit direct marketing activities unless a data subject has expressly opted in, with some exceptions for existing customer relationships. Others only provide that a data subject may object or opt out.
Data security
Data protection frameworks typically require controllers and processors to assess and maintain security in their data systems, including disclosing data breaches to the data protection authority and in some situations to the relevant data subjects.
Cross-border data flows
Efficient and innovative use of data may involve transferring data across country borders. This may be necessary, for example, for provision of cross-border digital services and electronic commerce, operation of international supply chains, customer relations management by international service providers, access to better or lower cost data processing, or pooling of data for better analytics.
On the other hand, it is difficult to monitor and enforce data protection requirements if data leaves the country. Many data protection frameworks therefore impose conditions and restrictions on transfer of data outside the jurisdiction.
Sensitive personal data
Some data is viewed as sensitive personal data, such as personal attributes about an individual’s body and behaviors (biometrics, health status, sexuality), lineage (race, ethnicity), or spiritual beliefs, philosophy and opinions (religion, political beliefs). These data are typically given enhanced protections because they may be embarrassing or uncomfortable to the data subject if disclosed, or risk being used for undesirable profiling or discriminatory treatment adverse to members of a potentially vulnerable group. The enhanced protections typically involve heightened requirements to obtain the data subject’s consent to data processing and tighter restrictions on transfer of such data abroad.
Individual rights
In an increasing number of jurisdictions, the principles of data protection are not merely reflected in obligations of controllers and processors but are implemented in enforceable rights of data subjects. These rights give data subjects a degree of control over how personal data about them is processed and are generally supposed to be exercisable at no or nominal cost. These rights are similar to other rights afforded to consumers generally under consumer protection frameworks.
Individual rights usually include a right to verify whether one’s personal data is being processed by a controller and, relatedly, a right to access and review a copy of that personal data. Individuals may then have a right to rectify any out-of-date, misleading and incomplete personal data they identify. The right to verify, review, and rectify personal data an organization holds about a person would be important, for example, where data is used to assess eligibility for a loan and incorrect data might harm his or her prospects.
Under some frameworks, data subjects are granted a right to deletion of personal data held by a controller. When provided, this right typically can only be exercised when personal data was obtained unlawfully, the controller no longer has a valid basis to retain the data, or retention of the data is no longer necessary (i.e., the right implements the principles on lawful basis of processing and data minimization discussed above). For example, a utility company may require the residential address of a subscriber, but there may no longer be a legal basis to retain that personal data once the subscriber deactivates the service. In that case, the subscriber would be justified in requesting deletion of the personal data.
Some frameworks include a right to data portability: the ability to easily move, copy, or transfer personal data from one controller to another. Portability is intended to reduce the risk that the data subject is locked into a particular service or service provider because the provider has accumulated useful or necessary personal data. It thus lowers barriers to switching to another service provider. For example, if an individual tracks her or his physical activity data using a wearable device linked to an app, the individual should be able to transfer that data to a competing app.
Some countries are introducing data portability requirements in financial services. Many digital credit services make credit decisions based on linked mobile money transaction histories. A right to data portability would allow customers of one mobile money service to use their transaction histories with an unrelated digital credit service. This can be especially relevant for women, who are less likely than men to have physical assets they can use as loan collateral but who can leverage their digital transaction history as an alternative means of proving their creditworthiness.
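In engineering terms, portability implies an export in a structured, commonly used, machine-readable format that another provider can import. A minimal sketch (the schema label is hypothetical; real regimes such as open banking standards define their own formats):

```python
import json

def export_transaction_history(transactions: list[dict]) -> str:
    """Serialize a customer's transaction history in a structured,
    machine-readable form that a competing provider could import."""
    return json.dumps({"schema": "transactions/v1", "records": transactions},
                      indent=2, default=str)
```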
Some personal data processing incorporates computer algorithms that sort and analyze data to make decisions about data subjects. These decisions can be subject to errors and bias resulting from training data that is erroneous, out-of-date or biased; erroneous data about the data subject; or errors or biases in the algorithms themselves. Some decisions that rely on profiling based on factors such as race, ethnicity, or religion would be unlawfully discriminatory if the decision were made by a person. To address these risks, many frameworks provide data subjects with a right not to be subject to decisions based solely on automated processing of personal data that result in legal consequences for the data subject. Examples include automatic refusal of loans submitted via online applications and electronic recruitment practices that are concluded without human intervention.
Data protection frameworks typically provide data subjects with a right to object to processing of personal data. When such an objection is validly made, the controller will typically need to cease any such processing.
+ 6. What data protection covers
Geographic connections
There are practical and legal limits to applying domestic law outside the jurisdiction. Data protection laws typically require a minimum level of connection to the territory. In many countries, the law only applies when controllers and/or processors are established in the territory, processing takes place within the territory, or data subjects are targeted or monitored within the territory. Applicability based solely on the location of the data subject is often considered an overreach.
For example, a foreign data controller operating a website but having no connection to the territory or desire to engage its residents would not want to become subject to local data protection obligations just because a resident happens to browse that website unbeknownst to the data controller. Accordingly, some jurisdictions apply their data protection frameworks to foreign data controllers only when the controller engages in some active targeting of, marketing to, or monitoring of residents of that jurisdiction.
Scope of processing activities
Given the breadth of the concepts of “personal data” and “processing”, data protection could be interpreted to apply to a vast range of human activities. Many frameworks explicitly exempt the processing of personal data for personal, household, family or recreational affairs. Processing of personal data for activities such as organizing amateur sports teams or planning family reunions is not likely to cause harm, and regulating such processing would be a massive intrusion into the private lives of individuals.
Some activities are not subject to data protection law because they offer societal benefits that should be permitted, subject to some protections. Examples are processing data for journalistic, artistic or literary purposes. Government processing of personal data for purposes of national or public security, law enforcement, or other sensitive government functions may also be excluded, though usually with safeguards to mitigate abuse.
Anonymization
Where personal data can be anonymized so that it is difficult or impossible to identify the individual to whom it relates, the rationale for protecting such data greatly diminishes and it may be processed without being subject to data protection requirements. Anonymization requires that all linkages to the data subject be permanently and irrevocably removed. By contrast, data that is merely de-identified, meaning for example that identifying information is replaced with coded information that could be decoded to re-identify the individual, would not qualify as anonymized. However, anonymization is a dynamic field in which the threshold keeps rising as new technologies find new ways to link supposedly anonymized data back to the data subject.
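The distinction can be made concrete with a small sketch (the data is illustrative): de-identification swaps identifiers for codes but keeps the linkage, whereas anonymization must destroy that linkage and guard against re-identification from the remaining attributes:

```python
# De-identification: direct identifiers are swapped for codes, but a key
# table retaining the linkage survives, so re-identification stays possible.
records = [{"name": "A. Mwangi", "diagnosis": "asthma"}]
key_table = {}
for i, record in enumerate(records):
    code = f"subject-{i:04d}"
    key_table[code] = record.pop("name")  # the linkage lives on here
    record["code"] = code

# Anonymization would additionally require destroying key_table *and*
# ensuring that the remaining attributes cannot be combined to single
# anyone out - a bar that rises as re-identification techniques improve.
del key_table
```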
+ 7. Institutions that support data protection
Data protection authorities
Data protection frameworks typically designate an agency to serve as a data protection authority or in a similar capacity. Many frameworks require that a data protection authority be independent to prevent capture by political or commercial influences. This is all the more important as public bodies collect, use, and record extensive personal data about the population when providing public services to them.
Such authorities’ functions and powers vary in different jurisdictions. They generally include monitoring compliance, receiving complaints and conducting investigations, serving enforcement notices, imposing administrative fines, issuing or advising on the issuance of regulations, engaging in public outreach efforts, and advising legislators and policy makers on data protection issues. An authority is typically funded through a combination of allocations made by the legislature and proceeds from fees or fines.
Some frameworks require controllers, and even processors, to register with a data protection authority to strengthen the authority’s information about data activities and enable it to charge fees. To avoid administrative burdens, usually this registration requirement only applies when certain thresholds are met or processing involves particularly sensitive matters.
Data protection frameworks typically allow for appeals or judicial review of adverse decisions of a data protection authority. Sometimes appeals are made to an ad hoc appeals body as an intermediate step; at other times, they go directly to a court.
Penalties and remedies
The effectiveness of the obligations and protections in a data protection framework depends on a credible threat of consequences for violations. Many frameworks empower a data protection authority to impose administrative fines on controllers and processors for violations. The amounts of fines are often limited by a monetary cap or a percentage of an entity’s annual turnover (either domestic or worldwide) or both. Some frameworks permit individuals to bring direct civil claims in domestic courts against controllers and processors for damages resulting from violations. Some frameworks provide for criminal penalties consisting of fines and/or imprisonment, applicable to individuals such as directors, officers, and managers of legal entities. Criminal penalties are more common in jurisdictions that have weaker confidence in the effectiveness of administrative fines and civil claims.
Civil society, education and culture
In addition to a legal framework, civil society engagement, education and professionalization are often vital to effect change in understanding and conduct across public bodies and commercial and non-profit organizations.
For example, the Nubian Rights Forum in Kenya has pressed for data protection laws to protect data used in proposed digital identification systems. Some jurisdictions, such as the EU, have created a specific professional category of data protection officers (DPOs) that organizations of a particular scale or nature are required to engage. DPOs have certain responsibilities, perform various functions, and are expected to be suitably trained and to register with data protection authorities. These and similar requirements promote the development of a community of knowledgeable professionals with shared understanding and approaches, integrated into public and private institutions.
+ 8. How data protection supports the digital economy
Growth in the digital economy
Increased trust in the digital realm is vital to uptake and usage of digital services. Providers also need certainty about the rules of the game. The implementation of a data protection framework may be a valuable precondition for investment in data-intensive businesses. For example, immediately after Kenya’s 2019 Data Protection Act was signed into law, Amazon Web Services announced new investments in the country, including establishing part of its data cloud infrastructure in Nairobi. The company reportedly characterized the new law as paving the way for the investment, noting that it had been awaiting such a law for seven years.
Confidence in government services
A data protection framework can also increase confidence that government uses of personal data will not result in unwarranted surveillance, profiling, or other discrimination. For example, national ID systems implemented in some developing countries have been heavily criticized when they were implemented without a robust data protection framework. In India, Jamaica, and Kenya, recent court decisions even invalidated or limited the adoption of national ID systems, largely because of insufficient protection of personal data. Concerns about data protection also underlay the limited take-up of contact-tracing applications during the Covid-19 pandemic. For example, a recent study showed that individuals in countries where people tend to distrust their governments have been more hesitant to download and use contact-tracing apps.
+ 9. Emerging issues
Artificial intelligence (AI) refers to computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.
Machine learning refers to the ability of such systems to progressively improve their own performance by analyzing large volumes of data, rather than through human programming. These technologies present opportunities for development of the digital economy, such as through improved credit scoring that supports financial inclusion, or better fraud detection. However, these systems use vast amounts of data, and it may be difficult or impossible to know what data is being processed, how it is being processed, or how decisions are generated about individuals. These systems thus present challenges for many of the key protections of data protection frameworks.
For example, Amazon found that its AI-based automated hiring software was unintentionally favoring male candidates. The software was “trained” on resumes of past applicants, which were predominantly male, leading it to penalize female candidates.³⁸ Potential solutions to these sorts of issues include addressing the type of training data used and ensuring there is a “human in the loop” when decisions are made. See Center for Information Policy Leadership, Artificial Intelligence and Data Protection in Tension, 2018; Center for Democracy & Technology, AI & Machine Learning, 2020.
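One common “human in the loop” pattern is to let favorable automated outcomes stand while routing any potentially adverse outcome to a human reviewer. A minimal sketch (the threshold and labels are illustrative, not drawn from any particular system):

```python
def decide_application(model_score: float, approve_threshold: float = 0.8) -> str:
    """Never issue an adverse outcome on the model's output alone.

    Favorable automated outcomes may stand; anything else is routed to a
    human reviewer who can weigh context the model cannot see.
    """
    if model_score >= approve_threshold:
        return "approve"
    return "refer_to_human_review"
```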
Facial recognition technology (FRT) refers to computer systems that can process images of human faces to identify, authenticate, or categorize an individual. While presenting opportunities for digital identification and verification that may support development of the digital economy, FRT raises many data protection challenges.
First, cameras have become increasingly ubiquitous through government surveillance, private-sector security systems, and consumer products such as smart doorbells and smartphones. Individuals expose their faces whenever they are in public, opening themselves up to surveillance and processing of their personal information, often without their consent.
Second, by its nature, FRT can discern sensitive personal information of individuals, including gender, race, ethnicity, and health status (as well as data that is not sensitive, such as location).
Finally, FRT is not always accurate, particularly when identifying faces of certain population groups, potentially resulting in misidentification that can have legal consequences for individuals.
For example, a recent study showed that facial recognition technologies identified lighter-skinned men with almost no error but had an error rate of nearly 35 percent when identifying darker-skinned female faces. See National Conference of State Legislatures, Facial Recognition Gaining Measured Acceptance, 2020; Roussi, Antoaneta, Resisting the rise of facial recognition, 2020; How should we regulate facial-recognition technology?, Nature, 2019; Wiewiórowski, Wojciech, Facial recognition: A solution in search of a problem?, European Data Protection Supervisor, 2019.
+ 10. Additional resources
Data protection models
Organisations
- European Commission (data protection resources page)
For further reading
- Convention 108+ for the Protection of Individuals with regard to the Processing of Personal Data, 2018
+ 11. References
For the full list of references, please download the PDF of the brief in English or in French.