How to use responsibly
The data generated by our smart devices are essentially a private good held
by Big Tech companies that dominate social media, online sales, and search
tools. Given how valuable these data are, it is not surprising that
companies tend to keep them to themselves (Jones and Tonetti 2020). As more
data beget better analysis, which in turn attracts more usage, more data,
and more profits, these swollen data war chests fortify their platform
networks and potentially stifle competition.
This finders-keepers model tends to lead to too much data being collected, yet the data are also underutilized exactly when they could be most helpful, kept in private silos while public needs remain unmet. Data sharing can support the development of new technologies,
including in the life sciences. Consider how epidemiological research can
benefit from scaling up big data analytics. A single researcher analyzing the experience of patients in their home country may be a good start, but such an effort cannot rival that of many researchers working together and drawing on the experience of far more patients from around the world, a model that has been key to the success of a number of cross-border collaborations.
How can data be made more of a public good? Commercial interests and
incentives for innovation must be balanced with the need to build public
trust through protection of privacy and integrity. Clarifying the rules of
the data economy is a good place to start. Significant advances have
resulted, for example, from the 2018 implementation in Europe of the
General Data Protection Regulation (GDPR), which clarified a number of
rights and obligations governing the data economy. EU residents now have
the right to access their data and to limit how they are processed, and these
rights are being enforced with increasingly heavy fines. But even as
researchers have started to see the impact of the GDPR on the digital
economy, there are still concerns about how to operationalize these rights
and keep them from being simply a box-checking exercise.
People should have more agency over their individual data. There may be a case for creating public data utilities—perhaps as an
outgrowth of credit registries—that could balance public needs with
individual rights. Imagine an independent agency tasked with collecting and
anonymizing certain classes of individual data, which could then be made
available for analysis, subject to the consent of interested parties. Uses
could include contact tracing to fight pandemics, better macroeconomic
forecasting, and combating money laundering and terrorism financing.
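To make the anonymization step concrete, here is a minimal sketch of one common building block such a utility might use: pseudonymization, in which direct identifiers are replaced with salted one-way hashes before records are released for analysis. All field names and values are invented for illustration; this is one possible design, not a prescription.

```python
import hashlib
import secrets

# The salt is generated once and held privately by the utility;
# without it, released tokens cannot be linked back to identifiers.
SALT = secrets.token_bytes(16)

def pseudonymize(identifier: str) -> str:
    """Map a direct identifier to a stable, non-reversible token."""
    return hashlib.sha256(SALT + identifier.encode("utf-8")).hexdigest()

# A hypothetical raw record and its released, pseudonymized counterpart.
record = {"national_id": "123-45-6789", "age_band": "30-39", "region": "North"}
released = {**record, "national_id": pseudonymize(record["national_id"])}
```

Pseudonymization alone is not full anonymization: combinations of remaining attributes (age band, region, and the like) can still re-identify individuals, which is one reason consent and independent governance of such a utility would matter as much as the technical design.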
Policies can also help keep consumers from becoming captive to individual ecosystems, thereby contributing to market contestability and competition. The
European Union’s late-2020 proposals for the Digital Markets Act and the Digital Services Act include many new features: requirements, in certain situations, for Big Tech “gatekeepers” such as social media platforms and online marketplaces to offer third-party interoperability, and measures that make it easier for their customers to port their data to different platforms.
Policies also have a role to play in keeping data secure from cyberattacks.
An individual company does not fully internalize the harm a breach of its customers’ data does to public trust in the entire system, and may thus invest less in cybersecurity than the public interest would warrant.
This concern has special resonance in the financial system, where
maintaining public confidence is crucial. This is why secure
infrastructure, cybersecurity standards, and regulation are essential
pillars of the open banking policies many countries have adopted to
facilitate the interoperability of sensitive financial data.