April 25, 2021

Privacy as a public good

Last week's post discussed the privacy paradox: the fact that many people voice concerns about the lack of online privacy yet widely share personal data on the internet and on connected devices. This raises the question of why we fail to take action to protect our privacy even though we claim to be worried about its erosion.

It has been proposed that internet users are naive and do not understand how their data are used. This was perhaps true in the first years of social media, but awareness of how tech companies use data has greatly improved since then. Ignorance can therefore no longer be invoked to explain the privacy paradox.

A more useful idea has been proposed in a recent paper by Acemoglu, Makhdoumi, Malekian and Ozdaglar.[1] The idea is that privacy is a public good: when one person protects their private data, it protects other people's privacy as well. And like other public goods, privacy tends to be underprovided: each of us bears the full cost of protecting our data while most of the benefit accrues to others, so it is individually rational to share even when sharing is collectively harmful.
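To make the free-riding logic concrete, here is a toy calculation with made-up numbers (illustrative only, not taken from the paper). Sharing yields a small private benefit, but imposes a privacy cost on everyone similar to the sharer:

    # Toy free-riding arithmetic; all numbers are made up for illustration.
    BENEFIT_TO_SHARER = 1.0        # private gain from sharing, e.g. personalization
    PRIVACY_COST_PER_PERSON = 0.3  # privacy loss imposed on each affected person
    SIMILAR_PEOPLE = 10            # others whose privacy the shared data erodes

    # The sharer weighs only their own cost, not the cost imposed on others.
    private_payoff = BENEFIT_TO_SHARER - PRIVACY_COST_PER_PERSON
    # Society also bears the privacy cost of all the similar people.
    social_payoff = BENEFIT_TO_SHARER - PRIVACY_COST_PER_PERSON * (1 + SIMILAR_PEOPLE)

    print(f"Payoff to the sharer alone: {private_payoff:+.1f}")  # +0.7, so they share
    print(f"Payoff to society:          {social_payoff:+.1f}")   # -2.3: oversharing

Each individual faces the first payoff and shares; society faces the second and ends up with too little privacy.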

Data sharing as an externality

When you share personal data with a company, you disclose information not only about yourself but also about other people who are similar to you. For instance, when Amazon keeps track of my behavior on its platform, it learns something not just about me but also about the tastes of finance academics (which I'm not going to share in this article; it's our secret!). This information is useful for forecasting the behavior of other people with similar characteristics and showing them targeted content.

Economists have a term for this type of phenomenon: your decision to protect your personal data, or not, creates an externality for other people, because your data are used to train algorithms that are then applied to them.
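As a minimal sketch of how this plays out (a hypothetical nearest-neighbor example, not the method of any particular platform): suppose a platform observes a few coarse signals about everyone, but rich outcomes such as purchases only for users who agree to share. The sharers' data alone is enough to profile a user who shared nothing:

    # Sketch of a data externality: data volunteered by some users is used
    # to predict the behavior of a similar user who shared nothing.
    # Hypothetical example, not any platform's actual method.
    import numpy as np

    rng = np.random.default_rng(0)

    # Coarse signals the platform sees for everyone (device, location, a few
    # page visits), plus a purchase outcome observed only for sharers.
    sharer_signals = rng.normal(size=(100, 3))
    sharer_bought = (sharer_signals[:, 0] > 0).astype(int)  # taste tracks signal 0

    # A non-sharer: the same coarse signals are visible, but no purchase history.
    non_sharer_signals = np.array([1.2, -0.3, 0.5])

    # Profile the non-sharer from their 5 nearest neighbors among the sharers.
    distances = np.linalg.norm(sharer_signals - non_sharer_signals, axis=1)
    nearest = np.argsort(distances)[:5]
    predicted = sharer_bought[nearest].mean()
    print(f"Predicted purchase probability for the non-sharer: {predicted:.0%}")

The non-sharer's decision to withhold their data did not prevent the inference; the sharers' decisions determined it.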

The notion that privacy generates an externality is useful because it forces us to think about whether the externality is positive or negative; that is, whether sharing personal data benefits or hurts other people.

For example, if a biotech company developing artificial-intelligence-based detection of lung cancer has a larger database of lung scans, it will be able to develop a better detection algorithm, which will benefit future patients. In this case, people who consent to the inclusion of their lung scans in databases used for medical research exert a positive externality on society.

But the data externality can also be negative. For example, when information about individuals is used for non-transparent political advertising, algorithms' greater ability to target such advertising is detrimental to democracy.

An abundance of data in the hands of a few companies can also be detrimental to consumers. Big tech companies' privileged access to data can tilt the playing field against other companies and deter the entry of new players, hurting consumers through higher prices and less innovation.

Data regulation

Understanding data externalities is key to determining the right way to regulate tech companies. Europe has taken a first step with the GDPR in acknowledging that the law should make it easier for individuals to protect their online privacy.

However, the GDPR framework does not address the fact that privacy is not only a private good but also a public good. The European Commission's clearance of Google's acquisition of Fitbit shows that competition authorities are still taking a soft stance on the risk of negative data externalities.

 

[1] Acemoglu, Makhdoumi, Malekian, and Ozdaglar, 2019, "Too Much Data: Prices and Inefficiencies in Data Markets," NBER Working Paper.

 