
The General Data Protection Regulation in the Age of Surveillance Capitalism

  • Original Paper
  • Journal of Business Ethics

Abstract

Clicks, comments, transactions, and physical movements are increasingly recorded and analyzed by Big Data processors, who use this information to trace the sentiment and activities of markets and voters. While the benefits of Big Data have received considerable attention, it is the potential social costs of practices associated with Big Data that are of interest to us in this paper. Prior research has investigated the impact of Big Data on individual privacy rights; however, there is also growing recognition of its capacity to be mobilized for surveillance purposes. Our paper delineates the underlying issues of privacy and surveillance and presents them as in tension with one another. We postulate that efforts at controlling Big Data may create a trade-off of risks rather than an overall improvement in data protection. We explore this idea in relation to the principles of the European Union’s General Data Protection Regulation (GDPR), as it arguably embodies the new ‘gold standard’ of cyber-laws. We posit that the safeguards advocated by the law, anonymization and pseudonymization, while representing effective countermeasures to privacy concerns, also incentivize the use, collection, and trade of behavioral and other forms of de-identified data. We consider the legal status of these ownerless forms of data, arguing that data protection techniques such as anonymization and pseudonymization raise significant concerns over the ownership of behavioral data and its potential use in the large-scale modification of activities and choices made both online and offline.
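To make the distinction the abstract relies on concrete, the following is a minimal, illustrative Python sketch contrasting pseudonymization, which replaces a direct identifier with a stable keyed token that the data controller can still link back to a person, with a crude form of anonymization that strips and coarsens identifiers while leaving the behavioral payload intact. The function names, the HMAC-based tokenization, and the example record are assumptions chosen for exposition only; they are not prescribed by the GDPR or by the paper.

```python
import hashlib
import hmac
import secrets

# Hypothetical key held by the data controller; in practice it would be
# stored and rotated securely, not generated per run.
SECRET_KEY = secrets.token_bytes(32)


def pseudonymize(user_id: str) -> str:
    """Replace a direct identifier with a stable keyed token.

    Records remain linkable over time, and whoever holds SECRET_KEY can
    re-identify the individual, so the data is still personal data.
    """
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()[:16]


def anonymize(record: dict) -> dict:
    """Strip direct identifiers and coarsen quasi-identifiers.

    No token ties the output back to a specific person, yet the behavioral
    payload (clicks) survives and can still be aggregated and traded.
    """
    return {
        "age_band": f"{(record['age'] // 10) * 10}s",  # e.g. 34 -> "30s"
        "region": record["postcode"][:3],              # truncated postcode
        "clicks": record["clicks"],                    # behavioral data kept
    }


if __name__ == "__main__":
    record = {
        "user_id": "alice@example.com",
        "age": 34,
        "postcode": "2006AB",
        "clicks": ["ad_17", "product_page", "checkout"],
    }
    print("pseudonymous token:", pseudonymize(record["user_id"]))
    print("anonymized record: ", anonymize(record))
```

Even in this toy example, de-identification removes the name but not the behavioral trace, which is precisely the data whose ownership and use the paper interrogates.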


Notes

  1. The pithy phrase “to be let alone” was originally coined by Thomas Cooley (1888), then a justice of the Michigan Supreme Court.

  2. We refer to these documents as “WP,” see General Data Protection Regulation References for the full citations.

  3. Data controllers are the individuals or organizations that determine the purposes and means of processing personal data; data processors are persons or organizations that process personal data on behalf of, and as instructed by, data controllers.

  4. https://ec.europa.eu/commission/sites/beta-political/files/data-protection-factsheet-changes_en.pdf.

  5. The collection and trade of behavioral data over and above the company’s own internal requirements for product and service improvement.

  6. While the state is still a major facilitator of and participant in this surveillance model (West 2017), it is no longer the central player.

  7. A Latin term that means an object or property outside the legal rights framework.

General Data Protection Regulation References

  • Art. 29 WP 55 (2002). Working document on the surveillance of electronic communications in the workplace (No. 5401/01/EN/Final). Brussels, Belgium: Article 29 Data Protection Working Party, European Commission, Directorate General Justice.

  • Art. 29 WP 203 (2013). Opinion 03/2013 on purpose limitation (No. 00569/13/EN). Brussels, Belgium: Article 29 Data Protection Working Party, European Commission, Directorate General Justice.

  • Art. 29 WP 215 (2014). Opinion 04/2014 on surveillance of electronic communications for intelligence and national security purposes (No. 819/14/EN WP 215). Brussels, Belgium: Article 29 Data Protection Working Party, European Commission, Directorate General Justice.

  • Art. 29 WP 216 (2014). Opinion 05/2014 on anonymization techniques (No. 0829/14/EN). Brussels, Belgium: Article 29 Data Protection Working Party, European Commission, Directorate General Justice.

  • Art. 29 WP 221 (2014). Statement of the WP29 on the impact of the development of big data on the protection of individuals with regard to the processing of their personal data in the EU (No. 14/EN WP 221). Brussels, Belgium: Article 29 Data Protection Working Party, European Commission, Directorate General Justice.

  • Art. 29 WP 228 (2014). Working document on surveillance of electronic communications for intelligence and national security purposes (No. 14/EN WP 228). Brussels, Belgium: Article 29 Data Protection Working Party, European Commission, Directorate General Justice.

  • Art. 29 WP 237 (2016). Working document 01/2016 on the justification of interferences with the fundamental rights to privacy and data protection through surveillance measures when transferring personal data (European Essential Guarantees) (No. 16/EN WP 237). Brussels, Belgium: Article 29 Data Protection Working Party, European Commission, Directorate General Justice.

  • Art. 29 WP 251 (2017). Guidelines on automated individual decision-making and Profiling for the purposes of Regulation 2016/679 (No. 17/EN WP 251). Brussels, Belgium: Article 29 Data Protection Working Party, European Commission, Directorate General Justice.

References

  • Andrejevic, M. (2014). Big data, big questions: The big data divide. International Journal of Communication, 8, 17.

  • Andrejevic, M., & Gates, K. (2014). Big data surveillance: Introduction. Surveillance & Society, 12(2), 185.

  • Barocas, S., & Nissenbaum, H. (2014). Big data’s end run around anonymity and consent. Privacy, Big Data, and the Public Good: Frameworks for Engagement, 1, 44–75.

  • Baruh, L., & Popescu, M. (2017). Big data analytics and the limits of privacy self-management. New Media & Society, 19(4), 579–596.

  • Berk, R. (2012). Criminal justice forecasts of risk: A machine learning approach. New York: Springer.

  • Blumenstock, J., Cadamuro, G., & On, R. (2015). Predicting poverty and wealth from mobile phone metadata. Science, 350(6264), 1073–1076.

  • Boyd, D., & Crawford, K. (2012). Critical questions for big data: Provocations for a cultural, technological, and scholarly phenomenon. Information, Communication & Society, 15(5), 662–679.

  • Calmon, F. P., Makhdoumi, A., & Médard, M. (2015). Fundamental limits of perfect privacy. In 2015 IEEE International Symposium on Information Theory (ISIT) (pp. 1796–1800). IEEE.

  • Carr, M. (2015). Power plays in global internet governance. Millennium, 43(2), 640–659.

  • Chaudhuri, A. (2016). Internet of things data protection and privacy in the era of the General Data Protection Regulation. Journal of Data Protection & Privacy, 1(1), 64–75.

  • Cooley, T. M. (1888). A treatise on the law of torts, or the wrongs which arise independent of contract (2nd ed.). Chicago: Callaghan & Co.

  • CNBC. (2019). Austrian data privacy activist files complaint against Apple, Amazon, others. Retrieved January 18, 2019, from https://www.cnbc.com/2019/01/18/austrian-data-privacy-activist-files-complaint-against-apple-amazon-others.html.

  • Englehardt, S., & Narayanan, A. (2016). Online tracking: A 1-million-site measurement and analysis. In Proceedings of the 2016 ACM SIGSAC conference on computer and communications security (pp. 1388–1401). ACM.

  • Etzioni, A. (2018). Apple: Good business, poor citizen? Journal of Business Ethics, 151(1), 1–11.

  • Federal Trade Commission. (2016). In the matter of genesis toys and nuance communications—Complaint and request for investigation. Submitted by The Electronic Privacy Information Center, The Campaign for a Commercial Free Childhood and The Center for Digital Democracy Consumers Union, 6 December, 2016.

  • Floridi, L. (2014). Open data, data protection, and group privacy. Philosophy & Technology, 27(1), 1–3.

  • Gandy, O. H., Jr. (2012). Statistical surveillance: Remote sensing in the digital age. In K. Ball, K. Haggerty, & D. Lyon (Eds.), Routledge Handbook of surveillance studies (pp. 125–132). London: Routledge.

  • Gaudin, S. (2016). How Google Home’s “always on” will affect privacy. Computerworld. Retrieved October 6, 2016, from https://www.computerworld.com/article/3128791/how-google-homes-always-on-will-affect-privacy.html.

  • Geng, H. (2017). Internet of things and data analytics handbook. Chichester: Wiley.

  • Glaeser, E. L., Hillis, A., Kominers, S. D., & Luca, M. (2016). Crowdsourcing city government: Using tournaments to improve inspection accuracy. American Economic Review, 106(5), 114–118.

  • Glaeser, E. L., Kominers, S. D., Luca, M., & Naik, N. (2018). Big data and big cities: The promises and limitations of improved measures of urban life. Economic Inquiry, 56(1), 114–137.

  • Goodman, S. (2018). A game changer in the personal data protection in the EU. MSU International Law Review. Retrieved August 23, 2018, from https://www.msuilr.org/msuilr-legalforum-blogs/2018/2/19/a-game-changer-in-the-personal-data-protection-in-the-eu.

  • Google tracks every move in location, maps even if you opt out. (2018). News Corp. Retrieved August 14, 2018, from https://www.news.com.au/technology/gadgets/mobile-phones/google-has-been-tracking-your-movements-even-if-you-told-it-not-to/news-story/bb9eb906387ffd2295e8b17b24b7d883.

  • Gritzalis, S. (2004). Enhancing web privacy and anonymity in the digital era. Information Management & Computer Security, 12(3), 255–287.

  • Hackett, R. (2016). Everything you need to know about Google Allo’s Privacy Backlash. Fortune. Retrieved September 22, 2016, from http://fortune.com/2016/09/22/google-allo-nope/.

  • Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. The British Journal of Sociology, 51(4), 605–622.

  • Hintze, M. (2016). In defense of the long privacy statement. Maryland Law Review, 76, 1044.

  • Hintze, M. (2017). Viewing the GDPR through a de-identification lens: A tool for compliance, clarification, and consistency. International Data Privacy Law, 8(1), 86–101.

  • Hintze, M., & El Emam, K. (2018). Comparing the benefits of pseudonymisation and anonymisation under the GDPR. Journal of Data Protection & Privacy, 2(2), 145–158.

  • Hull, G. (2015). Successful failure: What Foucault can teach us about privacy self-management in a world of Facebook and big data. Ethics and Information Technology, 17(2), 89–101.

  • Kamara, I. (2017). Co-regulation in EU personal data protection: The case of technical standards and the privacy by design standardisation ‘mandate’. European Journal of Law and Technology, 8(1). http://ejlt.org/article/view/545/723.

  • Kitchin, R. (2014). The data revolution: Big data, open data, data infrastructures and their consequences. Thousand Oaks: Sage.

  • Kokolakis, S. (2017). Privacy attitudes and privacy behavior: A review of current research on the privacy paradox phenomenon. Computers & Security, 64, 122–134.

  • Koops, B.-J., & Leenes, R. (2014). Privacy regulation cannot be hardcoded. A critical comment on the ‘privacy by design’ provision in data-protection law. International Review of Law, Computers & Technology, 28(2), 159–171.

  • Kramer, A. D., Guillory, J. E., & Hancock, J. T. (2014). Experimental evidence of massive-scale emotional contagion through social networks. Proceedings of the National Academy of Sciences, 111(24), 8788–8790.

  • Lohr, S. (2012). The age of big data. New York Times. Retrieved February 11, 2012, from https://www.nytimes.com/2012/02/12/sunday-review/big-datas-impact-in-the-world.html.

  • Lyon, D. (2003). Surveillance as social sorting: Privacy, risk, and digital discrimination. London: Psychology Press.

  • Lyon, D. (2014). Surveillance, Snowden, and big data: Capacities, consequences, critique. Big Data & Society, 1(2), 2053951714541861.

  • Lyon, D. (2015). Surveillance after Snowden. New York: Wiley.

  • Lyon, D., Haggerty, K. D., & Ball, K. (Eds.). (2012). Routledge handbook of surveillance studies. New York: Routledge.

  • Madden, M. (2014). Public perceptions of privacy and security in the post-Snowden era. Pew Research Center. Retrieved February 20, 2018, from http://assets.pewresearch.org/wp-content/uploads/sites/14/2014/11/PI_PublicPerceptionsofPrivacy_111214.pdf.

  • Madden, M., Gilman, M., Levy, K., & Marwick, A. (2017). Privacy, poverty, and big data: A matrix of vulnerabilities for poor Americans. Washington University Law Review, 95, 53–125.

  • Martin, K. (2015a). Ethical issues in the big data industry. MIS Quarterly Executive, 14(2), 67–85.

  • Martin, K. (2015b). Privacy notices as tabula rasa: An empirical investigation into how complying with a privacy notice is related to meeting privacy expectations online. Journal of Public Policy & Marketing, 34(2), 210–227.

  • Martin, K. (2016a). Data aggregators, consumer data, and responsibility online: Who is tracking consumers online and should they stop? The Information Society, 32(1), 51–63.

  • Martin, K. (2016b). Do privacy notices matter? Comparing the impact of violating formal privacy notices and informal privacy norms on consumer trust online. The Journal of Legal Studies, 45(S2), S191–S215.

  • Martin, K. (2016c). Understanding privacy online: Development of a social contract approach to privacy. Journal of Business Ethics, 137(3), 551–569.

  • Martin, K. (2019). Trust and the online market maker: A comment on Etzioni’s Cyber Trust. Journal of Business Ethics, 156(1), 21–24.

  • Martin, K., & Nissenbaum, H. (2016). Measuring privacy: An empirical test using context to expose confounding variables. Columbia Science and Technology Law Review, 18, 176–218.

  • Martin, K., & Shilton, K. (2016). Putting mobile application privacy in context: An empirical study of user privacy expectations for mobile devices. The Information Society, 32(3), 200–216.

  • Marwick, A. E. (2012). The public domain: Social surveillance in everyday life. Surveillance & Society, 9(4), 378–393.

  • Miller, K. (2014). Total surveillance, big data, and predictive crime technology: Privacy’s perfect storm. Journal of Technology Law & Policy, 19, 105–146.

  • Miraz, M. H., Ali, M., Excell, P. S., & Picking, R. (2015). A review on Internet of Things (IoT), Internet of Everything (IoE) and Internet of Nano Things (IoNT). In 2015 Internet Technologies and Applications (ITA) (pp. 219–224). IEEE.

  • Mourby, M., Mackey, E., Elliot, M., Gowans, H., Wallace, S. E., Bell, J., et al. (2018). Are ‘pseudonymised’ data always personal data? Implications of the GDPR for administrative data research in the UK. Computer Law & Security Review, 34(2), 222–233.

  • Newkirk, D. (2018). “Apple: good business, poor citizen”: A practitioner’s response. Journal of Business Ethics, 151(1), 13–16.

  • Newman, N. (2013). The costs of lost privacy: Consumer harm and rising economic inequality in the age of Google. William Mitchell Law Review, 40, 849–889.

  • Nissenbaum, H. (2009). Privacy in context: Technology, policy, and the integrity of social life. Stanford, CA: Stanford University Press.

  • Obermeyer, Z., & Emanuel, E. J. (2016). Predicting the future—Big data, machine learning, and clinical medicine. The New England Journal of Medicine, 375(13), 1216–1219.

  • Pagallo, U. (2017). The legal challenges of big data: Putting secondary rules first in the field of EU data protection. European Data Protection Law Review, 3, 36–46.

  • Panackal, J. J., & Pillai, A. S. (2015). Adaptive utility-based anonymization model: Performance evaluation on big data sets. Procedia Computer Science, 50, 347–352.

  • Peters, B. (2012). The big data gold rush. Forbes. Retrieved June 21, 2012, from https://www.forbes.com/sites/bradpeters/2012/06/21/the-big-data-gold-rush/.

  • Qualman, E. (2010). Socialnomics: How social media transforms the way we live and do business. New York: Wiley.

  • Richardson, J. (2016). Law and the philosophy of privacy. Oxford: Routledge.

  • Rosenblatt, A. (2017). How drivers shame Uber Lyft passengers. Medium, May 29, 2017.

  • Rosenblat, A., & Stark, L. (2015). Uber’s drivers: Information asymmetries and control in dynamic work. Data & Society Research Institute, 17.

  • Rotenberg, M., & Jacobs, D. (2013). Updating the law of information privacy: The new Framework of the European union. Harvard Journal of Law & Public Policy, 36, 605–652.

  • Rouvroy, A. (2016). “Of Data and Men”. Fundamental rights and freedoms in a world of big data. Council of Europe, Directorate General of Human Rights and Rule of Law, T-PD-BUR (2015) 09REV, Strasbourg.

  • Safari, B. A. (2016). Intangible privacy rights: How Europe’s GDPR will set a new global standard for personal data protection. Seton Hall Law Review, 47, 809–848.

  • Scott, M. (2018). Cambridge Analytica helped ‘cheat’ Brexit vote and US election, claims whistleblower. Retrieved September 10, 2018, from https://www.politico.eu/article/cambridge-analytica-chris-wylie-brexit-trump-britain-data-protection-privacy-facebook/.

  • Solove, D. J. (2007). Nothing to hide. San Diego Law Review, 44(4), 745–772.

  • Solove, D. J. (2011). Nothing to hide: The false tradeoff between privacy and security. London: Yale University Press.

  • Soria-Comas, J., & Domingo-Ferrer, J. (2016). Big data privacy: Challenges to privacy principles and models. Data Science and Engineering, 1(1), 21–28.

  • Taddicken, M. (2014). The ‘privacy paradox’ in the social web: The impact of privacy concerns, individual characteristics, and the perceived social relevance on different forms of self-disclosure. Journal of Computer-Mediated Communication, 19(2), 248–273.

  • Taylor, L., Floridi, L., & Van der Sloot, B. (Eds.). (2016). Group privacy: New challenges of data technologies (Philosophical Studies Series, Vol. 126). New York: Springer.

  • Torra, V. (2017). Data privacy: Foundations, new developments and the big data challenge. Cham: Springer.

  • Van Dijck, J. (2014). Datafication, dataism and dataveillance: Big Data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208.

  • Victor, J. (2013). The EU General Data Protection Regulation: Toward a property regime for protecting data privacy. The Yale Law Journal, 123(2), 513–528.

  • Voigt, P., & Von dem Bussche, A. (2017). The EU General Data Protection Regulation (GDPR) (Vol. 18). Cham: Springer.

  • Wachter, S. (2018). Normative challenges of identification in the Internet of Things: Privacy, profiling, discrimination, and the GDPR. Computer Law & Security Review, 34(3), 436–449.

  • Warner, R., & Sloan, R. H. (2014). Self, privacy, and power: Is it all over. Tulane Journal of Technology and Intellectual Property, 17, 61–108.

  • Warren, S. D., & Brandeis, L. D. (1890). The right to privacy. Harvard Law Review, 4(5), 193–220. https://doi.org/10.2307/1321160.

  • West, S. M. (2017). Data capitalism: Redefining the logics of surveillance and privacy. Business & Society, 58, 1–22.

  • Whelan, G. (2019). Trust in surveillance: A reply to Etzioni. Journal of Business Ethics, 156(1), 15–19.

  • Xiaying, M. (2019). The legal attributes of electronic data and the positioning of data in civil law. Social Sciences in China, 40(1), 82–99.

  • Xu, L., Jiang, C., Chen, Y., Wang, J., & Ren, Y. (2016). A framework for categorizing and applying privacy-preservation techniques in big data mining. Computer, 49(2), 54–62.

  • Young, A. L., & Quan-Haase, A. (2013). Privacy protection strategies on Facebook: The Internet privacy paradox revisited. Information, Communication & Society, 16(4), 479–500.

  • Zarsky, T. Z. (2013). Transparent predictions. University of Illinois Law Review, 2013, 1503–1569.

  • Zarsky, T. Z. (2016). Incompatible: The GDPR in the age of big data. Seton Hall Law Review, 47, 995–1020.

  • Zuboff, S. (2015). Big other: Surveillance capitalism and the prospects of an information civilization. Journal of Information Technology, 30(1), 75–89.

  • Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. London: Profile Books.


Author information

Correspondence to Max Baker.

Ethics declarations

Research Involving Human and Animal Participants

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article


Cite this article

Andrew, J., Baker, M. The General Data Protection Regulation in the Age of Surveillance Capitalism. J Bus Ethics 168, 565–578 (2021). https://doi.org/10.1007/s10551-019-04239-z
