Surveillance Capitalism: The Concerns of Big Data

The rise of big data has created a new means of generating wealth. It has also deeply altered the social and political landscape.

In 1997, storing information digitally became more efficient than storing it on paper.

Additionally, the rise of sensor devices (the Internet of Things, or IoT) has led to the “datafication” of whole industries. This has fuelled the dominance of the “big data industry”, with Google and its peers as the main players.

Services that map user behaviour with data (like Facebook, Google Search and WhatsApp) have transformed the face of business operations and the means of generating value.

This has created a whole new brand of capitalism. Shoshana Zuboff has dubbed this new species of power “surveillance capitalism”. It works as follows:

Traditional capitalism worked like this: information from customers was used to produce service improvements that speak to customer needs.

Subsequently, as industries have captured more behavioural data, the following has happened:

Increased surveillance has created a surplus of behavioural data. This surplus makes customers more predictable, and predictable behaviour can be monetised.

This has produced a subtle shift in the business agenda. The new breed of capitalism looks to generate profit by selling predictive products as well: behaviour is predicted, and the prediction is sold to advertising agencies or other third parties.

Data is still used to improve services, but it now has a secondary value-generating function: user predictability itself becomes the product.
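This dual use of the same behavioural data can be sketched in a toy example (all names, topics and records here are hypothetical, not drawn from any real system): a single click log both ranks content for the user and yields a prediction that could be sold on.

```python
from collections import defaultdict

def build_click_model(click_log):
    """Aggregate a behavioural click log into per-user, per-topic
    click-through rates -- the raw material for a prediction product."""
    shown = defaultdict(int)
    clicked = defaultdict(int)
    for user, topic, did_click in click_log:
        shown[(user, topic)] += 1
        if did_click:
            clicked[(user, topic)] += 1
    return {key: clicked[key] / shown[key] for key in shown}

def predict_click(model, user, topic, default=0.5):
    """Predict the probability that `user` clicks on `topic`.
    The service can use this to rank content; a third party
    could buy the same number to target advertising."""
    return model.get((user, topic), default)

# Hypothetical behaviour log: (user, topic, clicked?)
log = [
    ("alice", "sports", True), ("alice", "sports", True),
    ("alice", "politics", False), ("bob", "politics", True),
]
model = build_click_model(log)
print(predict_click(model, "alice", "sports"))   # → 1.0
```

The same dictionary serves both functions: the "service improvement" and the "predictive product" are one and the same computation, which is exactly the subtlety of the shift described above.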

Unsurprisingly, there is a temptation to overstep in the amount and type of data gathered, and this has led to a number of controversies.

Controversies in the Media

Social media

Many recent examples highlight the growing distrust that customers have of increased surveillance. To name a few:

  1. Facebook received backlash for the continued erosion of its data protection policy at WhatsApp.
  2. Cambridge Analytica received bad press for its use and manipulation of voter data in the 2016 American presidential election.
  3. Google’s Street View Wi-Fi scandal, in which Google was found to have violated U.S. federal wiretap laws by collecting data from private Wi-Fi networks.

The list could go on, but these three examples give the reader a sense of the issue: data being used to generate predictability in applications that don't prioritise the interests of the user.

The “Big Other” vs “Big Brother” and Surveillance Capitalism

The all-seeing eye

The Big Brother of the 20th Century

For the past 70 years, since Orwell’s “1984”, the world has obsessed over the consequences of totalitarian societies, and the term “Big Brother” has become popularised.

Such a society is dominated by ideological possession centring on fanatical notions of perfection or utopia (Nazism/Sovietism). The masses are controlled through terror and need a common enemy to rally against.

Totalitarian and authoritarian regimes are not a thing of the past, but they are becoming an antiquated form of mass power. A new form of mass power has arisen in the 21st century.

The Big Other of the 21st Century

Zuboff characterizes surveillance capitalism as a new species of mass power that is similar in magnitude to totalitarian societies: some big data companies reach and influence far more people than most nations do.

However, this power is fundamentally different. When expressed as purely exploitative and extractive, it can be dubbed the “Big Other”. Extractive surveillance capitalism can be characterised as that which prioritises predicting user behaviour over user interests.

Brought to its conclusion, it produces societies characterized by near-total certainty of outcomes (predictability of behaviour) and a radical indifference to the individual. Populations exhibit a hive mind: one that is hypersensitized to social comparison and dopamine feedback.

One does not need to look far to see the rise in depression linked to social media and device addiction. These mechanisms make users more predictable, by learning what one admires and keeping one acutely attuned to it.

Additionally, we have observed the rise of political polarisation and an epidemic of misinformation, driven by filters in social media that iteratively make millions of users more predictable by displaying content they will click on. Inflammatory and emotive content, such as strong political messaging, makes behaviour more predictable still.
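The feedback loop described above can be sketched as a deliberately crude simulation (the topics and the user's one-track preference are hypothetical): a filter that always serves the best-clicking content quickly locks onto a single topic, at which point the user's next click is entirely predictable.

```python
def simulate_filter(rounds=60):
    """A filter that shows whichever topic has the best click record.
    The user (crudely modelled) clicks only on their favourite topic."""
    topics = ["politics", "sports", "cooking"]
    favourite = "politics"          # hypothetical one-track preference
    shown = {t: 0 for t in topics}
    clicked = {t: 0 for t in topics}
    history = []
    for i in range(rounds):
        if i < len(topics) * 3:
            # Brief round-robin exploration to gather click data.
            topic = topics[i % len(topics)]
        else:
            # Exploit: always show the topic with the best click rate.
            topic = max(topics, key=lambda t: clicked[t] / shown[t])
        shown[topic] += 1
        if topic == favourite:
            clicked[topic] += 1
        history.append(topic)
    return history

history = simulate_filter()
print(history[-1])   # → politics
```

After nine exploration rounds the filter serves nothing but the user's favourite topic; in this toy world, behaviour has become perfectly predictable, which is the dynamic the paragraph above describes at the scale of millions of users.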

Data Science Practitioners

That being said, how does a data science practitioner walk this fine line? Ultimately, they are the ones responsible for producing these predictive products.

Simply put, they have AT LEAST the ethical responsibility to:

  1. Ensure that the FULL social impact of their algorithm is explored or simulated (including long-term effects)
  2. Ensure that they agree with the intended use of the mathematical products they create (a matter of conscience)
  3. Ensure that user data is protected and processed lawfully
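The third responsibility has concrete engineering consequences. One common (if minimal) technique is pseudonymising identifiers before behavioural records enter an analytics pipeline, so analysts work with keyed hashes rather than raw identities. A sketch, with a hypothetical salt and record format:

```python
import hashlib
import hmac

SECRET_SALT = b"rotate-me-regularly"  # hypothetical; keep in a secrets manager

def pseudonymise(user_id: str) -> str:
    """Replace a raw user ID with a keyed hash, so records can still
    be joined per user without revealing who the user is."""
    return hmac.new(SECRET_SALT, user_id.encode(), hashlib.sha256).hexdigest()

def scrub(record: dict) -> dict:
    """Drop direct identifiers and pseudonymise the user ID
    before the record enters the analytics pipeline."""
    clean = {k: v for k, v in record.items() if k not in ("name", "email")}
    clean["user_id"] = pseudonymise(record["user_id"])
    return clean

raw = {"user_id": "alice01", "name": "Alice",
       "email": "a@example.com", "clicked": "sports"}
safe = scrub(raw)
```

Pseudonymisation alone does not make processing lawful, of course; it is one technical control among the legal and organisational ones the practitioner must also weigh.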


As a means to generate wealth, the rise of surveillance has transformed our societies at a rapid pace. Currently, society is navigating this new world of hyper-conformity and predictable outcomes.

Data scientists are faced with the ever-difficult questions of:

  1. When to expand the information used for predictions?
  2. When to limit the case for machine learning?
  3. How to prioritise user interests over business interests?

The above discussion looks to sensitise the reader (albeit briefly) to the issues at play in a new society dominated by all-seeing eyes.

Ultimately, the decision for a more human future lies in the hands of the technologists who wield the technology.


Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. Profile Books Ltd, London.
