Extract from The Guardian
An excellent critique of the social media giant underlines the threat it poses to us all – and suggests how it can be tamed
The best metaphor for Facebook is the monster created by Dr Frankenstein. Mary Shelley’s story shows how, as Fiona Sampson put it in a recent Guardian article, “aspiration and progress are indistinguishable from hubris – until something goes wrong, when suddenly we see all too clearly what was reasonable endeavour and what overreaching”. There are clear echoes of this in the evolution of Facebook. “It’s a story”, writes Siva Vaidhyanathan in this excellent critique, “of the hubris of good intentions, a missionary spirit and an ideology that sees computer code as the universal solvent for all human problems. And it’s an indictment of how social media has fostered the deterioration of democratic and intellectual culture around the world.”
Facebook was founded by an undergraduate with good intentions but little understanding of human nature. He thought that by creating a machine for “connecting” people he might do some good for the world while also making himself some money. He wound up creating a corporate monster that is failing spectacularly at the former but succeeding brilliantly at the latter. Facebook is undermining democracy at the same time as it is making Mark Zuckerberg richer than Croesus. And it is now clear that this monster, like Dr Frankenstein’s, is beyond its creator’s control.
There are, says Vaidhyanathan, “two things wrong with Facebook: how it works and how people use it”. It works by monitoring its users – hoovering up their data trails and personal information in order to paint virtual targets on their backs at which advertisers (Facebook’s real customers) can take aim. People use it for all kinds of things, many of them innocuous, but some of them absolutely pernicious: disseminating hate speech that leads to ethnic cleansing in Myanmar, for example, or spreading white supremacist propaganda in the US and Islamophobic or antisemitic messages in innumerable countries. People also use it to try to influence democratic elections, to threaten and harass others, to spread fake news, publish revenge porn and perform a host of other antisocial acts.
Vaidhyanathan argues that the central problem with Facebook is the pernicious symbiosis between its business model – surveillance capitalism – and the behaviour of its users. Because Facebook provides “free” services, it derives its revenues solely by monetising the data trails of its users – the photographs they upload, the status updates they post, the things they “like”, their friendship groups, the pages they follow, etc. This enables it to build detailed profiles of each user (containing 98 data points, according to one report), which can then be used for even more precisely targeted advertising.
Facebook “farms” its users for data: the more they produce – the more “user engagement” there is, in other words – the better. Consequently, there is an overriding commercial imperative to increase levels of engagement. And it turns out that some types of pernicious content are good for keeping user-engagement high: fake news and hate speech are pretty good triggers, for example. So the central problem with Facebook is its business model: the societal downsides we are experiencing are, as programmers say, a feature, not a bug.
What to do about this corporate monster is one of the great public policy questions of our day. The company has 2.2 billion users worldwide. While it may be good (or at least enjoyable) for individuals, we now have clear evidence that it’s not that good for democracy. It has no effective competitors, so it’s a monopoly – and a global one at that. And, given its business model, it has no incentive to reform itself. So what can be done about it?
One thing we already know for sure. Campaigns such as #deletefacebook won’t do the trick: the company has been largely unscathed by the Cambridge Analytica scandal. The network effect of its 2.2 billion users is just too powerful: for many people, deleting their accounts would amount to cutting themselves off from their social lives. And this has engendered a feeling that resistance is futile.
It isn’t. Facebook has become a leviathan, but that simply means it can only be tamed by another leviathan – in this case, the state. Vaidhyanathan argues that the key places to start are privacy, data protection, antitrust and competition law. Facebook is now too big and should be broken up: there’s no reason why it should be allowed to own Instagram and WhatsApp, for example. Regulators should be crawling over the hidden auctions it runs for advertisers. All uses of its services for political campaigns should be inspected by regulators, and it should be held editorially responsible for all the content published on its site.
What’s needed, in other words, is political will, informed by a clear analysis of the social harm that this corporation is fostering. For this we need good, informed critiques such as this book. Given Facebook’s dominance, it will be a long haul, but then, as the Chinese say, the longest journey begins with a single step. Professor Vaidhyanathan has just taken it.