Extract from The Guardian
Secret footage of executives boasting about psychological profiling is a red flag – our democracy really is under threat
It’s exactly the sort of conversation about politics that one would hope never took place.
Two suits, in a swanky restaurant, blithely boasting about exploiting fears buried so deep inside our subconscious that most of us don’t even know we have them; claiming that, for the right price, they can creep invisibly into your head.
Channel 4’s sting, which showed executives at the digital marketing firm Cambridge Analytica telling undercover reporters what on earth it is they actually do, was just the final piece of this particular jigsaw. My indefatigable Observer colleague Carole Cadwalladr put in the hard yards, over months of investigating the firm that boasts of using a combination of data and behavioural science to laser-target voters and thus help put Donald Trump in the White House.
But until now it’s been difficult for many people to visualise how the unauthorised use of our personal data, or the use of social media profiles to manipulate our votes, looks in practice. It sounds bad, obviously. We clearly should care. But it’s all so complicated, and life is busy. The risks of letting tech giants plough through our holiday snaps seem abstract and remote compared with the instant gratification of social media likes and shares and gossip.
Well, it’s not so remote now. Facebook had $36bn wiped off its shares on Monday following Cadwalladr’s revelation that personal data from 50 million American Facebook users, obtained by an academic using privileged access granted for research purposes, was then passed on to Cambridge Analytica.
And now we know what sort of hands it ended up in. When approached by undercover reporters, posing as wealthy clients seeking to get chosen politicians elected in Sri Lanka, Cambridge Analytica executives suggested all sorts of dubious miracles might be possible. Maybe a rival could be made a financial offer he couldn’t refuse, with the resulting incriminating film posted online. Or maybe some beautiful women could be sent to his house.
But arguably more chilling was a conversation about just how deep its psychological profiling goes. Christopher Wylie, the whistleblower Cadwalladr worked with, has said the company used data harvested from prospective voters to “build models to exploit what we knew about them, and target their inner demons”. On camera we saw what that might mean: talk of operating subliminally, exploiting fears where “you didn’t know that was a fear until you saw something that just evoked that reaction”.
Cambridge Analytica’s chief executive, Alexander Nix, one of those caught on camera, now insists his company was indulging in “a certain amount of hyperbole” – either exaggerating to impress the client, or perhaps even quietly probing their intentions. Like lobbyists, his company seems to spend half its time telling clients of its power to influence elections and the other half telling journalists those powers are wildly overstated.
"Data mining is a fast-growing business operating largely unseen on the fringe of politics"
But at best, his company looks guilty of hype, and at worst, the months of legal threats and denials over Cadwalladr’s stories – not to mention the online abuse she got from people such as Arron Banks of the unofficial pro-Brexit campaign Leave.EU, said to have consulted Cambridge Analytica – stink.
None of this means Britain would have voted remain or Hillary Clinton would be president if it hadn’t been for these pesky kids, just as “corporate lobbying” can’t explain every poor government decision. Voters’ anger was real, and this scandal doesn’t absolve us from asking the hard questions about why so many responded to extreme messages. Since Cambridge Analytica’s business model is arguably just a supercharged version of something political parties have done for years – identifying potential supporters, compiling detailed pictures of what makes them tick, then tailor-making messages to different groups depending on what they want to hear – it may also turn out that all sides have been busily scraping data behind our backs.
But this feels like a tipping point. Britain’s information commissioner has a warrant to search servers. The Commons select committee inquiry into fake news will recall Nix, questioning whether he “deliberately misled” them in recent testimony on the use of the Facebook data, and seek fresh evidence from Facebook. The latter’s falling share price meanwhile reflects not just this scandal but a string of them, including its role in inadvertently spreading fake news. There is a growing sense that even if users don’t take fright, regulators are losing patience.
And so they should. Like lobbying back in the days of cash-for-questions, data mining is a fast-growing business operating largely unseen on the fringe of politics, and while it can be used to respectable ends, it’s vulnerable to abuse. It clearly has the capacity to undermine our democratic process, even if it hasn’t done so yet. We should act before it’s too late.
• Gaby Hinsliff is a Guardian columnist and former political editor of the Observer