Cambridge Analytica: The story so far



It's a sensational story containing allegations of sleaze, psychological manipulation and data misuse that has provoked a furious international response.

Tech giant Facebook and data analytics firm Cambridge Analytica are at the centre of a dispute over the harvesting and use of personal data - and whether it was used to influence the outcome of the US 2016 presidential election or the UK Brexit referendum.

Both firms deny any wrongdoing.

How has Cambridge Analytica been accused of sleazy tactics?

Channel 4 News sent an undercover reporter to meet executives from data analytics firm Cambridge Analytica.

The firm had been credited with helping Donald Trump to presidential victory.

The reporter posed as a Sri Lankan businessman wanting to influence a local election.

Cambridge Analytica boss Alexander Nix was apparently filmed giving examples of how his firm could discredit political rivals by arranging various smear campaigns, including setting up encounters with prostitutes and staging situations in which apparent bribery could be caught on camera.

Video caption: Alexander Nix, CEO, Cambridge Analytica: "These sort of tactics are very effective"

The firm denies all the claims and says the documentary was "edited and scripted to grossly misrepresent the nature of those conversations". It claims the conversations were led by the reporters.

"I must emphatically state that Cambridge Analytica does not condone or engage in entrapment, bribes or so-called 'honeytraps', and nor does it use untrue material for any purpose," said Mr Nix.

What was Facebook's role?

In 2014 a quiz on Facebook invited users to find out their personality type.


It was developed by University of Cambridge academic Aleksandr Kogan (the university has no connections with Cambridge Analytica).

As was common with apps and games at that time, it was designed to harvest not only the user data of the person taking part in the quiz, but also the data of their friends.

Facebook has since changed the amount of data developers can scrape in this way.

Christopher Wylie, who worked with Cambridge Analytica, alleges that because 270,000 people took the quiz, the data of some 50 million users, mainly in the US, was harvested without their explicit consent via their friend networks.

Mr Wylie claims the data was sold to Cambridge Analytica, which then used it to psychologically profile people and deliver pro-Trump material to them.

Cambridge Analytica denies any of it was used as part of the services it provided to the Trump campaign.

Is this against Facebook's terms?

The data was gathered using Facebook's infrastructure at the time, and many other developers had taken advantage of it - but developers were not authorised to share that data with others.

The other key point is that even the people directly taking part in the personality quiz would have had no idea that they were potentially sharing their data with Donald Trump's election campaign.

Facebook says that when it learned its rules had been breached, it removed the app and demanded assurances that the information had been deleted.

Cambridge Analytica claims that it never used the data, and deleted it when Facebook told it to.

Both Facebook and the UK Information Commissioner want to find out whether it was properly destroyed, as Mr Wylie claims it was not.

What has the official response been?

Image caption: There are calls for Facebook founder Mark Zuckerberg to testify before Congress

US senators have called on Mark Zuckerberg to testify before Congress about how Facebook will protect users.

The head of the European Parliament said it would investigate to see if the data was misused.

A spokesman for Prime Minister Theresa May said she was "very concerned" about the revelations.

How can you protect your data?

There are a few things to be aware of if you want to restrict who has access to your data.

Keep an eye on apps, especially those which require you to log in using your Facebook account - they often have a very wide range of permissions and many are specifically designed to pick up your data

Use an ad blocker to limit advertising

Look at your Facebook security settings and make sure you are aware of what is enabled. Check the individual app settings to see whether you have given them permission to view your friends as well as yourself.

You can download a copy of the data Facebook holds on you, although it is not comprehensive. There is a download button at the bottom of the General Account Settings tab. However, bear in mind that if your device is hacked, your data may be less secure sitting on your laptop than it is on Facebook's servers.

You can, of course, simply leave Facebook, but the campaign group Privacy International warns that privacy concerns extend beyond the social network.

"The current focus is on protecting your data being exploited by third parties, but your data is being exploited all the time," said a spokeswoman.

"Many apps on your phone will have permission to access location data, your entire phone book and so on. It is just the tip of the iceberg."


It’s exactly the sort of conversation about politics that one would hope did not exist.

Two suits, in a swanky restaurant, blithely boasting about exploiting fears buried so deep inside our subconscious that most of us don’t even know we have them; claiming that, for the right price, they can creep invisibly into your head.


Channel 4’s sting, which showed executives at the digital marketing firm Cambridge Analytica telling undercover reporters what on earth it is they actually do, was just the final piece of this particular jigsaw. My indefatigable Observer colleague Carole Cadwalladr put in the hard yards, over months of investigating the firm that boasts of using a combination of data and behavioural science to laser-target voters and thus help put Donald Trump in the White House.

But until now it’s been difficult for many people to visualise how the unauthorised use of our personal data, or the use of social media profiles to manipulate our votes, looks in practice. It sounds bad, obviously. We clearly should care. But it’s all so complicated, and life is busy. The risks of letting tech giants plough through our holiday snaps seem abstract and remote compared with the instant gratification of social media likes and shares and gossip.

Well, it’s not so remote now. Facebook had $36bn wiped off its shares on Monday following Cadwalladr’s revelation that personal data from 50 million American Facebook users, obtained by an academic using privileged access granted for research purposes, was then passed on to Cambridge Analytica.

And now we know what sort of hands it ended up in. When approached by undercover reporters, posing as wealthy clients seeking to get chosen politicians elected in Sri Lanka, Cambridge Analytica executives suggested all sorts of dubious miracles might be possible. Maybe a rival could be made a financial offer he couldn’t refuse, with the resulting incriminating film posted online. Or maybe some beautiful women could be sent to his house.

But arguably more chilling was a conversation about just how deep its psychological profiling goes. Christopher Wylie, the whistleblower Cadwalladr worked with, has said the company used data harvested from prospective voters to “build models to exploit what we knew about them, and target their inner demons”. On camera we saw what that might mean: talk of operating subliminally, exploiting fears where “you didn’t know that was a fear until you saw something that just evoked that reaction”.

Cambridge Analytica’s chief executive, Alexander Nix, one of those caught on camera, now insists his company was indulging in “a certain amount of hyperbole” – either exaggerating to impress the client, or perhaps even quietly probing their intentions. Like lobbyists, his company seems to spend half its time telling clients of its power to influence elections and the other half telling journalists those powers are wildly overstated.

Data mining is a fast-growing business operating largely unseen on the fringe of politics

But at best, his company looks guilty of hype, and at worst, the months of legal threats and denials over Cadwalladr’s stories – not to mention the online abuse she got from people such as Arron Banks of the unofficial pro-Brexit campaign Leave.EU, said to have consulted Cambridge Analytica – stink.

None of this means Britain would have voted remain or Hillary Clinton would be president if it hadn’t been for these pesky kids, just as “corporate lobbying” can’t explain every poor government decision. Voters’ anger was real, and this scandal doesn’t absolve us from asking the hard questions about why so many responded to extreme messages. Since Cambridge Analytica’s business model is arguably just a supercharged version of something political parties have done for years – identifying potential supporters, compiling detailed pictures of what makes them tick, then tailor-making messages to different groups depending on what they want to hear – it may also turn out that all sides have been busily scraping data behind our backs.

But this feels like a tipping point. Britain’s information commissioner has a warrant to search servers. The Commons select committee inquiry into fake news will recall Nix, questioning whether he “deliberately misled” them in recent testimony on the use of the Facebook data, and seek fresh evidence from Facebook. The latter’s falling share price meanwhile reflects not just this scandal but a string of them, including its role in inadvertently spreading fake news. There is a growing sense that even if users don’t take fright, regulators are losing patience.

And so they should. Like lobbying back in the days of cash-for-questions, data mining is a fast-growing business operating largely unseen on the fringe of politics, and while it can be used to respectable ends, it’s vulnerable to abuse. It clearly has the capacity to undermine our democratic process, even if it hasn’t done so yet. We should act before it’s too late.

• Gaby Hinsliff is a Guardian columnist and former political editor of the Observer


It is easy to be misled into believing that the Cambridge Analytica story is about rogue data miners taking advantage of an innocent Facebook. Facebook’s decision to suspend Cambridge Analytica’s access, the use of terms like “data breach”, and a good deal of coverage in the media seems to follow these lines. That, however, misses the key point. This is not a data breach by any means – and nor is it something that could not have been predicted or could easily have been avoided. This is, in many ways, Cambridge Analytica using Facebook exactly as the social media platform was designed to be used. This is how Facebook works.

Three key parts of Facebook's model come into play: gathering data from people in order to profile them, both en masse and individually; designing systems that allow that data to be used to target people with advertising and content; and allowing third parties (generally advertisers) to use the data and those targeting systems for their own purposes. The power of these systems is often underestimated, but Facebook themselves know it, and have tested it in a number of ways.

They have demonstrated, through their “emotional contagion” experiment in 2014, that they can make people happier or sadder, simply by manipulating the order things appear in people’s timelines. They have demonstrated that they can make people more likely to vote, testing it in the 2010 US congressional elections. They can profile people based on the most mundane of information – the sheer scale of Facebook’s user-base and the amount of information given to them means that “big data” analysis can make connections that might seem bizarre, revealing insights into intelligence, politics, ethnicity and religion without people actually discussing any of those things directly.
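The kind of inference described above can be sketched with a toy model. This is an illustrative sketch only, assuming a simple Laplace-smoothed log-odds scorer over synthetic, hypothetical page names; it is not Facebook's or any researcher's actual model, but it shows how binary "likes" alone can come to predict an unrelated trait.

```python
import math
from collections import defaultdict

def train(profiles):
    """profiles: list of (set_of_liked_pages, has_trait) pairs.
    Returns a Laplace-smoothed log-odds score per page."""
    counts = defaultdict(lambda: [1, 1])  # [likers with trait, likers without]
    for likes, trait in profiles:
        for page in likes:
            counts[page][0 if trait else 1] += 1
    return {page: math.log(c[0] / c[1]) for page, c in counts.items()}

def predict(scores, likes):
    """Positive summed log-odds => predict the trait is present."""
    return sum(scores.get(page, 0.0) for page in likes) > 0

# Synthetic training data: liking "page_a" happens to correlate with the trait.
training = [
    ({"page_a", "page_b"}, True),
    ({"page_a"}, True),
    ({"page_c"}, False),
    ({"page_b", "page_c"}, False),
]
scores = train(training)
print(predict(scores, {"page_a"}))  # a new user who likes only page_a
```

At Facebook's scale the same idea runs over millions of pages and billions of likes, which is why even mundane signals become revealing in aggregate.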

They allow advertisements to be targeted to particular “racial affinity” groups – or tailored according to “racial affinity”. Not actual race, because that might conflict with various laws, but the race that your profile suggests you have the most “affinity” towards. Racial profiling without the name.

Video caption: Chris Wylie tells Channel 4 News data for 50 million Facebook profiles was obtained

This all seems relatively harmless when it is just restricted to advertising for products – it might be a bit creepy to find carefully targeted advertisements for holidays in places you like or musicians you admire – but a few changes in parameters for targeting change things completely. The benefits of this profiling for electoral purposes are huge. Profiling the electorate has long been a part of political campaigning, but this makes it much more detailed, much more surreptitious, and much more effective.

Parties can target and influence voters directly; make their own supporters happier and opponents’ supporters sadder; make their own more likely to vote. They can spread stories tailored to individuals’ views, focussing on the aspects of a campaign that they know the individual cares about – both positively and negatively. When you add “fake news” to this, the situation becomes even worse.

That is the real point here. When thought about in terms of profiling and micro-targeting advertising for products, this just sounds efficient, appropriate and harmless. It is a tiny shift, however, to take this into politics – and a shift that groups like Cambridge Analytica found easy to make. All they did was understand how Facebook works, and use it. On a big scale, and in a big way, but this is how Facebook works. Profiling, targeting and persuasive manipulation are the tools of the advertiser on steroids, provably effective and available to those with the money and intent to use them. Unless Facebook changes its entire business model, it will be used in ways that interfere with our lives – and in particular with our politics.

What is more, it is only going to get more effective. Facebook is gathering more data all the time – including through its associated operations in Instagram, WhatsApp and more. Its analyses are being refined and becoming more effective all the time – and more people like Cambridge Analytica are becoming aware of the possibilities and how they might be used.

How it might be stopped, or at least slowed down, is another matter. This is all based on the fundamental operations of Facebook, so while we rely on Facebook, it is hard to see a way out. By choosing to let ourselves become dependent, we have built this trap for ourselves. Until we find our way out of Facebook, more of this is inevitable.

Paul Bernal is Senior Lecturer in IT, IP and Media Law, UEA Law School


How did Cambridge Analytica get 50 million people's data?

With a little outside help. To understand the story, we need to rewind to 2014, when Aleksandr Kogan -- a psychology researcher at Cambridge University -- created a Facebook app called "thisisyourdigitallife" with a personality test that spat out some kind of personal prediction at the end. To build credibility, the app was originally labeled a "research app used by psychologists," but that was only part of the truth -- as it turned out, Cambridge Analytica covered more than $800,000 of Kogan's app development costs. (Kogan also got to keep a copy of the resulting data for his trouble.) Some US Facebook users took the personality test as a result of ads on services like Amazon's Mechanical Turk and were paid for their efforts, but it's unclear how many chose to take the test on their own.

All told, some 270,000 US users took the test, but CA obviously walked away with data on many more people than that. That's thanks to a very specific Facebook peculiarity.

If you have a Facebook account, you've almost certainly used Facebook Login before -- it lets you create an account with a third-party app or service (or log into an existing account) with your Facebook credentials. It's incredibly convenient, but by using Facebook Login, you're tacitly giving developers of Facebook apps access to certain kinds of information about yourself -- email address and public profile data, for instance, are available to developers by default.

In 2014, however, using Facebook Login didn't just mean you were offering up your own data -- some data about the people in your social network was up for grabs too. (Facebook later deprecated the API that let this happen because, well, it's just creepy.) Those thousands of people who logged in to Kogan's app and took the test might have gotten the personality predictions they were looking for, but they paid for them with information about their friends and loved ones. Whether those results were ultimately valuable is another story. Kogan himself later said in an email to Cambridge coworkers (recently obtained by CNN) that he had provided "predicted personality scores" of 30 million American users to CA's parent company, but the results were "6 times more likely to get all 5 of a person's personality traits wrong as it was to get them all correct."
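The scale involved follows from simple arithmetic. A back-of-envelope sketch, with the average-friends figure assumed purely for illustration (it is not a number reported in the story):

```python
# Back-of-envelope arithmetic (assumed figures): how roughly 270,000 quiz
# takers could expose on the order of 50 million profiles under the old
# friends-data permission. The average-friends value is an assumption
# chosen only to show the scale of the multiplier.
app_users = 270_000
avg_unique_friends = 185   # assumed unique (non-overlapping) friends per user

reachable = app_users * (1 + avg_unique_friends)  # quiz takers plus their friends
print(f"{reachable:,}")  # prints 50,220,000
```

The point is not the exact figure but the multiplier: each consenting user silently brought a couple of hundred non-consenting people with them.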

Was this really a data breach?

For better or worse, no. Facebook's official line is that calling this a breach is "completely false," since the people who signed up for Kogan's app did so willingly. As a result, Facebook argues, the information gained through those app logins was obtained within the scope of its guidelines. In other words, despite how shady all of this seems, the system worked exactly the way it was supposed to. The breakdown happened later, when Kogan broke Facebook's rules and provided that information to Cambridge Analytica.

What has Facebook done about all this?

When all of this went down, very little -- in public, anyway. In a statement in its online newsroom, Facebook admits that it learned about Kogan and Cambridge Analytica's "violation" in 2015 and "demanded certifications from Kogan and all parties he had given data to that the information had been destroyed." As it turns out, some of that personal data might not have been deleted after all -- Facebook says it is "aggressively" trying to determine whether that's true.

More troubling is the fact that, as noted by Guardian reporter Carole Cadwalladr in an interview with CBS, Facebook never contacted any of the users involved. (She also added that Facebook threatened to sue The Guardian to prevent an exposé from being published, which obviously isn't a good look.) Facebook VR/AR VP Andrew "Boz" Bosworth posted a rough timeline of the events (along with answers to certain FB-centric questions) earlier today, and it seems likely that timeline will remain a point of focus as investigations continue.

Finally, on March 16, a day before many of the biggest Cambridge Analytica stories broke, Facebook suspended accounts belonging to CA and its parent firm. The move is widely read as an attempt on Facebook's part to clean up some of the mess before The Guardian and The New York Times ran their most damning reports. Then, in a somewhat unexpected move, Facebook also disabled Christopher Wylie's account and prevented him from using WhatsApp, the popular messaging app Facebook acquired in 2014. (Consider this a brief reminder of how much of your social world Facebook currently owns.)

Beyond that, some Facebook execs spent the weekend asserting that there was no actual data breach. Meanwhile, CEO Mark Zuckerberg hasn't said anything about the unfolding situation, though we can imagine his silence can't last for too much longer.

So what happens now?

Scrutiny, and lots of it. Now that all of this is out in the open, powerful people are taking an interest. On March 18, Senator Amy Klobuchar (D-Minnesota) tweeted that Facebook CEO Mark Zuckerberg should testify in front of the Senate Judiciary Committee, adding that "it's clear these platforms can't police themselves." Senator Ron Wyden (D-Oregon) sent a letter (PDF) to Zuckerberg the following day, requesting information like the number of times similar incidents have occurred within the past ten years and whether Facebook has ever notified "individual users about inappropriate collection, retention or subsequent use of their data by third parties."

Facebook breach: This is a major breach that must be investigated. It's clear these platforms can't police themselves. I've called for more transparency & accountability for online political ads. They say "trust us." Mark Zuckerberg needs to testify before Senate Judiciary. — Amy Klobuchar (@amyklobuchar) March 17, 2018

Meanwhile, in the UK, Cambridge Analytica CEO Alexander Nix recently told Parliament's Digital, Culture, Media and Sport Committee that the company did not collect people's personal information through Facebook without consent. Committee chairman Damian Collins has since accused the CEO of making false statements and has called CA whistleblower Christopher Wylie to give evidence to Parliament. Even more promising, UK Information Commissioner Elizabeth Denham confirmed that she is seeking a warrant to access Cambridge Analytica's servers.

BREAKING: Damian Collins, chair of UK parliament's news inquiry, has called Cambridge Analytica whistleblower @chrisinsilico to give evidence next week to parliament. I predict: fireworks. — Carole Cadwalladr (@carolecadwalla) March 19, 2018

And as far as Facebook is concerned, there are few people more powerful than its shareholders. We wouldn't be surprised to see Facebook's financials bounce around for a while, too -- as I write this, the company's share price is down nearly 7 percent, shaving billions of dollars off Facebook's market cap (and eating away at Zuckerberg's net worth). The entire core of its business is built on a foundation of trust with its users, and incidents like this can do serious damage to that trust.

This is scary -- should I keep using Facebook?

Honestly, you should at least give serious consideration to deleting your account. If you're a Facebook user, then you and all of your Facebooking friends are collectively the single most valuable thing the company has. Its fortunes rise and fall when its user numbers ebb and flow. The old internet adage says "if you're not paying, you're the product," and this is a perfect example of that.

The data we offer Facebook freely is a commodity to be accessed, mashed up, scraped and targeted against. For some, the value of the platform is enough to override the dangers. There's nothing wrong with that, and it's worth taking a few minutes to dig into your account's privacy, app and ad settings to limit the amount of data you unknowingly offer to the machine. But there's also nothing wrong with saying enough is enough. None of us will cough up a cent to get bombarded by fake news, inane quizzes and game requests. But that doesn't mean we aren't paying for Facebook.
