The Cambridge Analytica scandal isn’t a scandal: this is how Facebook works



It's a sensational story involving allegations of sleaze, psychological manipulation and data misuse that has provoked a furious international response.

Tech giant Facebook and data analytics firm Cambridge Analytica are at the centre of a dispute over the harvesting and use of personal data - and whether it was used to influence the outcome of the 2016 US presidential election or the UK's Brexit referendum.

Both firms deny any wrongdoing.

What sleazy tactics has Cambridge Analytica been accused of?

Channel 4 News sent an undercover reporter to meet executives from data analytics firm Cambridge Analytica.

The firm had been credited with helping Donald Trump to presidential victory.

The reporter posed as a Sri Lankan businessman wanting to influence a local election.

Cambridge Analytica boss Alexander Nix was apparently filmed giving examples of how his firm could discredit political rivals by arranging various smear campaigns, including setting up encounters with prostitutes and staging situations in which apparent bribery could be caught on camera.

Video caption: Alexander Nix, CEO, Cambridge Analytica: "These sort of tactics are very effective"

The firm denies all the claims and says the documentary was "edited and scripted to grossly misrepresent the nature of those conversations". It claims the conversations were led by the reporters.

"I must emphatically state that Cambridge Analytica does not condone or engage in entrapment, bribes or so-called 'honeytraps', and nor does it use untrue material for any purpose," said Mr Nix.

What was Facebook's role?

In 2014 a quiz on Facebook invited users to find out their personality type.


It was developed by University of Cambridge academic Aleksandr Kogan (the university has no connections with Cambridge Analytica).

As was common with apps and games at that time, it was designed to harvest not only the user data of the person taking part in the quiz, but also the data of their friends.

Facebook has since changed the amount of data developers can scrape in this way.

Christopher Wylie, who worked with Cambridge Analytica, alleges that because 270,000 people took the quiz, the data of some 50 million users, mainly in the US, was harvested without their explicit consent via their friend networks.

Mr Wylie claims the data was sold to Cambridge Analytica, which then used it to psychologically profile people and deliver pro-Trump material to them.

Cambridge Analytica denies any of it was used as part of the services it provided to the Trump campaign.

Is this against Facebook's terms?

The data was gathered through Facebook's infrastructure as it worked at the time, and many other developers had taken advantage of it - but developers were not authorised to share that data with others.

The other key point is that even the people directly taking part in the personality quiz would have had no idea that they were potentially sharing their data with Donald Trump's election campaign.

Facebook says that when it learned its rules had been breached, it removed the app and demanded assurances that the information had been deleted.

Cambridge Analytica claims that it never used the data, and deleted it when Facebook told it to.

Both Facebook and the UK Information Commissioner want to find out whether it was properly destroyed, as Mr Wylie claims it was not.

What has the official response been?


US senators have called on Mark Zuckerberg to testify before Congress about how Facebook will protect users.

The head of the European Parliament said it would investigate to see if the data was misused.

A spokesman for Prime Minister Theresa May said she was "very concerned" about the revelations.

How can you protect your data?

There are a few things to be aware of if you want to restrict who has access to your data.

Keep an eye on apps, especially those which require you to log in using your Facebook account - they often have a very wide range of permissions and many are specifically designed to pick up your data

Use an ad blocker to limit advertising

Look at your Facebook security settings and make sure you are aware of what is enabled. Check the individual app settings to see whether you have given them permission to view your friends as well as yourself (for one way to review these permissions programmatically, see the sketch at the end of this section).

You can download a copy of the data Facebook holds on you, although it is not comprehensive. There is a download button at the bottom of the General Account Settings tab. However, bear in mind that if your device is hacked, your data may be less secure sitting on your laptop than it is on Facebook's servers.

You can, of course, simply leave Facebook, but the campaign group Privacy International warns that privacy concerns extend beyond the social network.

"The current focus is on protecting your data being exploited by third parties, but your data is being exploited all the time," said a spokeswoman.

"Many apps on your phone will have permission to access location data, your entire phone book and so on. It is just the tip of the iceberg."


Downing Street has expressed its concern about the Facebook data breach involving the analytics company that worked with Donald Trump’s campaign team and that affected tens of millions of people.

No 10 weighed in on the row as almost $20bn (£14bn) was wiped off the social network company’s market cap in the first few minutes of trading on the Nasdaq stock exchange, where Facebook opened down more than 3%. By midday, the company’s share price losses had multiplied to more than $40bn, making the day its worst in more than five years.

Theresa May’s spokesman said she backed an investigation by the information commissioner, which was prompted by a whistleblower who told the Observer how Cambridge Analytica had harvested millions of Facebook profiles to influence voters through “psychographic” targeting.

Video (3:41): Everything you need to know about the Cambridge Analytica exposé – video explainer

The European parliament president, Antonio Tajani, said on Monday that the institution would investigate fully. Tajani urged the social media company to take more responsibility, saying on Twitter that “allegations of misuse of Facebook user data is an unacceptable violation of our citizens’ privacy rights”.

In the US, a state attorney general has called for investigations, greater accountability and regulation, while the head of the parliamentary committee in the UK investigating fake news accused Cambridge Analytica and Facebook of misleading MPs, with the secretary of state for digital, culture, media and sport warning of an end to the “wild west” of technology firms.


“The allegations are clearly very concerning. It’s essential people can have confidence that their personal data can be protected and used in an appropriate way,” May’s spokesman said.

“So it is absolutely right the information commissioner is investigating this matter and we expect Facebook, Cambridge Analytica and all the organisations involved to cooperate fully.”

In a statement Facebook said: “We have hired a digital forensics firm, Stroz Friedberg, to conduct a comprehensive audit of Cambridge Analytica. Cambridge Analytica has agreed to comply and afford the firm complete access to their servers and systems. We remain committed to vigorously enforcing our policies to protect people’s information.”

Cambridge Analytica said: “In 2014 we received Facebook data and derivatives of Facebook data from another company, GSR, that we engaged in good faith to legally supply data for research. After it subsequently became known that GSR had broken its contract with Cambridge Analytica because it had not adhered to data protection regulation, Cambridge Analytica deleted all the Facebook data and derivatives, in cooperation with Facebook.”

The culture secretary, Matt Hancock, told the House of Commons that the revelations were “clearly very worrying” and the government was considering additional powers that the commissioner had proposed, including powers to impose further criminal sanctions and to compel testimony from individuals. Hancock said he was shocked at the speed at which Facebook had barred the whistleblower, Chris Wylie, from its platforms, including WhatsApp.

“I thought it was outrageous,” he said. “Facebook have some serious questions to answer here, and they will tell their side of the story. And to answer it by blocking an account, when we know in this house they do not act fast enough to block other accounts of obviously outrageous behaviour, I’ll tell you what, it shows that when they need to they can block things incredibly quickly and they will need to do a lot more than that.”

However, Hancock said he had not seen any evidence that the activities had had an effect on the outcome of any election or referendum.

Labour’s Stephen Kinnock said that if Cambridge Analytica were proved to have been “in flagrant breach of our electoral rules, that would place a pretty huge question mark over the referendum result”.

Damian Collins MP, who chairs the digital, culture, media and sport select committee, said he would call the heads of both companies, Mark Zuckerberg and Alexander Nix, to give further testimony.

“We need to hear from people who can speak about Facebook from a position of authority that requires them to know the truth,” Collins said. “Someone has to take responsibility for this. It’s time for Mark Zuckerberg to stop hiding behind his Facebook page.”

Collins suggested the powers of the information commissioner should be beefed up to add the legal ability to force companies to provide information.

Downing Street said it would consider any formal requests to give new powers to the commissioner, Elizabeth Denham, but added her powers had only recently been reviewed.

“I haven’t seen any formal requests. The information commissioner does have significant powers, which have been enhanced in recent times, but if any formal requests were made to us I’m sure we’d consider it,” the spokesman said.

Hancock told Collins’s committee last week that following Brexit he would like to review legislation governing social media. On Monday the minister expanded on his words, telling the Telegraph that “the wild west for tech companies is over”.

Last month both Facebook representatives and Nix had told the parliamentary inquiry into fake news that the company did not have or use private Facebook data, or any data from Global Science Research (GSR). But in its statement on Friday night, explaining why it had suspended Cambridge Analytica and Wylie, Facebook said it had known in 2015 that profiles were passed to Nix’s company.

Profile: Alexander Nix, CEO of Cambridge Analytica

Name: Alexander James Ashburner Nix
Age: 42
Education: Eton, then Manchester University, where he studied History of Art
Career: Nix worked as a financial analyst in Mexico and the UK before joining SCL, a strategic communications firm, in 2003. From 2007 he took over the company’s elections division, and claims to have worked on more than 40 campaigns globally. Many of SCL’s projects are secret, so that may be a low estimate. He set up Cambridge Analytica to work in America, with investment from US hedge fund billionaire Robert Mercer. He has been both hailed as a visionary - featuring on Wired’s list of “25 Geniuses who are creating the future of business” - and derided as a snake oil salesman.
Controversies: Cambridge Analytica has come under scrutiny for its role in elections on both sides of the Atlantic, working on Brexit and Donald Trump’s election team. It is a key subject in two inquiries in the UK - by the Electoral Commission, into the firm’s possible role in the EU referendum, and by the Information Commissioner’s Office, into data analytics for political purposes - and one in the US, as part of special counsel Robert Mueller’s probe into Trump-Russia collusion. The Observer revealed this week that the company had harvested millions of Facebook profiles of US voters, in one of the tech giant’s biggest ever data breaches, and used them to build a powerful software program to predict and influence choices at the ballot box.

“In 2015 we learned that a psychology professor at the University of Cambridge named Dr Aleksandr Kogan lied to us and violated our ‘platform policies’ by passing data from an app that was using Facebook login to SCL/Cambridge Analytica,” the statement said.


Donald Trump’s 2016 presidential election campaign paid Cambridge Analytica more than $6.2m, according to US Federal Election Commission records, but it has denied using any Facebook data in the campaign.

In a now-deleted series of tweets, Facebook’s chief security officer, Alex Stamos, argued that the friend list data that Cambridge Analytica had acquired was obtained through an API, a feature that allows programs to interface with Facebook, that was well documented “in our terms of service, platform documentation, the privacy settings and the screen used to login to apps”.

On Saturday evening, however, he deleted his tweets, saying: “I should have done a better job weighing in. There are a lot of big problems that the big tech companies need to be better at fixing. We have collectively been too optimistic about what we build and our impact on the world. Believe it or not, a lot of the people at these companies, from the interns to the CEOs, agree.”


It is easy to be misled into believing that the Cambridge Analytica story is about rogue data miners taking advantage of an innocent Facebook. Facebook’s decision to suspend Cambridge Analytica’s access, the use of terms like “data breach”, and a good deal of coverage in the media seem to follow these lines. That, however, misses the key point. This is not a data breach by any means – nor is it something that could not have been predicted, or that could easily have been avoided. This is, in many ways, Cambridge Analytica using Facebook exactly as the social media platform was designed to be used. This is how Facebook works.

Three key parts of Facebook’s model come into play: gathering data from people in order to profile them, both en masse and individually; designing systems that allow that data to be used to target people for advertising and content; and then allowing third parties (generally advertisers) to use the data and those targeting systems for their own purposes. The power of these systems is often underestimated, but Facebook themselves know it, and have tested it in a number of ways.

They have demonstrated, through their “emotional contagion” experiment in 2014, that they can make people happier or sadder, simply by manipulating the order things appear in people’s timelines. They have demonstrated that they can make people more likely to vote, testing it in the 2010 US congressional elections. They can profile people based on the most mundane of information – the sheer scale of Facebook’s user-base and the amount of information given to them means that “big data” analysis can make connections that might seem bizarre, revealing insights into intelligence, politics, ethnicity and religion without people actually discussing any of those things directly.
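To make the point about "mundane" information concrete, here is a toy sketch of the general technique: given a matrix of page likes and known trait labels for some users, a simple classifier learns to predict that trait for everyone else. Everything below is fabricated for illustration, and the model choice (scikit-learn logistic regression) is an assumption; this is not Facebook's or Cambridge Analytica's actual system.

```python
# Toy illustration of predicting a trait from page likes. All data here is
# fabricated; this shows the general technique, not any company's real model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_users, n_pages = 5000, 200
likes = rng.integers(0, 2, size=(n_users, n_pages))       # 1 = user liked that page

# Fabricated "ground truth": a binary trait that correlates with a handful of pages.
weights = np.zeros(n_pages)
weights[:10] = rng.normal(0, 1.5, size=10)
trait = (likes @ weights + rng.normal(0, 1, size=n_users)) > 0

X_train, X_test, y_train, y_test = train_test_split(likes, trait, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```

The point is not the particular model but the scale: with tens of millions of users and hundreds of thousands of pages, even weak correlations between individual likes and traits add up to usable predictions.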

They allow advertisements to be targeted to particular “racial affinity” groups – or tailored according to “racial affinity”. Not actual race, because that might conflict with various laws, but the race that your profile suggests you have the most “affinity” towards. Racial profiling without the name.

Cambridge Analytica: Chris Wylie tells Channel 4 News that data for 50 million Facebook profiles was obtained

This all seems relatively harmless when it is just restricted to advertising for products – it might be a bit creepy to find carefully targeted advertisements for holidays in places you like or musicians you admire – but a few changes in parameters for targeting change things completely. The benefits of this profiling for electoral purposes are huge. Profiling the electorate has long been a part of political campaigning, but this makes it much more detailed, much more surreptitious, and much more effective.

Parties can target and influence voters directly; make their own supporters happier and opponents’ supporters sadder; make their own more likely to vote. They can spread stories tailored to individuals’ views, focussing on the aspects of a campaign that they know the individual cares about – both positively and negatively. When you add “fake news” to this, the situation becomes even worse.

That is the real point here. When thought of in terms of profiling and micro-targeting advertising for products, this just sounds efficient, appropriate and harmless. It is a tiny shift, however, to take this into politics – and a shift that groups like Cambridge Analytica found easy to make. All they did was understand how Facebook works, and use it. On a big scale, and in a big way, but this is how Facebook works. Profiling, targeting and persuasive manipulation are the tools of the advertiser on steroids, provably effective and available to those with the money and intent to use them. Unless Facebook changes its entire business model, it will be used in ways that interfere with our lives – and, in particular, that interfere with our politics.

What is more, it is only going to get more effective. Facebook is gathering more data all the time – including through its associated operations in Instagram, WhatsApp and more. Its analyses are being refined and becoming more effective all the time – and more people like Cambridge Analytica are becoming aware of the possibilities and how they might be used.

How it might be stopped, or at least slowed down, is another matter. This is all based on the fundamental operations of Facebook, so while we rely on Facebook, it is hard to see a way out. By choosing to let ourselves become dependent, we have built this trap for ourselves. Until we find our way out of Facebook, more of this is inevitable.

Paul Bernal is Senior Lecturer in IT, IP and Media Law, UEA Law School


How did Cambridge Analytica get 50 million people's data?

With a little outside help. To understand the story, we need to rewind to 2014 when Aleksandr Kogan -- a psychology researcher at Cambridge University -- created a Facebook app called "thisisyourdigitallife" with a personality test that spit out some kind of personal prediction at the end. To build credibility, the app was originally labeled a "research app used by psychologists," but that was only part of the truth -- as it turned out, Cambridge Analytica covered more than $800,000 of Kogan's app development costs. (Kogan also got to keep a copy of the resulting data for his trouble.) Some US Facebook users took the personality test as a result of ads on services like Amazon's Mechanical Turk and were paid for their efforts, but it's unclear how many chose to take the test on their own.

All told, some 270,000 US users took the test, but CA obviously walked away with data on many more people than that. That's thanks to a very specific Facebook peculiarity.

If you have a Facebook account, you've almost certainly used Facebook Login before -- it lets you create an account with a third-party app or service (or log into an existing account) with your Facebook credentials. It's incredibly convenient, but by using Facebook Login, you're tacitly giving developers of Facebook apps access to certain kinds of information about yourself -- email address and public profile data, for instance, are available to developers by default.

In 2014, however, using Facebook Login didn't just mean you were offering up your own data -- some data about the people in your social network was up for grabs too. (Facebook later deprecated the API that let this happen because, well, it's just creepy.) Those thousands of people who logged in to Kogan's app and took the test might have gotten the personality predictions they were looking for, but they paid for them with information about their friends and loved ones. Whether those results were ultimately valuable is another story. Kogan himself later said in an email to Cambridge coworkers (recently obtained by CNN) that he had provided "predicted personality scores" of 30 million American users to CA's parent company, but the results were "6 times more likely to get all 5 of a person's personality traits wrong as it was to get them all correct."
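To see how little the app itself had to do, here is a rough sketch of the kind of Graph API calls a v1.0-era app could make once a single user had completed Facebook Login. The permission and field names are meant to evoke the old, now-removed friends_* behaviour; treat the exact endpoints and fields as illustrative assumptions, and note that none of this works against today's API.

```python
# Rough sketch of how a v1.0-era Facebook app could pull friend data once one
# user had logged in with Facebook Login. Field names evoke the old, now-removed
# friends_* permissions; illustrative only, and will not run against today's API.
import requests

GRAPH_V1 = "https://graph.facebook.com/v1.0"
USER_TOKEN = "TOKEN_OBTAINED_VIA_FACEBOOK_LOGIN"  # granted when the quiz-taker consents

def get(path, **params):
    params["access_token"] = USER_TOKEN
    resp = requests.get(f"{GRAPH_V1}/{path}", params=params)
    resp.raise_for_status()
    return resp.json()

# 1. The person who actually installed the app: their own profile and likes.
me = get("me", fields="id,name,location,birthday")
my_likes = get("me/likes")

# 2. Their friends - people who never saw a consent screen - exposed through the
#    friends_* permissions the installer agreed to at login.
friends = get("me/friends", fields="id,name,location,likes")

print(f"App installer: {me['name']}")
print(f"Friends whose data came along for the ride: {len(friends.get('data', []))}")
```

Crucially, only the installer ever saw a consent screen; their friends' data arrived as a side effect of that one login.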

Was this really a data breach?

For better or worse, no. Facebook's official line is that calling this a breach is "completely false," since the people who signed up for Kogan's app did so willingly. As a result, the information gained through those app logins was obtained within the scope of Facebook's guidelines. In other words, despite how shady all of this seems, the system worked exactly the way it was supposed to. The breakdown happened later, when Kogan broke Facebook's rules and provided that information to Cambridge Analytica.

What has Facebook done about all this?

When all of this went down, very little -- in public, anyway. In a statement in its online newsroom, Facebook admits that it learned about Kogan and Cambridge Analytica's "violation" in 2015 and "demanded certifications from Kogan and all parties he had given data to that the information had been destroyed." As it turns out, some of that personal data might not have been deleted after all -- Facebook says it is "aggressively" trying to determine whether that's true.

More troubling is the fact that, as noted by Guardian reporter Carole Cadwalladr in an interview with CBS, Facebook never contacted any of the users involved. (She also added that Facebook threatened to sue The Guardian to prevent an exposé from being published, which obviously isn't a good look.) Facebook VR/AR VP Andrew "Boz" Bosworth posted a rough timeline of the events (along with answers to certain FB-centric questions) earlier today, and it seems likely that timeline will remain a point of focus as investigations continue.

Finally, on March 16, a day before many of the biggest Cambridge Analytica stories broke, Facebook suspended accounts belonging to CA and its parent firm. The move is widely read as an attempt on Facebook's part to clean up some of the mess before The Guardian and The New York Times ran their most damning reports. Then, in a somewhat unexpected move, Facebook also disabled Christopher Wylie's account and prevented him from using WhatsApp, the popular messaging app Facebook acquired in 2014. (Consider this a brief reminder of how much of your social world Facebook currently owns.)

Beyond that, some Facebook execs spent the weekend asserting that there was no actual data breach. Meanwhile, CEO Mark Zuckerberg hasn't said anything about the unfolding situation, though we can imagine his silence can't last for too much longer.

So what happens now?

Scrutiny, and lots of it. Now that all of this is out in the open, powerful people are taking an interest. On March 18, Senator Amy Klobuchar (D-Minnesota) tweeted that Facebook CEO Mark Zuckerberg should testify in front of the Senate Judiciary Committee, adding that "it's clear these platforms can't police themselves." Senator Ron Wyden (D-Oregon) sent a letter (PDF) to Zuckerberg the following day, requesting information like the number of times similar incidents have occurred within the past ten years and whether Facebook has ever notified "individual users about inappropriate collection, retention or subsequent use of their data by third parties."

Facebook breach: This is a major breach that must be investigated. It's clear these platforms can't police themselves. I've called for more transparency & accountability for online political ads. They say "trust us." Mark Zuckerberg needs to testify before Senate Judiciary. — Amy Klobuchar (@amyklobuchar) March 17, 2018

Meanwhile, in the UK, Cambridge Analytica CEO Alexander Nix recently told Parliament's Digital, Culture, Media and Sport Committee that the company did not collect people's personal information through Facebook without consent. Since that appears not to have been true, committee chairman Damian Collins has accused the CEO of peddling false statements and has called CA whistleblower Christopher Wylie to offer evidence to Parliament. Even more promising, UK Information Commissioner Elizabeth Denham confirmed that she is seeking a warrant to access Cambridge Analytica's servers.

BREAKING: Damian Collins, chair of UK parliament's news inquiry, has called Cambridge Analytica whistleblower @chrisinsilico to give evidence next week to parliament. I predict: fireworks. — Carole Cadwalladr (@carolecadwalla) March 19, 2018

And as far as Facebook is concerned, there are few people more powerful than its shareholders. We wouldn't be surprised to see Facebook's financials bounce around for a while, too -- as I write this, the company's share price is down nearly 7 percent, shaving billions of dollars off Facebook's market cap (and eating away at Zuckerberg's net worth). The entire core of its business is built on a foundation of trust with its users, and incidents like this can do serious damage to that trust.

This is scary -- should I keep using Facebook?

Honestly, you should at least give serious consideration to deleting your account. If you're a Facebook user, then you and all of your Facebooking friends are collectively the single most valuable thing the company has. Its fortunes rise and fall when its user numbers ebb and flow. The old internet adage says "if you're not paying, you're the product," and this is a perfect example of that.

The data we offer Facebook freely is a commodity to be accessed, mashed up, scraped and targeted against. For some, the value of the platform is enough to override the dangers. There's nothing wrong with that, and it's worth taking a few minutes to dig into your account's privacy, app and ad settings to limit the amount of data you unknowingly offer to the machine. But there's also nothing wrong with saying enough is enough. None of us will cough up a cent to get bombarded by fake news, inane quizzes and game requests. But that doesn't mean we aren't paying for Facebook.
