



It's a sensational story of alleged sleaze, psychological manipulation and data misuse, and it has provoked a furious international response.

Tech giant Facebook and data analytics firm Cambridge Analytica are at the centre of a dispute over the harvesting and use of personal data - and whether it was used to influence the outcome of the 2016 US presidential election or the UK's Brexit referendum.

Both firms deny any wrongdoing.

How has Cambridge Analytica been accused of sleazy tactics?

Channel 4 News sent an undercover reporter to meet executives from data analytics firm Cambridge Analytica, following reports by the journalist Carole Cadwalladr in the Observer newspaper.

The firm had been credited with helping Donald Trump to presidential victory.

The reporter posed as a Sri Lankan businessman wanting to influence a local election.

Cambridge Analytica boss Alexander Nix was apparently filmed giving examples of how his firm could discredit political rivals by arranging various smear campaigns, including setting up encounters with prostitutes and staging situations in which apparent bribery could be caught on camera.

Video caption: Alexander Nix, CEO, Cambridge Analytica: "These sort of tactics are very effective"

The firm denies all the claims and says the documentary was "edited and scripted to grossly misrepresent the nature of those conversations". It claims the conversations were led by the reporters.

"I must emphatically state that Cambridge Analytica does not condone or engage in entrapment, bribes or so-called 'honeytraps', and nor does it use untrue material for any purpose," said Mr Nix.

What was Facebook's role?

In 2014 a quiz on Facebook invited users to find out their personality type.


It was developed by University of Cambridge academic Aleksandr Kogan (the university has no connection with Cambridge Analytica).

As was common with apps and games at that time, it was designed to harvest not only the user data of the person taking part in the quiz, but also the data of their friends.

Facebook has since changed the amount of data developers can scrape in this way.

Christopher Wylie, who worked with Cambridge Analytica, alleges that because 270,000 people took the quiz, the data of some 50 million users, mainly in the US, was harvested without their explicit consent via their friend networks.
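How one quiz could fan out so widely is worth spelling out. Under the Graph API of that era (v1.0, retired in 2015), an app authorised by a single user could request permissions covering that user's friends as well. Below is a minimal sketch of the pattern, assuming a hypothetical access token and the since-removed v1.0-era friend permissions; current Graph API versions no longer expose friends' data this way.

```python
import requests

GRAPH = "https://graph.facebook.com/v1.0"  # API version retired in 2015
TOKEN = "USER_ACCESS_TOKEN"  # hypothetical token granted by one quiz-taker

def fetch_friends_likes(token):
    """Pull the quiz-taker's friend list, then each friend's likes.

    This pattern only worked while v1.0-era friend permissions existed;
    it is shown for illustration, not as a working endpoint today.
    """
    friends = requests.get(
        f"{GRAPH}/me/friends",
        params={"access_token": token},
    ).json().get("data", [])

    harvested = {}
    for friend in friends:
        likes = requests.get(
            f"{GRAPH}/{friend['id']}/likes",
            params={"access_token": token},
        ).json().get("data", [])
        harvested[friend["id"]] = [page.get("name") for page in likes]
    return harvested
```

The fan-out also explains the arithmetic below: 50 million profiles from 270,000 quiz-takers works out to roughly 185 friends exposed per consenting user.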

Mr Wylie claims the data was sold to Cambridge Analytica, which then used it to psychologically profile people and deliver pro-Trump material to them.

Cambridge Analytica denies any of it was used as part of the services it provided to the Trump campaign.

Is this against Facebook's terms?

The data was gathered using Facebook's infrastructure as it existed at the time, and many other developers took advantage of it - but developers were not authorised to share the data with anyone else.

The other key point is that even the people directly taking part in the personality quiz would have had no idea that they were potentially sharing their data with Donald Trump's election campaign.

Facebook says that when it learned its rules had been breached, it removed the app and demanded assurances that the information had been deleted.

Cambridge Analytica claims that it never used the data, and deleted it when Facebook told it to.

Both Facebook and the UK Information Commissioner want to find out whether it was properly destroyed, as Mr Wylie claims it was not.

What has the official response been?

Image caption: There are calls for Facebook founder Mark Zuckerberg to testify before Congress

US senators have called on Mark Zuckerberg to testify before Congress about how Facebook will protect users.

The head of the European Parliament said it would investigate whether the data had been misused.

A spokesman for Prime Minister Theresa May said she was "very concerned" about the revelations.

How can you protect your data?

There are a few things to be aware of if you want to restrict who has access to your data:

• Keep an eye on apps, especially those which require you to log in using your Facebook account - they often have a very wide range of permissions, and many are specifically designed to pick up your data.

• Use an ad blocker to limit advertising.

• Look at your Facebook security settings and make sure you know what is enabled. Check the individual app settings to see whether you have given apps permission to view your friends' data as well as your own.

• You can download a copy of the data Facebook holds on you, although it is not comprehensive; there is a download button at the bottom of the General Account Settings tab. Bear in mind, though, that if your device is hacked, your data may be less secure sitting on your laptop than on Facebook's servers.

You can, of course, simply leave Facebook, but the campaign group Privacy International warns that privacy concerns extend beyond the social network.

"The current focus is on protecting your data being exploited by third parties, but your data is being exploited all the time," said a spokeswoman.

"Many apps on your phone will have permission to access location data, your entire phone book and so on. It is just the tip of the iceberg."


Cambridge Analytica researcher touted data-mining in Russia speech

The researcher at the center of the Cambridge Analytica data-mining scandal touted controversial techniques at a lecture four years ago in Russia, despite downplaying the methods in an email to colleagues after his work attracted international scrutiny.

The Soviet-born researcher, Aleksandr Kogan, started working with Cambridge Analytica in 2014. That same year, he teamed up with students and researchers from St. Petersburg State University, one of the top schools in Russia, to pursue a data-harvesting project similar to the one that produced the data he sold to Cambridge Analytica.

Kogan provided data on tens of millions of Americans to SCL Group, the parent company of Cambridge Analytica, which worked on the Trump campaign.

Facebook has accused him of violating platform policies by passing Facebook user data he gained through an app on to third parties, including Cambridge Analytica. The tech giant also said Kogan lied to it about the true nature of his work by claiming the app was for academic research and not commercial use, something Kogan denies.

For his partnership with the university, Kogan traveled to Russia three times. In lectures to students, according to videos reviewed by CNN, he painted a different picture from the one in a recent email to colleagues in which he downplayed his work.


In the email, Kogan said he gave a few lectures at St. Petersburg State University "on how social media data CANNOT be used effectively to make individual-level predictions." He also said in the email that the predictions he gave SCL were just as likely to be wrong as they were to be right.

After this story was first published, Kogan said in an email to CNN, "Back in 2014, I gave quite a few talks about big data methods and what they are useful for. In these talks, I discussed both my own research and that of others. When speaking about the accuracy of the results, like understanding people better than their friends and family understand you, I was quoting research done by others in this space. But over the course of the next year, as I started to personally delve into the issue of prediction accuracy, I found the claims to have been grossly inflated. What we found ourselves was that the data isn't very accurate at the individual level at all. And so my talks in subsequent years started to reflect this. In fact, my 2016 talk in Russia talked about how I found that for any complex trait (like personality), we couldn't predict people well at all."

Kogan used to speak in glowing terms about the power of data from Facebook. He told the students in one May 2014 lecture, delivered in Russian, that data gleaned from social media unlocks key insights.

"The level of what can be predicted about you based on what you like on Facebook is higher than what your wife would say about you, what your parents or friends can say about you," Kogan said. "Even if we take your 10 best friends and they all give a description of who you are as a person and we combine it all together — this analysis method is still better."

"Your Facebook knows more about you than any other person in your life," he added.

Kogan told the students that "basically everything" about someone could be predicted from Facebook "likes" alone, including someone's intelligence, personality and well-being.

He said the data could be captured at a low price — "quick and cheap, this almost didn't cost anything" — and that such methods had the potential for major commercial benefits.
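Kogan's claim that "basically everything" can be predicted from likes alone describes a standard supervised-learning recipe: encode each user as a binary vector over pages, then fit a classifier against a labelled trait. Here is a minimal sketch on synthetic data, using scikit-learn's logistic regression; the published like-prediction studies used broadly similar linear models, and every number below is made up for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in data: 1,000 users x 500 pages; 1 = user liked the page.
n_users, n_pages = 1000, 500
likes = (rng.random((n_users, n_pages)) < 0.05).astype(int)

# Hypothetical binary trait, weakly driven by a handful of "signal" pages.
signal_pages = rng.choice(n_pages, size=10, replace=False)
trait = (likes[:, signal_pages].sum(axis=1) + rng.random(n_users) > 1).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    likes, trait, test_size=0.25, random_state=0
)

# Fit on the like vectors and check accuracy on held-out users.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

Whether such individual-level predictions are reliable enough to act on is exactly the point Kogan's later email walked back.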

The research conducted in Russia, in association with Kogan, included informed consent requirements that are standard for academic research. In the lecture, Kogan described how his app pulled information about likes, friends, ages, demographics — and more private details.

"It's messaging... this is private information, which no one sees," Kogan said. "You can also load all of that. We usually load 3,000 (messages) per person. And there they talk about everything."

It's not clear whether Kogan was boasting to eager students or if he actually accessed and saved private messages of some Facebook users.

"Four years ago, our Platform policy permitted people to share their own message inbox with developers," a Facebook spokesperson told CNN on Tuesday. "...Developer access to messages was enabled only if the person downloaded the app and explicitly approved the permission."

That feature was phased out in 2015, the spokesperson said.

In the email to his colleagues, Kogan decried recent reports about his work in Russia, and denied being a "Russian spy." He said his position at St. Petersburg State University was "mostly an honorary role" and that he only visited the school three times.

A person familiar with Kogan's work corroborated parts of his email, explaining that the St. Petersburg students saw Kogan as a "token Westerner" who could help them secure funding for their research. Projects are more likely to win approval when they have backing from Western academics, the person noted.

Previous reports suggested that Kogan received a grant directly from the government for his project. In fact, the person explained, the university is given a pool of money from Russia's federal budget and then decides which specific research projects get funded.

"At some point, he confused the role of academic researcher and businessman," the person said of Kogan's work with Cambridge Analytica. "Maybe he was too young and did something stupid with his data. He didn't come across as this evil, scheming individual."

The techniques Kogan used to harvest Americans' data for Cambridge Analytica are drawing scrutiny on both sides of the Atlantic. But they were not a secret while they happened: The Russian research team published an academic abstract online in March 2015, detailing the methods.

The press office for St. Petersburg State University did not have any comment on Tuesday.


It’s exactly the sort of conversation about politics that one would hope did not exist.

Two suits, in a swanky restaurant, blithely boasting about exploiting fears buried so deep inside our subconscious that most of us don’t even know we have them; claiming that, for the right price, they can creep invisibly into your head.


Channel 4’s sting, which showed executives at the digital marketing firm Cambridge Analytica telling undercover reporters what on earth it is they actually do, was just the final piece of this particular jigsaw. My indefatigable Observer colleague Carole Cadwalladr put in the hard yards, over months of investigating the firm that boasts of using a combination of data and behavioural science to laser-target voters and thus help put Donald Trump in the White House.

But until now it’s been difficult for many people to visualise how the unauthorised use of our personal data, or the use of social media profiles to manipulate our votes, looks in practice. It sounds bad, obviously. We clearly should care. But it’s all so complicated, and life is busy. The risks of letting tech giants plough through our holiday snaps seem abstract and remote compared with the instant gratification of social media likes and shares and gossip.

Well, it’s not so remote now. Facebook had $36bn wiped off its shares on Monday following Cadwalladr’s revelation that personal data from 50 million American Facebook users, obtained by an academic using privileged access granted for research purposes, was then passed on to Cambridge Analytica.

And now we know what sort of hands it ended up in. When approached by undercover reporters, posing as wealthy clients seeking to get chosen politicians elected in Sri Lanka, Cambridge Analytica executives suggested all sorts of dubious miracles might be possible. Maybe a rival could be made a financial offer he couldn’t refuse, with the resulting incriminating film posted online. Or maybe some beautiful women could be sent to his house.

But arguably more chilling was a conversation about just how deep its psychological profiling goes. Christopher Wylie, the whistleblower Cadwalladr worked with, has said the company used data harvested from prospective voters to “build models to exploit what we knew about them, and target their inner demons”. On camera we saw what that might mean: talk of operating subliminally, exploiting fears where “you didn’t know that was a fear until you saw something that just evoked that reaction”.

Cambridge Analytica’s chief executive, Alexander Nix, one of those caught on camera, now insists his company was indulging in “a certain amount of hyperbole” – either exaggerating to impress the client, or perhaps even quietly probing their intentions. Like lobbyists, his company seems to spend half its time telling clients of its power to influence elections and the other half telling journalists those powers are wildly overstated.


But at best, his company looks guilty of hype, and at worst, the months of legal threats and denials over Cadwalladr’s stories – not to mention the online abuse she got from people such as Arron Banks of the unofficial pro-Brexit campaign Leave.EU, said to have consulted Cambridge Analytica – stink.

None of this means Britain would have voted remain or Hillary Clinton would be president if it hadn’t been for these pesky kids, just as “corporate lobbying” can’t explain every poor government decision. Voters’ anger was real, and this scandal doesn’t absolve us from asking the hard questions about why so many responded to extreme messages. Since Cambridge Analytica’s business model is arguably just a supercharged version of something political parties have done for years – identifying potential supporters, compiling detailed pictures of what makes them tick, then tailor-making messages to different groups depending on what they want to hear – it may also turn out that all sides have been busily scraping data behind our backs.

But this feels like a tipping point. Britain's information commissioner is seeking a warrant to search Cambridge Analytica's servers. The Commons select committee inquiry into fake news will recall Nix, questioning whether he "deliberately misled" it in recent testimony on the use of the Facebook data, and seek fresh evidence from Facebook. Facebook's falling share price meanwhile reflects not just this scandal but a string of them, including its role in inadvertently spreading fake news. There is a growing sense that even if users don't take fright, regulators are losing patience.

And so they should. Like lobbying back in the days of cash-for-questions, data mining is a fast-growing business operating largely unseen on the fringe of politics, and while it can be used to respectable ends, it’s vulnerable to abuse. It clearly has the capacity to undermine our democratic process, even if it hasn’t done so yet. We should act before it’s too late.

• Gaby Hinsliff is a Guardian columnist and former political editor of the Observer


It is easy to be misled into believing that the Cambridge Analytica story is about rogue data miners taking advantage of an innocent Facebook. Facebook's decision to suspend Cambridge Analytica's access, the use of terms like "data breach", and a good deal of coverage in the media seem to follow these lines. That, however, misses the key point. This is not a data breach by any means – nor is it something that could not have been predicted or could easily have been avoided. This is, in many ways, Cambridge Analytica using Facebook exactly as the social media platform was designed to be used. This is how Facebook works.

Three key parts of Facebook’s model come into play: gathering data from people in order to profile them, both en masse and individually, designing systems that allow that data to be used to target people for advertising and content, then allowing third parties (generally advertisers) to use the data and those targeting systems for their own purposes. The power of these systems is often underestimated, but Facebook themselves know it, and have tested it in a number of ways.

They have demonstrated, through their "emotional contagion" experiment in 2014, that they can make people happier or sadder simply by adjusting what appears in people's news feeds. They have demonstrated that they can make people more likely to vote, testing it in the 2010 US congressional elections. They can profile people based on the most mundane of information – the sheer scale of Facebook's user base and the amount of information given to them mean that "big data" analysis can make connections that might seem bizarre, revealing insights into intelligence, politics, ethnicity and religion without people actually discussing any of those things directly.

They allow advertisements to be targeted to particular “racial affinity” groups – or tailored according to “racial affinity”. Not actual race, because that might conflict with various laws, but the race that your profile suggests you have the most “affinity” towards. Racial profiling without the name.

Video caption: Chris Wylie tells Channel 4 News that data from 50 million Facebook profiles was obtained

This all seems relatively harmless when it is just restricted to advertising for products – it might be a bit creepy to find carefully targeted advertisements for holidays in places you like or musicians you admire – but a few changes in parameters for targeting change things completely. The benefits of this profiling for electoral purposes are huge. Profiling the electorate has long been a part of political campaigning, but this makes it much more detailed, much more surreptitious, and much more effective.

Parties can target and influence voters directly; make their own supporters happier and opponents’ supporters sadder; make their own more likely to vote. They can spread stories tailored to individuals’ views, focussing on the aspects of a campaign that they know the individual cares about – both positively and negatively. When you add “fake news” to this, the situation becomes even worse.
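Mechanically, the tailoring described above is just scoring and filtering: run each profile through trait models like the one sketched earlier, then serve a different message variant to each segment. The sketch below is schematic, with scores, thresholds and variant names invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Profile:
    user_id: str
    anxiety_score: float   # hypothetical model output in [0, 1]
    turnout_score: float   # hypothetical likelihood-to-vote estimate

# Illustrative message variants keyed to inferred traits.
VARIANTS = {
    "fear": "message playing on issues the model thinks this user fears",
    "mobilise": "message nudging a likely supporter to actually vote",
    "default": "generic campaign message",
}

def pick_variant(p: Profile) -> str:
    """Choose which tailored message a profile receives."""
    if p.anxiety_score > 0.8:
        return VARIANTS["fear"]
    if p.turnout_score < 0.4:
        return VARIANTS["mobilise"]
    return VARIANTS["default"]

audience = [
    Profile("u1", anxiety_score=0.9, turnout_score=0.7),
    Profile("u2", anxiety_score=0.2, turnout_score=0.3),
]
for p in audience:
    print(p.user_id, "->", pick_variant(p))
```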

That is the real point here. When thought about in terms of profiling and micro-targeting advertising for products, this just sounds efficient, appropriate and harmless. It is a tiny shift, however, to take this into politics – and a shift that groups like Cambridge Analytica found easy to make. All they did was understand how Facebook works, and use it. On a big scale, and in a big way, but this is how Facebook works. Profiling, targeting and persuasive manipulation are the tools of the advertiser on steroids, provably effective and available to those with the money and intent to use them. Unless Facebook changes its entire business model, it will be used in ways that interfere with our lives – and, in particular, that interfere with our politics.

What is more, it is only going to get more effective. Facebook is gathering more data all the time – including through its associated operations in Instagram, WhatsApp and more. Its analyses are being refined and becoming more effective all the time – and more people like Cambridge Analytica are becoming aware of the possibilities and how they might be used.

How it might be stopped, or at least slowed down, is another matter. This is all based on the fundamental operations of Facebook, so while we rely on Facebook, it is hard to see a way out. By choosing to let ourselves become dependent, we have built this trap for ourselves. Until we find our way out of Facebook, more of this is inevitable.

Paul Bernal is Senior Lecturer in IT, IP and Media Law, UEA Law School
