Episode 260

The Cambridge Analytica Scandal

May 6, 2022
Science & Technology - 24 minutes

In March of 2018, it was revealed that 50 million Facebook profiles had been "hacked" and used to target American voters.

The story involved Donald Trump, a supposed Russian spy, a Cambridge researcher and a political consulting company.

The only problem was, how much of it was actually true?


Transcript

[00:00:00] Hello, hello hello, and welcome to English Learning for Curious Minds, by Leonardo English. 

[00:00:12] The show where you can listen to fascinating stories, and learn weird and wonderful things about the world at the same time as improving your English.

[00:00:22] I'm Alastair Budge, and today we are going to be talking about The Cambridge Analytica Scandal.

[00:00:29] It’s a story that brings together the power of Facebook, journalism, political prejudice, right vs. left, democracy, human psychology, free will, allegations of Russian spies working to influence foreign elections and a plot to put Donald Trump in the White House.

[00:00:49] The only problem is… 

[00:00:50] Is any of it even true? Right, let’s get started and talk about Cambridge Analytica.

[00:01:00] If you had bought the Guardian newspaper on March 18th of 2018, on the front cover you would have seen a picture of a 28-year-old man with short, pink hair.

[00:01:14] Next to him, the all-powerful headline:

[00:01:18] “Revealed: 50 million Facebook files taken in record data breach.” 

[00:01:25] A breach, by the way, means a break, a hole that has been made in order to get in and access something.

[00:01:34] The story went on to reveal that a company called Cambridge Analytica had used the personal Facebook data of 50 million people to target US citizens, build a - and I’m quoting directly here - “psychological warfare tool” which targeted people based on their stolen data, and ultimately swing the 2016 US presidential election in favour of Donald Trump.

[00:02:04] It was a powerful story, and it was all over the news. 

[00:02:09] You may well remember the story yourself.

[00:02:12] It created big problems for Facebook and for the company’s share price: Facebook lost $35 billion in value within a day, and over $100 billion in stock market value in the subsequent months.

[00:02:29] The company behind the supposed data breach, Cambridge Analytica, went out of business, and its former CEO was forced to testify in front of the British parliament.

[00:02:43] The journalist behind the story, Carole Cadwalladr, was invited to give a TED talk, and won the prestigious George Orwell prize for journalism for breaking this story.

[00:02:56] So, what we are going to try to answer in this episode is what actually happened? 

[00:03:03] Is democracy really under threat from bad actors using personal data to influence our voting behaviour? 

[00:03:11] Or is the entire story completely overblown?

[00:03:15] So, let’s start with what happened. 

[00:03:19] The main character in the story, without whom none of this would have been possible, was a man called Aleksandr Kogan. 

[00:03:27] He was born in the USSR, the former Soviet Union, but moved to the United States when he was 7 years old.

[00:03:35] Kogan excelled at school, and showed a remarkable talent for mathematics and physics. 

[00:03:43] At university he became increasingly interested in psychology, in understanding why humans feel certain emotions, and what makes us behave in the way we do.

[00:03:56] He had been working at Cambridge University since 2012, and saw from early on the potential power that a then newish technology company, Facebook, had. If you used Facebook back in 2012, I imagine it played a bigger, or at least more obvious, role in your life than it does now. 

[00:04:20] You used that “like” button to like anything from a status update from a friend to a “raising money for dog shelters” campaign. 

[00:04:29] You probably didn’t think too much about what happened after you clicked that “Like” button. After all, Facebook was a fun way to keep in touch with friends and family, share photos and generally see what was going on in the world. 

[00:04:46] To Kogan, however, it seemed like the most incredible dataset on humanity. 

[00:04:53] People like you and me were spending all this time on Facebook, liking stuff, interacting with it, giving Facebook data on what we like and what we don’t like, what we engage with and what we aren’t interested in.

[00:05:08] What’s more, Facebook could see who all of our friends were, so it had this amazing understanding of connections between all of its users.

[00:05:18] In 2014, Facebook passed 1.3 billion users, each of whom was spending an average of 40 minutes a day on the network. 

[00:05:30] So not only did it have a vast number of people using it, but they were using it a lot.

[00:05:37] To someone like Kogan, this was fascinating.

[00:05:41] What’s more, at the time, Facebook had recently started to allow third parties to develop applications on top of Facebook that would give them certain permissions to view data on Facebook users.

[00:05:56] If you can think back to this time, if you were a Facebook user that is, you might remember a load of quizzes where you granted the quiz access to your Facebook account, answered a few questions, and it would tell you things like which Harry Potter character you were most like or which member of the Beatles you would have been.

[00:06:16] Silly stuff, harmless fun. Or so most people thought.

[00:06:21] Indeed, back in 2014 these third-party applications were able to access some data not just about you but also about your friends.

[00:06:32] Someone might give access to a quiz to find out what type of pizza they are, agreeing to share their data with the app, but not realise that they were actually allowing the application to access information about their friends as well.
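For the technically curious, here is a rough sketch of how a 2014-era app might have requested this kind of data. The endpoint and field names are my approximations of Facebook’s old, since-retired Graph API, so treat it as an illustration rather than a working recipe.

```python
# Illustrative sketch only: roughly how a 2014-era quiz app could pull a
# user's own likes AND their friend list under old Graph API defaults.
# Endpoint and field names approximate the retired v1.0 API.
import requests

GRAPH = "https://graph.facebook.com/v1.0"
TOKEN = "USER_ACCESS_TOKEN"  # issued when the user authorised the quiz


def get(path: str, **params) -> dict:
    """Fetch one Graph API resource as JSON."""
    params["access_token"] = TOKEN
    response = requests.get(f"{GRAPH}/{path}", params=params)
    response.raise_for_status()
    return response.json()


# The user's own profile and likes: the data they knowingly shared.
me = get("me", fields="id,name,likes")

# Their friend list, which under 2014 defaults could include some of each
# friend's data too -- the part most users never realised they were sharing.
friends = get("me/friends", fields="id,name,likes").get("data", [])
```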

[00:06:49] Kogan’s question was, was any of this Facebook data actually any good at telling you anything useful about that person’s personality?

[00:06:59] If someone on Facebook liked “dogs, Mars bars and reading”, did this actually tell you anything about them as an individual?

[00:07:10] But Kogan was no thief, and he couldn’t just take this data from Facebook. Facebook users needed to give him their permission to use it. 

[00:07:20] And it’s here that we meet Cambridge Analytica. 

[00:07:24] Cambridge Analytica, which has no links whatsoever to Cambridge University, by the way, was a political consulting company founded in 2013.

[00:07:35] Its supposed speciality was advising political campaigns, advising politicians on how to use social media and data to put the right message in front of the right people at the right time.

[00:07:50] Cambridge Analytica paid Kogan to create an app that would collect data on Facebook users. 

[00:07:57] Cambridge Analytica would be able to use the data with its political clients, and Kogan could continue his academic research with the data he collected.

[00:08:08] A win-win situation, one where both parties benefited.

[00:08:13] So, Kogan, with Facebook’s permission, built a simple application called “This Is Your Digital Life”. 

[00:08:21] It was a quiz, essentially. 

[00:08:23] Users were offered three or four dollars to participate in the survey, and in exchange they gave the app permission to access their Facebook data.

[00:08:35] What they probably didn’t realise, though, was that they were actually giving the app permission to see some data on all of their friends.

[00:08:43] So if I authorised the app, if I took the quiz, the app would get data on all of my friends. If I had 500 friends, for example, the app got the data on those 500 friends.

[00:08:58] The quiz was taken by around 300,000 people, each one with hundreds of friends, meaning that Kogan now had Facebook data on almost 90 million people. The initial Guardian report said 50 million, but Facebook was later forced to admit that it had actually been almost double this number.
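The arithmetic behind that multiplication is worth a quick sketch. The average friend count below is invented for illustration; in reality, overlapping friend lists brought the unique total down to roughly 87 million.

```python
# Back-of-the-envelope arithmetic: how ~300,000 quiz takers can yield
# data on tens of millions of people. The friend count is invented.
quiz_takers = 300_000
avg_friends = 500

naive_reach = quiz_takers * avg_friends
print(f"Naive reach: {naive_reach:,}")  # 150,000,000 if no friend lists overlapped

# Friend lists overlap heavily in practice, so the number of *unique*
# profiles was lower -- roughly 87 million in this case.
```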

[00:09:19] Getting this data wasn’t cheap, as the app had to incentivise people to take the personality test. Cambridge Analytica provided the financing for it, about a million dollars, believing that the data that would come out at the end of it would be significantly more valuable.

[00:09:38] Now, let’s just pause to address a couple of points first, before we move on with the story.

[00:09:44] Kogan had Facebook’s permission to create the quiz application. There’s no dispute about that.

[00:09:51] What is disputed is whether he was given permission to sell this data on to third parties: Kogan says he was; Facebook says he wasn’t.

[00:10:01] Where it does become a little blurry, a little less clear, is about how the data on Facebook users was accessed, and the use of the term “breach”.

[00:10:12] As a reminder, a breach is when someone gains unauthorised access to something.

[00:10:18] At the time, Facebook’s default settings meant that if you gave an app permission to access your data, it could also get data on all of your friends. 

[00:10:29] The Facebook users likely didn’t understand exactly what they were giving their permission for, but Kogan’s app was one of hundreds, thousands even, that were collecting very similar amounts of data.

[00:10:43] So Kogan wasn’t doing anything illegal, and indeed he was doing what practically every other Facebook app was doing. 

[00:10:52] Ok, so with those clarifications out of the way, let’s return to our story.

[00:10:58] You might be thinking, what was Kogan actually getting out of all of this?

[00:11:03] Was he really a Russian agent, was he a closet Trump supporter, did he want to wreak havoc in foreign elections and provide data that could be used to undermine democracy?

[00:11:15] That would make for a good story, but unfortunately it doesn’t seem to be true.

[00:11:21] He says that he was unaware that the data would be used for political targeting, and his interest in doing all of this was primarily from a research perspective.

[00:11:33] He wanted to know whether all of this data he was collecting on people - what they liked, where they lived, what they did - actually helped predict anything about their personalities.

[00:11:46] Remember, he is an amazingly talented scientist with an interest in human psychology. He wanted to know how human behaviour and personality can be predicted.

[00:11:58] And as for Cambridge Analytica, if Kogan’s work proved able to predict human behaviour and personality, it would be amazingly valuable. 

[00:12:08] Imagine knowing that if someone liked Mars bars, dogs and reading, they would be particularly receptive to a certain type of political message, and that if someone liked singing and learning languages, and had a birthday in December, they would be receptive to another type of political message.

[00:12:29] Armed with this information on people, and with Facebook’s ability to reach over a billion users, including the vast majority of voters in the United States, this would be an incredibly powerful weapon.

[00:12:44] But how could you actually figure out whether the data was useful or not?

[00:12:49] Well, that’s where the personality quiz came in. 

[00:12:54] After someone gave access to Kogan’s application, they needed to fill out the personality quiz. 

[00:13:01] Sure, you can say that people aren’t very good at assessing their own personality, but as there was no reason for someone to fill in a question incorrectly, and the test was completely randomised, this data should have been accurate.

[00:13:17] After the Facebook user had completed the quiz, Kogan’s algorithm looked at their answers and classified them by five different personality traits, the so-called “Big Five” - such as whether they were an extrovert or introvert, how open they were to new ideas, and so on.
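To make “classifying by five traits” concrete, here is a toy scoring function. The questions, trait assignments and neutral midpoint are all invented; Kogan’s actual instrument and model were far more sophisticated.

```python
# Toy sketch of scoring quiz answers into the five classic personality
# traits (the "Big Five"). Questions, trait assignments and the neutral
# midpoint are invented for illustration.
TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

# Each quiz question loads on exactly one trait in this toy model.
QUESTION_TRAIT = {"q1": "extraversion", "q2": "openness", "q3": "neuroticism"}


def score(answers: dict[str, int]) -> dict[str, float]:
    """Average each trait's 1-5 answers; traits with no answers get midpoint 3.0."""
    per_trait: dict[str, list[int]] = {t: [] for t in TRAITS}
    for question, value in answers.items():
        per_trait[QUESTION_TRAIT[question]].append(value)
    return {t: (sum(vs) / len(vs) if vs else 3.0) for t, vs in per_trait.items()}


print(score({"q1": 5, "q2": 2, "q3": 4}))  # high extraversion, low openness
```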

[00:13:34] It was then time to see whether all the data he had collected from Facebook was actually any good at predicting these people’s personality.

[00:13:44] It was crunch time.

[00:13:46] Unfortunately, according to Kogan, the algorithms managed to correctly predict someone’s personality 1% of the time. That is 1%, one out of 100 times, meaning that 99% of the time they got it wrong. 

[00:14:04] The Facebook data simply didn’t seem to be any good at all at predicting someone’s personality.
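The evaluation itself boils down to a simple comparison: take the personality predicted from someone’s Facebook likes, set it against the personality their own quiz answers produced, and count the matches. A minimal sketch, with made-up labels:

```python
# Minimal sketch of the evaluation step: compare personalities predicted
# from Facebook likes against the quiz-derived "ground truth" and count
# the matches. All labels here are invented.
predicted = ["extravert", "extravert", "introvert", "extravert"]
actual    = ["introvert", "extravert", "introvert", "introvert"]

correct = sum(p == a for p, a in zip(predicted, actual))
accuracy = correct / len(actual)
print(f"Accuracy: {accuracy:.0%}")  # Kogan reported roughly 1% for full profiles
```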

[00:14:11] Now, you might have thought that this would be the end of the story, with Cambridge Analytica cutting its losses as the data was junk.

[00:14:20] But really it was just the start.

[00:14:23] Cambridge Analytica seemed utterly uninterested in how accurate the data actually was.

[00:14:30] Instead, it continued to tell its clients that it had an incredibly valuable dataset that could be used for political targeting.

[00:14:39] To be precise, it said it had up to 5,000 data points on over 220 million Americans.

[00:14:48] The company also had connections in high places. Its investors included the billionaire Republican donor, Robert Mercer, and the man who would later go on to become Donald Trump’s chief strategist, Steve Bannon.

[00:15:03] These powerful connections would give it a head start in the US political scene, and would later result in it working on the 2016 US presidential campaign.

[00:15:15] While it would be most famous for working with Donald Trump, its first client was actually his competitor, the Republican establishment candidate, Ted Cruz.

[00:15:26] Cambridge Analytica had managed to sell Cruz on the power of its data and analytics capabilities, but as you may know, Cruz was heavily beaten by Trump in the race to become the Republican candidate for president.

[00:15:42] Sure, you might think, data can’t compensate for an uninspiring candidate, and there was only a limited amount that Cambridge Analytica could have done for Cruz. 

[00:15:53] In any case, it didn’t work.

[00:15:56] But that didn’t stop the company from being employed by the eventually victorious anti-establishment Trump campaign in 2016.

[00:16:06] It later emerged that Cambridge Analytica had told potential clients that it had also consulted on the Brexit campaign, the campaign to leave the EU, whose referendum in June of 2016 happened five months before Trump’s victory in November.

[00:16:23] And the CEO of Cambridge Analytica, a tall man named Alexander Nix with an almost Bond-villain style, would present at conferences and give interviews where he talked about the power of the company’s data and analytics capabilities, boasting about how many academics and PhDs it employed, and how it was able to harness the power of big data to change public opinion about anything.

[00:16:52] Then, in March of 2018, the scandal broke.

[00:16:57] It was front-page news in The Guardian and the New York Times, and was all over cable news around the world the very same day. 

[00:17:06] For everyone who wasn’t involved in the world of political campaigning, so for almost everyone, it was the first time they had heard the name Cambridge Analytica.

[00:17:16] And it was frightening. 

[00:17:18] Experts were invited to come on TV and talk about this threat to democracy.

[00:17:24] Could free will continue to exist if companies were able to predict and change our voting patterns without our knowledge?

[00:17:33] For many, it was the first time that they had thought about the power and possibilities of the data that Facebook had on them.

[00:17:42] The hashtag #DeleteFacebook was trending on Twitter, and Mark Zuckerberg was dragged to testify before the US Congress while his company was losing billions of dollars of value.

[00:17:55] It made for a wonderful story.

[00:17:58] To people who had questioned how a country could possibly have voted for Brexit or Donald Trump, it provided an explanation. 

[00:18:07] The people had been tricked, psychologically profiled and targeted with messages to influence their opinion.

[00:18:15] There was even the fact that Aleksandr Kogan was born in Russia, which to some was a sign that he was a Russian spy. Journalists called him up and asked him point-blank whether he was a Russian agent.

[00:18:29] Suddenly it was clear, and everything made sense.

[00:18:33] But the reality is that the scandal was a bit of a storm in a teacup: a scandal over something that didn’t really happen.

[00:18:43] Kogan, the man who collected the data and ran the statistical models, has said numerous times that the data was practically useless on a personal level, even when it was first collected.

[00:18:57] What’s more, it was collected in 2014 and would have been pretty out of date in 2016 anyway. Data like this typically has a shelf life of 12 months at most.

[00:19:11] Plus, the fact that the Cruz campaign and the Trump campaign both got rid of Cambridge Analytica suggests that the people who paid to use the data, and supposedly used it to target voters with hyper-targeted advertising and swing an election, didn’t find it useful at all. 

[00:19:31] And as far as the question of whether the data had any impact on the Brexit vote, there was an extensive enquiry by the UK Information Commissioner’s Office, essentially an arm of government, which found absolutely no evidence that Facebook data held by Cambridge Analytica had any influence on the Brexit vote.

[00:19:53] So, let’s recap quickly, as there are two slightly different questions:

[00:19:58] Firstly, was there an illegal breach of Facebook data?

[00:20:03] And secondly, did this data actually influence anything?

[00:20:08] To the first question: no, it wasn’t illegal, but it was probably misleading. The users who authorised this application probably didn’t know that they were also giving away their friends’ data, or that it might be used for commercial purposes.

[00:20:25] As a result, Facebook has clarified and tightened its policies on this: third-party applications no longer get access to your friends’ data, and they get access to much less data about you than they used to.

[00:20:39] If you authorise a third-party app now on Facebook, it’s a lot clearer, and these apps don’t tend to get access to very much.

[00:20:49] And to our second question - did the data actually influence anything? - although it makes for a much juicier and more interesting story if you think it did, the reality certainly appears to be “no”.

[00:21:04] Now, as a final point, just in case this is mistaken for some great defence of Facebook or Cambridge Analytica, it most definitely isn’t. 

[00:21:14] The idea that you have accidentally given away access to your personality - what you like and dislike, who you are - and that some evil third party knows it all, is scary. It’s not nice, and it’s definitely not nice to think that you might have given this third party access to your nearest and dearest as well.

[00:21:35] But the reality is that this data, at least in the case of the Cambridge Analytica scandal, is not nearly as useful as you might think.

[00:21:45] Indeed, there is one theory that the Cambridge Analytica scandal was actually excellent news for Facebook, because it made the world think that the data Facebook had on its users was significantly more valuable than Kogan’s analysis suggested it actually was.

[00:22:03] Now, to go back to our example of someone who likes “dogs, Mars bars and reading”. 

[00:22:10] Well, I like “dogs, Mars bars and reading”. 

[00:22:13] Maybe you like “dogs, Mars bars and reading” too.

[00:22:17] Does this mean that we’re similar, and most importantly, do these three things mean that we are likely to have the same political beliefs, care about the same issues, and respond in the same way to a political advert? 

[00:22:32] It’s a possibility, but from what Kogan’s research suggests, it certainly seems pretty unlikely.

[00:22:40] This is, of course, not to say that the information Facebook holds on you or me is worthless and unusable.

[00:22:47] Facebook knows a lot more about us than whether we like dogs, Mars bars and reading, and hundreds of millions of dollars are spent every single day advertising to users on Facebook because of what Facebook knows about them. 

[00:23:02] But can this data be easily stolen by a third party, or even a foreign agent, and used to manipulate democracy? 

[00:23:12] Well, with the example of Cambridge Analytica, it certainly seems that even when we think it has, it really hasn't. 

[00:23:22] OK then, that is it for today's episode on the Cambridge Analytica Scandal.

[00:23:28] I hope it's been an interesting one, that you've learnt something new, and if you remember this scandal from a few years ago, well perhaps it has put a different perspective on it.

[00:23:39] As always, I would love to know what you thought of this episode.

[00:23:42] Do you remember this news coming out? How did you feel when it did?

[00:23:47] If you are a Facebook user - or I should probably say Meta now, shouldn’t I? - anyway, how did it affect your behaviour, if at all?

[00:23:56] I would love to know, so let’s get this discussion started.

[00:24:00] You can head right into our community forum, which is at community.leonardoenglish.com and get chatting away to other curious minds.

[00:24:09] You've been listening to English Learning for Curious Minds, by Leonardo English.

[00:24:14] I'm Alastair Budge, you stay safe, and I'll catch you in the next episode.

[END OF EPISODE]

Continue learning

Get immediate access to a more interesting way of improving your English
Become a member
Already a member? Login

[00:00:00] Hello, hello hello, and welcome to English Learning for Curious Minds, by Leonardo English. 

[00:00:12] The show where you can listen to fascinating stories, and learn weird and wonderful things about the world at the same time as improving your English.

[00:00:22] I'm Alastair Budge, and today we are going to be talking about The Cambridge Analytica Scandal.

[00:00:29] It’s a story that brings together the power of Facebook, journalism, political prejudice, right vs. left, democracy, human psychology, free will, allegations of Russian spies working to influence foreign elections and a plot to put Donald Trump in the White House.

[00:00:49] The only problem is. 

[00:00:50] Is any of it even true? Right, let’s get started and talk about Cambridge Analytica.

[00:01:00] If you had bought the Guardian Newspaper on March 18th of 2018, on the front cover you would have seen a picture of a 28-year-old man with short, pink hair.

[00:01:14] Next to him, the all-powerful headline:

[00:01:18] “Revealed: 50 million Facebook files taken in record data breach.” 

[00:01:25] A breach, by the way, means a break, a hole that has been made in order to get in and access something.

[00:01:34] The story went on to reveal that a company called Cambridge Analytica had used the personal Facebook data of 50 million people to target US citizens, build a–and I’m quoting directly here–”psychological warfare tool” which targeted people based on their stolen data, and ultimately swing the election, to win the 2016 US presidential election for Donald Trump.

[00:02:04] It was a powerful story, and it was all over the news. 

[00:02:09] You may well remember the story yourself.

[00:02:12] It created big problems for Facebook, and the company’s share price. It lost $35 billion in value within a day, and in the subsequent months it lost over $100 billion in stock market value.

[00:02:29] The company behind the supposed data breach, Cambridge Analytica, went out of business, and its former CEO was forced to testify in front of the British parliament.

[00:02:43] The journalist behind the story, Carole Cadwalladr, was invited to give a TED talk, and won the prestigious George Orwell prize for journalism for breaking this story.

[00:02:56] So, what we are going to try to answer in this episode is what actually happened? 

[00:03:03] Is democracy really at threat from bad actors using personal data to influence our voting behaviour? 

[00:03:11] Or is the entire story completely overblown

[00:03:15] So, let’s start with what happened. 

[00:03:19] The main character in the story, without whom none of this would have been possible, was a man called Aleksandr Kogan. 

[00:03:27] He was born in the USSR, the former Soviet Union, but moved to the United States when he was 7 years old.

[00:03:35] Kogan excelled at school, and showed a remarkable talent for mathematics and physics. 

[00:03:43] At university he became increasingly interested in psychology, in understanding why humans feel certain emotions, and what makes us behave in the way we do.

[00:03:56] He had been working at Cambridge University since 2012, and saw from early on the potential power that a then newish technology company, Facebook, had. If you used Facebook back in 2012, I imagine it played a bigger, or at least more obvious, role in your life than it does now. 

[00:04:20] You used that “like” button to like anything from a status update from a friend to a “raising money for dog shelters” campaign. 

[00:04:29] You probably didn’t think too much about what happened after you clicked that “Like” button. After all, Facebook was a fun way to keep in touch with friends and family, share photos and generally see what was going on in the world. 

[00:04:46] To Kogan, however, it seemed like the most incredible dataset on humanity. 

[00:04:53] People like you and me were spending all this time on Facebook, liking stuff, interacting with it, giving Facebook data on what we like and what we don’t like, what we engage with and what we aren’t interested in.

[00:05:08] What’s more, Facebook could see who all of our friends were, so it had this amazing understanding of connections between all of its users.

[00:05:18] In 2014, Facebook passed 1.3 billion users, each of whom was spending an average of 40 minutes a day on the network. 

[00:05:30] So not only did it have a vast amount of people using it, but they were using it a lot.

[00:05:37] To someone like Kogan, this was fascinating.

[00:05:41] What’s more, at the time, Facebook had recently started to allow third-parties to develop applications on top of Facebook that would give them certain permissions to view the data on Facebook users.

[00:05:56] If you can think back to this time, if you were a Facebook user that is, you might remember a load of quizzes where you granted the quiz access to your Facebook account, answered a few questions, and it would tell you things like what Harry Potter character you were like or what member of the Beatles you would have been.

[00:06:16] Silly stuff, harmless fun. Or so most people thought.

[00:06:21] Indeed, back in 2014 these third-party applications were able to access some data not just about you but also about your friends.

[00:06:32] Someone might give access to a quiz to find out what type of pizza they are, agreeing to share their data with the app, but not realise that they were actually allowing the application to access information about their friends as well.

[00:06:49] Kogan’s question was, was any of this Facebook data actually any good at telling you anything useful about that person’s personality?

[00:06:59] If someone on Facebook liked “dogs, Mars bars and reading”, did this actually tell you anything about them as an individual?

[00:07:10] But Kogan was no thief, and he couldn’t just take this data from Facebook. Facebook users needed to give him their permission to use it. 

[00:07:20] And it’s here that we meet Cambridge Analytica. 

[00:07:24] Cambridge Analytica, which has no links whatsoever to Cambridge University, by the way, was a political consulting company founded in 2013.

[00:07:35] Its supposed speciality was advising political campaigns, advising politicians on how to use social media and data to put the right message in front of the right people at the right time.

[00:07:50] Cambridge Analytica paid Kogan to create an app that would collect data on Facebook users. 

[00:07:57] Cambridge Analytica would be able to use the data for use with its political clients, and Kogan could continue his academic research with the data he collected.

[00:08:08] A win-win situation, one where both parties benefited.

[00:08:13] So, Kogan, with Facebook’s permission, built a simple application called “This Is Your Digital Life”. 

[00:08:21] It was a quiz, essentially. 

[00:08:23] Users were offered three or four dollars to participate in the survey, and in exchange they gave the app permissions to access their Facebook data.

[00:08:35] What they probably didn’t realise, though, was that they were actually giving the app permission to see some data on all of their friends.

[00:08:43] So if I authorised the app, if I took the quiz, the app would get data on all of my friends. If I had 500 friends, for example, the app got the data on those 500 friends.

[00:08:58] The quiz was taken by around 300,000 people, each one with hundreds of friends, meaning that Kogan now had Facebook data on almost 90 million people. The initial Guardian report said 50 million, but Facebook was later forced to admit that it had actually been almost double this number.

[00:09:19] Getting this data wasn’t cheap, as the app had to incentivise people to take the personality test. Cambridge Analytica provided the financing for it, about a million dollars, believing that the data that would come out at the end of it would be significantly more valuable.

[00:09:38] Now, let’s just pause to address a couple of points first, before we move on with the story.

[00:09:44] Kogan had Facebook’s permission to create the quiz application. There’s no dispute about that.

[00:09:51] What is disputed is that Kogan says he was given permission to sell this data on to third-parties. Facebook says he wasn’t.

[00:10:01] Where it does become a little blurry, a little less clear, is about how the data on Facebook users was accessed, and the use of the term “breach”.

[00:10:12] As a reminder, a breach is when illegal access is provided to something.

[00:10:18] At the time, Facebook’s default settings meant that if you gave an app permission to your data, by default it could get data on all of your friends. 

[00:10:29] The Facebook users likely didn’t understand exactly what they were giving their permission for, but Kogan’s app was one of hundreds, thousands even, that was collecting very similar amounts of data.

[00:10:43] So Kogan wasn’t doing anything illegal, and indeed he was doing what practically every other Facebook app was doing. 

[00:10:52] Ok, so with those clarifications out of the way, let’s return to our story.

[00:10:58] You might be thinking, what is Kogan actually getting out of all of this?

[00:11:03] Was he really a Russian agent, was he a closet Trump supporter, did he want to wreak havoc in foreign elections and provide data that could be used to undermine democracy?

[00:11:15] That would make for a good story, but unfortunately it doesn’t seem to be true.

[00:11:21] He says that he was unaware that the data would be used for political targeting, and his interest in doing all of this was primarily from a research perspective.

[00:11:33] He wanted to know whether all of this data he was collecting on people - what they liked, where they lived, what they did, whether this data actually helped predict anything about their personalities.

[00:11:46] Remember, he is an amazingly talented scientist with an interest in human psychology. He wanted to know how human behaviour and personality can be predicted.

[00:11:58] And as for Cambridge Analytica, if Kogan’s work proved to be able to predict human behaviour and personality, it would be amazingly valuable. 

[00:12:08] Imagine that just by knowing that if someone liked Mars bars, dogs and reading they would be particularly receptive to a certain type of political message, and if someone liked singing, learning languages and had a birthday in December then they would be receptive to another type of political message.

[00:12:29] Armed with this information on people, and with Facebook, the ability to target over a billion people, and the vast majority of the voters in the United States, this would be an incredibly powerful weapon.

[00:12:44] But how could you actually figure out whether the data was useful or not?

[00:12:49] Well, that’s where the personality quiz came in. 

[00:12:54] After someone gave access to Kogan’s application, they needed to fill out the personality quiz. 

[00:13:01] Sure, you can say that people aren’t very good at assessing their own personality, but as there was no reason for someone to fill in a question incorrectly, and the test was completely randomised, this data should have been accurate.

[00:13:17] After the Facebook user had completed the quiz, Kogan’s algorithm looked at their answers and classified them by five different personality traits - such as if they were an extrovert or introvert, how open they were to new ideas, and so on.

[00:13:34] It was then time to see whether all the data he had collected from Facebook was actually any good at predicting these people’s personality.

[00:13:44] It was crunch time.

[00:13:46] Unfortunately, according to Kogan, the algorithms managed to correctly predict someone’s personality 1% of the time. That is 1%, 1 one out of 100 times, meaning that 99% of the time it got it wrong. 

[00:14:04] The Facebook data simply didn’t seem to be any good at all at predicting someone’s personality.

[00:14:11] Now, you might have thought that this would be the end of the story, with Cambridge Analytica cutting its losses as the data was junk.

[00:14:20] But really it was just the start.

[00:14:23] Cambridge Analytica seemed utterly uninterested in how accurate the data actually was.

[00:14:30] Instead, it continued to tell its clients that it had an incredibly valuable dataset that could be used for political targeting

[00:14:39] To be precise, it said it had up to 5,000 data points on over 220 million Americans.

[00:14:48] The company also had connections in high places. Its investors included the billionaire Republican donor, Robert Mercer, and the man who would later go on to become Donald Trump’s chief strategist, Steve Bannon.

[00:15:03] These powerful connections would give it a head start in the US political scene, and would later result in it working on the 2016 US presidential campaign.

[00:15:15] While it would be most famous for working with Donald Trump, its first client was actually his competitor, the Republican establishment candidate, Ted Cruz.

[00:15:26] Cambridge Analytica had managed to sell Cruz on the power of its data and analytics capabilities, but as you will know, Cruz was heavily beaten by Trump in the nomination to be the Republican candidate for president.

[00:15:42] Sure, you might think, data can’t compensate for an uninspiring candidate, and there was only a limited amount that Cambridge Analytica could have done for Cruz. 

[00:15:53] In any case, it didn’t work.

[00:15:56] But that didn’t stop the company from being employed by the eventually victorious anti-establishment Trump campaign in 2016.

[00:16:06] It later emerged that Cambridge Analytica had told potential clients that it had also consulted on the Brexit campaign, the campaign to leave the EU in June of 2016, which happened six months before Trump’s victory in November.

[00:16:23] And the CEO of Cambridge Analytica, a tall man named Alexander Nix with an almost Bond-villain style, would present at conferences and give interviews where he talked about the power of the company’s data and analytics capabilities, boasting about how many academics and PhDs it employed, and how it was able to harness the power of big data to change public opinion about anything.

[00:16:52] Then, in March of 2018, the scandal broke.

[00:16:57] It was front-page news in The Guardian and the New York Times, and was all over cable news around the world the very same day. 

[00:17:06] For everyone who wasn’t involved in the world of political campaigning, so for almost everyone, it was the first time they had heard the name Cambridge Analytica.

[00:17:16] And it was frightening. 

[00:17:18] Experts were invited to come on TV and talk about this threat to democracy.

[00:17:24] Could free will continue to exist if companies were able to predict and change our voting patterns without our knowledge?

[00:17:33] For many, it was the first time that they had thought about the power and possibilities of the data that Facebook had on them.

[00:17:42] The hashtag #DeleteFacebook was trending on Twitter, and Mark Zuckerberg was dragged to testify before the US Congress while his company was losing billions of dollars of value.

[00:17:55] It made for a wonderful story.

[00:17:58] To people who had questioned how a country could possibly have voted for Brexit or Donald Trump, it provided an explanation. 

[00:18:07] The people had been tricked, psychologically profiled and targeted with messages to influence their opinion.

[00:18:15] There was even the fact that Alexander Kogan was born in Russia, which to some was a sign that he was a Russian spy. Journalists called him up and asked him point-blank whether he was a Russian agent.

[00:18:29] Suddenly it was clear, and everything made sense.

[00:18:33] But the reality is that the scandal was a bit of a storm in a teacup, it was a scandal over something that didn’t really happen.

[00:18:43] Kogan, the man who collected the data and ran the statistical models, has said numerous times that the data was practically useless on a personal level, even when it was first collected.

[00:18:57] What’s more, it was collected in 2014 and would have been pretty out of date in 2016 anyway. Data like this typically has a shelf-life of a maximum of 12 months.

[00:19:11] Plus, the fact that the Cruz campaign and the Trump campaign both got rid of Cambridge Analytica suggests that the people who paid to use the data, and supposedly used it to target voters with hyper-targeted advertising and swing an election, didn’t find it useful at all. 

[00:19:31] And as far as the question of whether the data had any impact on the Brexit vote, there was an extensive enquiry by the UK Information Commissioner’s Office, essentially an arm of government, which found absolutely no evidence that Facebook data held by Cambridge Analytica had any influence on the Brexit vote.

[00:19:53] So, let’s recap quickly, as there are two slightly different questions:

[00:19:58] Firstly, was there an illegal breach of Facebook data?

[00:20:03] And secondly, did this data actually influence anything?

[00:20:08] To the first question, no, it wasn’t illegal, but it was probably misleading. The users who authorised this application probably didn’t know that they were also giving their friends’ data, or that it might be used for commercial purposes.

[00:20:25] As a result, Facebook has clarified its policies on this, and third-party applications don’t get access to your friends’ data, and they get access to much less data than they used to get.

[00:20:39] If you authorise a third-party app now on Facebook, it’s a lot clearer, and these apps don’t tend to get access to very much.

[00:20:49] And to our second question, did the data actually influence anything, although it makes for a much juicier and more interesting story if you think it does, the reality certainly appears to be “no”.

[00:21:04] Now, as a final point, just in case this is mistaken for some great defence of Facebook or Cambridge Analytica, it most definitely isn’t. 

[00:21:14] The idea that you have accidentally given access to your personality, what you like and dislike, who you are, the idea that some evil third-party knows it is scary. It’s not nice, and it’s definitely not nice to think that you might have let this third-party have access to your nearest and dearest.

[00:21:35] But the reality is that this data, at least in the case of the Cambridge Analytica scandal, is not nearly as useful as you might think it to be.

[00:21:45] Indeed, there is one theory that the Cambridge Analytica scandal was actually excellent news for Facebook, because it made the world think that the data Facebook had on its users was significantly more valuable than Kogan’s analysis suggested it actually was.

[00:22:03] Now, to go back to our example of someone who likes “dogs, Mars bars and reading”. 

[00:22:10] Well, I like “dogs, Mars bars and reading”. 

[00:22:13] Maybe you like “dogs, Mars bars and reading” too.

[00:22:17] Does this mean that we’re similar, and most importantly, do these three things mean that we are likely to have the same political beliefs, care about the same issues, and are likely to respond in the same way to a political advert? 

[00:22:32] It’s a possibility, but from what Kogan’s research suggests, it certainly seems pretty unlikely.

[00:22:40] This is, of course, not to say that the information Facebook holds on you or me is worthless and unusable

[00:22:47] Facebook knows a lot more about us than whether we like dogs, Mars bars and reading, and hundreds of millions of dollars are spent every single day advertising to users on Facebook because of what Facebook knows about them. 

[00:23:02] But can this data be easily stolen by a third party, or even a foreign agent, and used to manipulate democracy? 

[00:23:12] Well, with the example of Cambridge Analytica, it certainly seems that even when we think it has, it really hasn't. 

[00:23:22] OK then, that is it for today's episode on the Cambridge Analytica Scandal.

[00:23:28] I hope it's been an interesting one, that you've learnt something new, and if you remember this scandal from a few years ago, well perhaps it has put a different perspective on it.

[00:23:39] As always, I would love to know what you thought of this episode.

[00:23:42] Do you remember this news coming out? How did you feel when it did?

[00:23:47] If you are a Facebook user, or I should probably say Meta now, shouldn’t I, anyway, how did it affect your behaviour, if at all?

[00:23:56] I would love to know, so let’s get this discussion started.

[00:24:00] You can head right into our community forum, which is at community.leonardoenglish.com and get chatting away to other curious minds.

[00:24:09] You've been listening to English Learning for Curious Minds, by Leonardo English.

[00:24:14] I'm Alastair Budge, you stay safe, and I'll catch you in the next episode.

[END OF EPISODE]

[00:00:00] Hello, hello hello, and welcome to English Learning for Curious Minds, by Leonardo English. 

[00:00:12] The show where you can listen to fascinating stories, and learn weird and wonderful things about the world at the same time as improving your English.

[00:00:22] I'm Alastair Budge, and today we are going to be talking about The Cambridge Analytica Scandal.

[00:00:29] It’s a story that brings together the power of Facebook, journalism, political prejudice, right vs. left, democracy, human psychology, free will, allegations of Russian spies working to influence foreign elections and a plot to put Donald Trump in the White House.

[00:00:49] The only problem is. 

[00:00:50] Is any of it even true? Right, let’s get started and talk about Cambridge Analytica.

[00:01:00] If you had bought the Guardian Newspaper on March 18th of 2018, on the front cover you would have seen a picture of a 28-year-old man with short, pink hair.

[00:01:14] Next to him, the all-powerful headline:

[00:01:18] “Revealed: 50 million Facebook files taken in record data breach.” 

[00:01:25] A breach, by the way, means a break, a hole that has been made in order to get in and access something.

[00:01:34] The story went on to reveal that a company called Cambridge Analytica had used the personal Facebook data of 50 million people to target US citizens, build a–and I’m quoting directly here–”psychological warfare tool” which targeted people based on their stolen data, and ultimately swing the election, to win the 2016 US presidential election for Donald Trump.

[00:02:04] It was a powerful story, and it was all over the news. 

[00:02:09] You may well remember the story yourself.

[00:02:12] It created big problems for Facebook, and the company’s share price. It lost $35 billion in value within a day, and in the subsequent months it lost over $100 billion in stock market value.

[00:02:29] The company behind the supposed data breach, Cambridge Analytica, went out of business, and its former CEO was forced to testify in front of the British parliament.

[00:02:43] The journalist behind the story, Carole Cadwalladr, was invited to give a TED talk, and won the prestigious George Orwell prize for journalism for breaking this story.

[00:02:56] So, what we are going to try to answer in this episode is what actually happened? 

[00:03:03] Is democracy really at threat from bad actors using personal data to influence our voting behaviour? 

[00:03:11] Or is the entire story completely overblown

[00:03:15] So, let’s start with what happened. 

[00:03:19] The main character in the story, without whom none of this would have been possible, was a man called Aleksandr Kogan. 

[00:03:27] He was born in the USSR, the former Soviet Union, but moved to the United States when he was 7 years old.

[00:03:35] Kogan excelled at school, and showed a remarkable talent for mathematics and physics. 

[00:03:43] At university he became increasingly interested in psychology, in understanding why humans feel certain emotions, and what makes us behave in the way we do.

[00:03:56] He had been working at Cambridge University since 2012, and saw from early on the potential power that a then newish technology company, Facebook, had. If you used Facebook back in 2012, I imagine it played a bigger, or at least more obvious, role in your life than it does now. 

[00:04:20] You used that “like” button to like anything from a status update from a friend to a “raising money for dog shelters” campaign. 

[00:04:29] You probably didn’t think too much about what happened after you clicked that “Like” button. After all, Facebook was a fun way to keep in touch with friends and family, share photos and generally see what was going on in the world. 

[00:04:46] To Kogan, however, it seemed like the most incredible dataset on humanity. 

[00:04:53] People like you and me were spending all this time on Facebook, liking stuff, interacting with it, giving Facebook data on what we like and what we don’t like, what we engage with and what we aren’t interested in.

[00:05:08] What’s more, Facebook could see who all of our friends were, so it had this amazing understanding of connections between all of its users.

[00:05:18] In 2014, Facebook passed 1.3 billion users, each of whom was spending an average of 40 minutes a day on the network. 

[00:05:30] So not only did it have a vast amount of people using it, but they were using it a lot.

[00:05:37] To someone like Kogan, this was fascinating.

[00:05:41] What’s more, at the time, Facebook had recently started to allow third-parties to develop applications on top of Facebook that would give them certain permissions to view the data on Facebook users.

[00:05:56] If you can think back to this time, if you were a Facebook user that is, you might remember a load of quizzes where you granted the quiz access to your Facebook account, answered a few questions, and it would tell you things like what Harry Potter character you were like or what member of the Beatles you would have been.

[00:06:16] Silly stuff, harmless fun. Or so most people thought.

[00:06:21] Indeed, back in 2014 these third-party applications were able to access some data not just about you but also about your friends.

[00:06:32] Someone might give access to a quiz to find out what type of pizza they are, agreeing to share their data with the app, but not realise that they were actually allowing the application to access information about their friends as well.

[00:06:49] Kogan’s question was, was any of this Facebook data actually any good at telling you anything useful about that person’s personality?

[00:06:59] If someone on Facebook liked “dogs, Mars bars and reading”, did this actually tell you anything about them as an individual?

[00:07:10] But Kogan was no thief, and he couldn’t just take this data from Facebook. Facebook users needed to give him their permission to use it. 

[00:07:20] And it’s here that we meet Cambridge Analytica. 

[00:07:24] Cambridge Analytica, which has no links whatsoever to Cambridge University, by the way, was a political consulting company founded in 2013.

[00:07:35] Its supposed speciality was advising political campaigns, advising politicians on how to use social media and data to put the right message in front of the right people at the right time.

[00:07:50] Cambridge Analytica paid Kogan to create an app that would collect data on Facebook users. 

[00:07:57] Cambridge Analytica would be able to use the data for use with its political clients, and Kogan could continue his academic research with the data he collected.

[00:08:08] A win-win situation, one where both parties benefited.

[00:08:13] So, Kogan, with Facebook’s permission, built a simple application called “This Is Your Digital Life”. 

[00:08:21] It was a quiz, essentially. 

[00:08:23] Users were offered three or four dollars to participate in the survey, and in exchange they gave the app permissions to access their Facebook data.

[00:08:35] What they probably didn’t realise, though, was that they were actually giving the app permission to see some data on all of their friends.

[00:08:43] So if I authorised the app, if I took the quiz, the app would get data on all of my friends. If I had 500 friends, for example, the app got the data on those 500 friends.

[00:08:58] The quiz was taken by around 300,000 people, each one with hundreds of friends, meaning that Kogan now had Facebook data on almost 90 million people. The initial Guardian report said 50 million, but Facebook was later forced to admit that it had actually been almost double this number.

[00:09:19] Getting this data wasn’t cheap, as the app had to incentivise people to take the personality test. Cambridge Analytica provided the financing for it, about a million dollars, believing that the data that would come out at the end of it would be significantly more valuable.

[00:09:38] Now, let’s just pause to address a couple of points first, before we move on with the story.

[00:09:44] Kogan had Facebook’s permission to create the quiz application. There’s no dispute about that.

[00:09:51] What is disputed is that Kogan says he was given permission to sell this data on to third-parties. Facebook says he wasn’t.

[00:10:01] Where it does become a little blurry, a little less clear, is about how the data on Facebook users was accessed, and the use of the term “breach”.

[00:10:12] As a reminder, a breach is when illegal access is provided to something.

[00:10:18] At the time, Facebook’s default settings meant that if you gave an app permission to your data, by default it could get data on all of your friends. 

[00:10:29] The Facebook users likely didn’t understand exactly what they were giving their permission for, but Kogan’s app was one of hundreds, thousands even, that was collecting very similar amounts of data.

[00:10:43] So Kogan wasn’t doing anything illegal, and indeed he was doing what practically every other Facebook app was doing. 

[00:10:52] Ok, so with those clarifications out of the way, let’s return to our story.

[00:10:58] You might be thinking, what is Kogan actually getting out of all of this?

[00:11:03] Was he really a Russian agent, was he a closet Trump supporter, did he want to wreak havoc in foreign elections and provide data that could be used to undermine democracy?

[00:11:15] That would make for a good story, but unfortunately it doesn’t seem to be true.

[00:11:21] He says that he was unaware that the data would be used for political targeting, and his interest in doing all of this was primarily from a research perspective.

[00:11:33] He wanted to know whether all of this data he was collecting on people - what they liked, where they lived, what they did, whether this data actually helped predict anything about their personalities.

[00:11:46] Remember, he is an amazingly talented scientist with an interest in human psychology. He wanted to know how human behaviour and personality can be predicted.

[00:11:58] And as for Cambridge Analytica, if Kogan’s work proved to be able to predict human behaviour and personality, it would be amazingly valuable. 

[00:12:08] Imagine that just by knowing that if someone liked Mars bars, dogs and reading they would be particularly receptive to a certain type of political message, and if someone liked singing, learning languages and had a birthday in December then they would be receptive to another type of political message.

[00:12:29] Armed with this information on people, and with Facebook, the ability to target over a billion people, and the vast majority of the voters in the United States, this would be an incredibly powerful weapon.

[00:12:44] But how could you actually figure out whether the data was useful or not?

[00:12:49] Well, that’s where the personality quiz came in. 

[00:12:54] After someone gave access to Kogan’s application, they needed to fill out the personality quiz. 

[00:13:01] Sure, you can say that people aren’t very good at assessing their own personality, but as there was no reason for someone to fill in a question incorrectly, and the test was completely randomised, this data should have been accurate.

[00:13:17] After the Facebook user had completed the quiz, Kogan’s algorithm looked at their answers and classified them by five different personality traits - such as if they were an extrovert or introvert, how open they were to new ideas, and so on.

[00:13:34] It was then time to see whether all the data he had collected from Facebook was actually any good at predicting these people’s personality.

[00:13:44] It was crunch time.

[00:13:46] Unfortunately, according to Kogan, the algorithms managed to correctly predict someone’s personality 1% of the time. That is 1%, 1 one out of 100 times, meaning that 99% of the time it got it wrong. 

[00:14:04] The Facebook data simply didn’t seem to be any good at all at predicting someone’s personality.

[00:14:11] Now, you might have thought that this would be the end of the story, with Cambridge Analytica cutting its losses as the data was junk.

[00:14:20] But really it was just the start.

[00:14:23] Cambridge Analytica seemed utterly uninterested in how accurate the data actually was.

[00:14:30] Instead, it continued to tell its clients that it had an incredibly valuable dataset that could be used for political targeting

[00:14:39] To be precise, it said it had up to 5,000 data points on over 220 million Americans.

[00:14:48] The company also had connections in high places. Its investors included the billionaire Republican donor, Robert Mercer, and the man who would later go on to become Donald Trump’s chief strategist, Steve Bannon.

[00:15:03] These powerful connections would give it a head start in the US political scene, and would later result in it working on the 2016 US presidential campaign.

[00:15:15] While it would be most famous for working with Donald Trump, its first client was actually his competitor, the Republican establishment candidate, Ted Cruz.

[00:15:26] Cambridge Analytica had managed to sell Cruz on the power of its data and analytics capabilities, but as you will know, Cruz was heavily beaten by Trump in the nomination to be the Republican candidate for president.

[00:15:42] Sure, you might think, data can’t compensate for an uninspiring candidate, and there was only a limited amount that Cambridge Analytica could have done for Cruz. 

[00:15:53] In any case, it didn’t work.

[00:15:56] But that didn’t stop the company from being employed by the eventually victorious anti-establishment Trump campaign in 2016.

[00:16:06] It later emerged that Cambridge Analytica had told potential clients that it had also consulted on the Brexit campaign, the campaign to leave the EU in June of 2016, which happened six months before Trump’s victory in November.

[00:16:23] And the CEO of Cambridge Analytica, a tall man named Alexander Nix with an almost Bond-villain style, would present at conferences and give interviews where he talked about the power of the company’s data and analytics capabilities, boasting about how many academics and PhDs it employed, and how it was able to harness the power of big data to change public opinion about anything.

[00:16:52] Then, in March of 2018, the scandal broke.

[00:16:57] It was front-page news in The Guardian and the New York Times, and was all over cable news around the world the very same day. 

[00:17:06] For everyone who wasn’t involved in the world of political campaigning, so for almost everyone, it was the first time they had heard the name Cambridge Analytica.

[00:17:16] And it was frightening. 

[00:17:18] Experts were invited to come on TV and talk about this threat to democracy.

[00:17:24] Could free will continue to exist if companies were able to predict and change our voting patterns without our knowledge?

[00:17:33] For many, it was the first time that they had thought about the power and possibilities of the data that Facebook had on them.

[00:17:42] The hashtag #DeleteFacebook was trending on Twitter, and Mark Zuckerberg was dragged to testify before the US Congress while his company was losing billions of dollars of value.

[00:17:55] It made for a wonderful story.

[00:17:58] To people who had questioned how a country could possibly have voted for Brexit or Donald Trump, it provided an explanation. 

[00:18:07] The people had been tricked, psychologically profiled and targeted with messages to influence their opinion.

[00:18:15] There was even the fact that Aleksandr Kogan was born in the Soviet Union, which to some was a sign that he was a Russian spy. Journalists called him up and asked him point-blank whether he was a Russian agent.

[00:18:29] Suddenly it was clear, and everything made sense.

[00:18:33] But the reality is that the scandal was a bit of a storm in a teacup: a scandal over something that never really happened.

[00:18:43] Kogan, the man who collected the data and ran the statistical models, has said numerous times that the data was practically useless on a personal level, even when it was first collected.

[00:18:57] What’s more, it was collected in 2014 and would have been pretty out of date by 2016 anyway. Data like this typically has a shelf life of 12 months at most.

[00:19:11] Plus, the fact that both the Cruz campaign and the Trump campaign got rid of Cambridge Analytica suggests that the people who paid for the data, and supposedly used it to target voters with hyper-targeted advertising and swing an election, didn’t find it useful at all. 

[00:19:31] And as for the question of whether the data had any impact on the Brexit vote, there was an extensive enquiry by the UK Information Commissioner’s Office, the country’s data protection regulator, which found absolutely no evidence that Facebook data held by Cambridge Analytica had any influence on the Brexit vote.

[00:19:53] So, let’s recap quickly, as there are two slightly different questions:

[00:19:58] Firstly, was there an illegal breach of Facebook data?

[00:20:03] And secondly, did this data actually influence anything?

[00:20:08] To the first question: no, it wasn’t illegal, but it was probably misleading. The users who authorised the application probably didn’t know that they were also handing over their friends’ data, or that it might be used for commercial purposes.

[00:20:25] As a result, Facebook has clarified its policies on this: third-party applications no longer get access to your friends’ data, and they receive far less data than they used to.

[00:20:39] If you authorise a third-party app on Facebook now, the process is much clearer, and these apps don’t tend to get access to very much.
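
To make that change concrete, here is a small, hypothetical Python sketch, using the `requests` library, of what a third-party app can ask Facebook’s Graph API for today. The access token and version string are placeholders; the key point, true since Graph API v2.0 in 2014, is that the friends endpoint returns only friends who also use the same app.

```python
import requests

ACCESS_TOKEN = "EAAB..."  # placeholder: a user token obtained via Facebook Login
BASE = "https://graph.facebook.com/v19.0"  # version string may differ

# Basic profile fields the user explicitly consented to share:
me = requests.get(f"{BASE}/me",
                  params={"fields": "id,name", "access_token": ACCESS_TOKEN}).json()

# Since Graph API v2.0, this returns only friends who also use this app,
# not the user's full friend list (which is what Kogan's app could see in 2014):
friends = requests.get(f"{BASE}/me/friends",
                       params={"access_token": ACCESS_TOKEN}).json()

# The permissions this user actually granted, and their status:
permissions = requests.get(f"{BASE}/me/permissions",
                           params={"access_token": ACCESS_TOKEN}).json()

print(me)
print(friends)
print(permissions)
```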

[00:20:49] And to our second question, did the data actually influence anything? Although it makes for a much juicier and more interesting story if you think it did, the reality certainly appears to be “no”.

[00:21:04] Now, as a final point, just in case this is mistaken for some great defence of Facebook or Cambridge Analytica, it most definitely isn’t. 

[00:21:14] The idea that you might have accidentally given away access to your personality, to what you like and dislike, to who you are, and that some shadowy third party now knows all of this, is scary. It’s not nice, and it’s definitely not nice to think that you might have let this third party have access to your nearest and dearest too.

[00:21:35] But the reality is that this data, at least in the case of the Cambridge Analytica scandal, is not nearly as useful as you might think.

[00:21:45] Indeed, there is one theory that the Cambridge Analytica scandal was actually excellent news for Facebook, because it made the world think that the data Facebook had on its users was significantly more valuable than Kogan’s analysis suggested it actually was.

[00:22:03] Now, to go back to our example of someone who likes “dogs, Mars bars and reading”. 

[00:22:10] Well, I like “dogs, Mars bars and reading”. 

[00:22:13] Maybe you like “dogs, Mars bars and reading” too.

[00:22:17] Does this mean that we’re similar? And, most importantly, do these three things mean that we are likely to have the same political beliefs, care about the same issues, and respond in the same way to a political advert? 

[00:22:32] It’s a possibility, but from what Kogan’s research suggests, it certainly seems pretty unlikely.

[00:22:40] This is, of course, not to say that the information Facebook holds on you or me is worthless and unusable.

[00:22:47] Facebook knows a lot more about us than whether we like dogs, Mars bars and reading, and hundreds of millions of dollars are spent every single day advertising to users on Facebook because of what Facebook knows about them. 

[00:23:02] But can this data be easily stolen by a third party, or even a foreign agent, and used to manipulate democracy? 

[00:23:12] Well, with the example of Cambridge Analytica, it certainly seems that even when we think it has, it really hasn't. 

[00:23:22] OK then, that is it for today's episode on the Cambridge Analytica Scandal.

[00:23:28] I hope it's been an interesting one, that you've learnt something new, and if you remember this scandal from a few years ago, well, perhaps this has given you a different perspective on it.

[00:23:39] As always, I would love to know what you thought of this episode.

[00:23:42] Do you remember this news coming out? How did you feel when it did?

[00:23:47] If you are a Facebook user, or I should probably say a Meta user now, shouldn’t I? Anyway, how did it affect your behaviour, if at all?

[00:23:56] I would love to know, so let’s get this discussion started.

[00:24:00] You can head right into our community forum, which is at community.leonardoenglish.com and get chatting away to other curious minds.

[00:24:09] You've been listening to English Learning for Curious Minds, by Leonardo English.

[00:24:14] I'm Alastair Budge, you stay safe, and I'll catch you in the next episode.

[END OF EPISODE]