It’s been one year since Christopher Wylie came forward as a whistleblower. The revelations were shocking and the impact was widespread. Data mining of Facebook users’ personal information had influenced key political votes. Cue the ignition of the Cambridge Analytica scandal.
If nothing else, the events that unfolded revealed that the business model and policies in place at Facebook allowed for such unethical data mining practices. And it had been going on for years.
Since then, it seems like not a day goes by without more news breaking about Facebook.
Either the company is breaching user trust by selling personal information in deals with third parties, or it’s announcing questionable plans for the future.
It’s hardly surprising, though.
This is a company born out of controversy. If you’ve seen the movie The Social Network, you know all about the Winklevoss brothers, Divya Narendra, and Eduardo Saverin.
Here’s what happened.
The Winklevoss twins, along with Narendra, had the original idea for HarvardConnection, a social networking platform for Harvard students. Zuckerberg entered into a verbal contract with the twins to help them finish the site. A few months later, he registered TheFacebook.com instead.
Saverin, a founding partner who funded the initial servers to run the site, was deceptively forced out of the company. Zuckerberg managed this by diluting Saverin’s stake in the company from 30% to below 10% between June 2004 and January 2005.
Over the years, plenty more drama has hit the front door of 1 Hacker Way (that’s Facebook HQ), from an attempted smear campaign against Google that backfired to an open back door for NSA surveillance of user data.
Most recently, the New York Times reported that the social media giant is under criminal investigation. Federal prosecutors are looking into its data deals with other major technology companies.
Clearly, Facebook has not been having the best time lately.
And it all kicked off with the Cambridge Analytica scandal.
As Facebook COO Sheryl Sandberg declared at the time, Facebook suffered “a huge breach of trust”.
So, as we approach the one-year anniversary, what better time to look back and take stock of what has been an eventful year for Facebook?
Let’s get the party started. 🎉
Facebook Deceived You and Millions of Others. What Now?
This is what Mark Zuckerberg said after the Cambridge Analytica scandal:
“We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you,” Zuckerberg wrote. “I’ve been working to understand exactly what happened and how to make sure this doesn’t happen again.”
He promised that the company would investigate all third-party apps that had access to large amounts of data before 2014 (when Facebook stopped app developers from accessing data belonging to the friends of their users).
He added that the site would ban any app developer that didn’t comply with a full audit, and would inform affected users if a violation was found.
The question is: Can you believe him? In fact, can you trust any of Zuckerberg’s statements?
During an interview in 2009, he serenely stated: “We’re not going to share people’s information except for with the people they’ve asked for it to be shared.”
We all know the opposite has happened.
A Brief Timeline of Facebook’s 2018 Drama
The Effects of Political Advertising on Facebook
We’re not sure when it happened.
We’re not sure how it happened.
But at some point, it seems like we all just got used to the idea that we are targeted by online advertisers any time we visit a website.
The idea may have been creepy, weird, and kind of dystopian in the past. But now it’s here. It happens. It’s part of life. And we accept it.
As a result, it probably wasn’t that difficult for Aleksandr Kogan, a researcher at Cambridge University, to lure around 270,000 people to take part in a survey.
An important note here is that before 2015, Facebook’s privacy rules allowed apps to collect data on the friends of users without their explicit consent.
That meant users who installed the survey app shared information about both themselves and their friends. Kogan passed the collected data to Cambridge Analytica, which in turn used it to influence Donald Trump’s 2016 presidential campaign.
Traditional messaging for a political campaign may be based largely on party affiliation alone, perhaps with one or two other pieces of general demographic information.
What this data set allowed Cambridge Analytica to do was sell the idea of ‘psychographic modeling’ to the political sphere. The result: voters targeted with extremely specific content that directly appeals to them or provokes a reaction from them.
With political advertising, we notice it more when we disagree with its message. Negative campaigning works because we tend to believe negative things about others. If Facebook apps harvest, via quizzes, psychographic profiles of users then their fears can be played too. […] How such a campaign might work through digital microtargeting is scary.
The survey was able to achieve this by using what’s known as the OCEAN model (openness, conscientiousness, extraversion, agreeableness, and neuroticism), which aims to group personality traits into distinctions that hold across cultures and across time.
For example, a woman who describes herself as ‘quiet’ is likely to also describe herself as ‘shy’. If she agrees with that description this year, she’s probably going to agree with it next year too.
The group of people who describe themselves as ‘quiet’ and ‘shy’ is likely to become apparent no matter what language the test is taken in. And if a person says they are ‘quiet’ and ‘shy’, there are likely to be big differences between them and people who say they are ‘loud’ and ‘outgoing’.
I mean, we kind of knew that Facebook advertisers attempt to manipulate our thoughts.
But it’s just ads, right?
There’s no way that the constant stream of posts we see on social media platforms would ever be able to influence our political views and affect our electoral decisions.
Well, um, yeah it kind of did.
Facebook’s Explanation: It’s Not A Bug, It’s A Feature
Sure. I mean, whatever you say, Zuck.
The Cambridge Analytica breach is a known bug in two senses. Aleksandr Kogan … didn’t break into Facebook’s servers and steal data. He used the Facebook Graph API, … which allowed people to build apps that harvested data both from people who chose to use the app, and from their Facebook friends. As the media scholar Jonathan Albright put it: ‘… the vast majority of problems that have arisen as a result of this integration were meant to be features, not bugs.’
Quick question: have you ever actually read Facebook’s full data policy? Help yourself to some free internet points if you did.
For those who got distracted or fell asleep after the first few sections, don’t worry. It’s totally understandable. Very few people have the time or patience to read a 4,519-word* legal document.
*Yes, I counted.
You know what else doesn’t help? The fact that by the time you finish reading the whole policy, Facebook has probably amended it again.
Over the years, the world’s favorite data mining company has continually changed its privacy policies, sometimes giving users the illusion that they have more control over what they post on the platform.
Data privacy rules and practices are not set in stone at Facebook (and the same applies to many other companies). Since privacy policies are always changing, it’s on you to keep yourself up to date.
Then there’s the trust factor to overcome: would you trust a social media platform, with a business model built on selling user data, that constantly changes its mind about how your personal data is handled?
Here is just a short overview of the privacy policy changes that occurred before Aleksandr Kogan’s survey was launched:
And the problems with Facebook’s privacy policies weren’t just the constant changes, but also their lack of clarity.
A trend that continues to this day.
In April 2018, following the Cambridge Analytica scandal, Facebook rewrote its terms of service and data policies again. The company wanted to better explain what data it collects on users.
The policies include more details than previous versions, which is super nice and everything, but they also became much longer.
Ethan Zuckerman describes this bargain:
[A bargain] in which people get content and services for free in exchange for having persuasive messages psychographically targeted to them, as the ‘original sin’ of the internet. It’s a dangerous and socially corrosive business model that puts internet users under constant surveillance and continually pulls our attention from the tasks we want to do online toward the people paying to hijack our attention. It’s a terrible model that survives only because we haven’t found another way to reliably support most internet content and services—including getting individuals to pay for the things they claim to value.
Online Manipulation Taken to the Next Level
Online surveillance is not a new thing, and we’ve recently covered how seriously invasive it can become.
If you think about it, Facebook is the perfect tool to easily manipulate and monitor people as they willingly expose their life and activities online. Governments and companies are inevitably going to take advantage of this tool to fulfill their plans and maximize their profits.
It can’t be only political campaigners who used the likes of Cambridge Analytica to pickpocket our personal data; surely we’ll learn soon of the major corporations that similarly played on our online hopes and fears to sell us stuff. But we don’t have to have the full picture to know that we have to act. It could be regulation; it could be anti-trust legislation to break up those tech giants that act as virtual monopolies.
Facebook’s connection with Cambridge Analytica goes deeper too.
Ars Technica discovered that since 2015, many Android users who downloaded Facebook apps had granted those apps permission to access their contacts, including call and message logs.
Of course, most users had no idea. Facebook continued this practice until October 2017, when Google changed the way Android handles this data.
That didn’t stop Facebook from pursuing this strategy though.
Earlier this year, the Wall Street Journal revealed that users who confess their most intimate secrets to apps on their smartphones are unwittingly feeding that data to Facebook.
Alarm bells also began to sound when Mark Zuckerberg announced plans to link together WhatsApp, Instagram, and Facebook Messenger. The glaring issue here is the increased access to metadata that Facebook gains as a result.
WhatsApp requires a telephone number to register, while Facebook asks users to provide their true identities. By introducing cross-app messaging, Facebook will ultimately become even more effective at linking these identifiers in its pursuit of complete user profiles.
What Does Online Advertising on Facebook Imply?
Facebook, Google, and Alibaba are the three giants expected to attract over 60 percent of global spending on digital advertising in 2019.
Marketers and advertisers flock to these companies because they know their chances of success will be the highest possible. Not only can they reach billions of people around the world, but they can also target the right audience with their advertisements.
Your data is the product that these companies sell. Protecting it from misuse or theft is one of the greatest challenges we face today.
Columnist James Ball makes the following analogy:
If data is the new oil, then the oil wells are in the hands of a few billionaires, and we’re being pumped through the pipes. To see the extent to which this is true, we need to look no further than the increase in American wealth held by the richest 0.1% (about 160,000 families) of US society. It has risen from 7% in 1978 to more than 20% today, according to Stanford University research, with the bulk of this increase happening in the dotcom era.
Facebook Has Altered Our Online Habits, and It’s Not Healthy
Facebook now has 2.3 billion monthly active users worldwide.
I’m going to make a wild guess here: that probably leads to a pretty big number of status updates, check-ins, likes, comments, and shares every day.
But why do we do it? What is it about Facebook that hooks so many people into regularly using the platform?
Well, cognitive science suggests that social media appeals to hard-wired mechanisms in the brain. These mechanisms help us to determine our social standing within groups.
Generally speaking, a person’s standing within a group is tied to their power (their control over resources) or their popularity (how well-liked they are). Social media very directly taps into this second category.
Not only does it allow you to keep up with the latest and greatest trends within the communities that matter most to you, but it also allows you to compete for likes and comments in order to establish your social standing online.
Have you ever posted a particularly funny joke? Maybe a really nice picture? It feels good when those likes flood in. There’s an evolutionary reason for that, in fact: a little thing called dopamine.
Dopamine is a chemical released in the brain that plays a super important role in motivating your behaviors. A small amount is released when you eat some delicious food. A slightly bigger amount is delivered when you have sex. You get a little hit after you exercise. And you also receive dopamine when you have successful social interactions.
It’s the brain’s way of rewarding you for beneficial behaviors in the hope that you repeat them. You want that good feeling again, right?
Social media apps play on this chemical structure in your brain.
There is even research suggesting that a high level of Facebook use can lead to worse decision-making, with an effect comparable to drug or gambling addiction.
“With so many people around the world using social media, it’s critical for us to understand its use [and] there’s a dark side when people can’t pull themselves away. We need to better understand this drive so we can determine if excessive social media use should be considered an addiction,” said Dar Meshi, lead author of the paper.
It’s not healthy, folks.
How Much of What Facebook Did Is Legal?
Now, this is a sensitive topic. Within just one week of the Cambridge Analytica scandal emerging, Facebook faced four lawsuits.
Interestingly, one of them came from a Facebook user who filed it on behalf of 50 million people whose data was used during the 2016 elections.
Remember: it’s on you to make use of what rights you have.
Mark Zuckerberg also testified before the US Congress in April 2018.
Facebook gave multiple companies the ability to access user data long after it had stopped sharing data with most third parties. The companies involved in such deals include Netflix, Spotify, Microsoft, Sony, and Amazon.
Collecting people’s personal data for a different purpose than the one they consented to is not only deceptive; it is also illegal.
Hopefully, after everything the company has been through, Facebook will eventually begin to understand that.
The introduction of the GDPR means that EU citizens are now more protected, at least.
Under the regulation, users have the right to request an account of the information a company holds on them. The company must then provide a digital file containing that information at no cost to the user.
European users also have the right to erasure: they can withdraw consent at any time, which obliges companies to delete all the data they have gathered on them.
The state of California has passed a similar law, the California Consumer Privacy Act, which grants residents comparable rights over the information that companies hold on them, including the rights to view and erase that data.
Regulation is one way to transfer some of the power away from giants like Facebook and back into the hands of the people whose data they exploit. But even if you’re lucky enough to live somewhere covered by the GDPR or an equivalent regulation, remember one thing:
It’s still on you to make use of these new rights.
Your Privacy Is Your Responsibility
Facebook may try to change for the better. The company has publicly committed to a privacy-focused vision of the future.
However, past experience suggests that we should pay close attention to what Facebook actually does. Not what Mark Zuckerberg says.
And with new scandals constantly coming to light, whether it’s user passwords being stored in plain text or 1.5 million email contacts uploaded without consent, it really does seem like you can’t trust Facebook these days.
Governments and organizations could (and probably should) work together to make technology companies and online tools safer for everyone. And this includes how personal data is harvested and exploited.
However, in this world, you simply cannot rely on others. If you want to properly take care of your online data, you must do it yourself.
Think twice about what you post and what data you submit online.
Be critical of sponsored posts and the ads you see.
Take care with how much time you spend on social media.
Or you can also decide to delete your Facebook account permanently.
Original post by Dana Vioreanu on 4 April 2018. Updated by Tom Bradbury.