How does it feel, sitting there on the digital shelf? Have you checked your best-buy date? I think I’m still good for a few more years yet.
It may not feel like it, but on the internet, the product companies are selling is you. Facebook isn’t a social media company, it’s a people factory. It processes you, formats you, and wraps you up in a neat little database. Then it mass produces you and sells you at a discount to anyone with a credit card.
Four years ago, a British political consulting firm named Cambridge Analytica colluded in a campaign to capture profile information from Facebook users. In the end, it would lead to a scandal involving the user information of more than 70 million Americans, the use of psychometrics as a new political tool, and an influence campaign that may have turned the tide in two world-altering elections a continent apart.
Let’s start at the beginning. In 2014, a lecturer from Cambridge University, Aleksandr Kogan, formed a UK company called Global Science Research (GSR). He then developed a Facebook app posing as a personality survey. He paid American Facebook users $1 to $4 each to download the app and fill out the personality test, for a total of nearly $800,000. In the process, those users gave the app permission to collect their profile data. Whether Kogan did this on his own or at the encouragement of Cambridge Analytica is open to debate, depending on whom you ask.
In any case, around 270,000 people downloaded the app and filled out the survey. Next to America’s population of 325 million, that may not sound like many people, but under Facebook rules at the time (which were changed in 2015 in response to this incident), when users gave the app permission to collect their profile data, they also gave it permission to collect the profile information of their friends as well. Since the average Facebook user has between 100 and 500 friends, this meant the app was able to collect the profile information of nearly 87 million people.
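A quick back-of-the-envelope check shows how those two figures fit together. This sketch simply divides the reported totals; the inputs are the numbers cited above, not data from Facebook itself:

```python
# Reported figures from the Cambridge Analytica story
direct_users = 270_000        # people who installed the survey app
total_profiles = 87_000_000   # profiles ultimately harvested

# Implied average number of profiles reached per installer,
# via the friends-of-users permission that existed before 2015
profiles_per_installer = total_profiles / direct_users
print(round(profiles_per_installer))  # ≈ 322, inside the 100-500 friend range
```

In other words, each installer's friend list multiplied the app's reach by roughly 300x, which is how a modest survey snowballed into tens of millions of profiles.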
The data the app collected wasn’t simply ordinary information like work history and places lived. It also pulled other user data which Facebook collects, such as the posts you’ve ‘liked,’ status updates you’ve posted, and the groups you belong to.
Kogan then began working with another company, Strategic Communications Laboratories (SCL), the parent company of the aforementioned Cambridge Analytica. Up until this point, Kogan had not done anything illegal or against Facebook’s terms and conditions. But when he shared the data with SCL, he broke Facebook’s rules, which stipulate data acquired through an app cannot be shared with another entity without first obtaining Facebook’s permission.
SCL is a private behavioral research and strategic communications company, purchased by billionaire conservative donor Robert Mercer in 2013. It analyzes large sets of data and attempts to identify patterns in them for use in political marketing. Taking Kogan’s data, with information about pages you follow, posts you like and create, comments you leave, and much, much more, a team of psychologists and data analysts looked for ways to target people for maximum effect. It’s called psychographic profiling, and it’s the new weapon in political warfare.
Let me give you a real-world example of the type of data these apps collect. If I go to my Facebook settings and select ‘Apps,’ I get a list of the apps that I’ve used on Facebook. Clicking on an app pulls up a screen that tells me what permissions I have granted. The app “80’s One Hit Wonders,” which I don’t even remember signing up for, lists nearly 20 different categories of information to which it has access. This includes my hometown, birth date, friends list, work and education history, religious and political views, status updates and more than a dozen other categories. I am most definitely deleting this app.
This is the type of information Kogan shared with Cambridge Analytica, through their parent company SCL. Cambridge Analytica, a subsidiary of SCL founded just after Mercer’s acquisition of the company, was the brainchild of Mercer political advisor and former Trump Chief Strategist, Steve Bannon. The creation of Cambridge Analytica was an attempt to harness the psychological techniques of its parent company for the domestic political scene, and was used by several important political campaigns, including those of Ted Cruz and Donald Trump, as well as the Brexit initiative which successfully withdrew the United Kingdom from the European Union.
What sets SCL and Cambridge Analytica apart from other similar data-marketing companies is the way they approach their influence campaigns. They employ a developing science called “psychographic targeting.” This is the process of tweaking your market-targeting based on the psychological characteristics of your intended audience.
Cambridge Analytica’s parent company, SCL, first honed its skills in cyber-psychological warfare by messing with the elections in developing countries: “Psyops. Psychological operations – the same methods the military use to effect mass sentiment change,” a former Cambridge Analytica employee told The Guardian in May 2017. “It’s what they mean by winning ‘hearts and minds.’ We were just doing it to win elections in the kind of developing countries that don’t have many rules.”
This anonymous former employee is speaking about the company’s work prior to 2013, before the success of SCL’s foreign influence campaigns attracted the interest of wealthy American hedge fund manager and tech entrepreneur, Robert Mercer, and his political ally, Steve Bannon, who were looking to bring those modern techniques of psychological warfare to the political battlefield back home.
Imagine targeting users who are members of the Facebook group, Mothers Against Drunk Driving (MADD), with ads depicting horrific car crashes and a message suggesting one of the candidates in a political race will go easy on drunk drivers. Would such a campaign be likely to sway some of those voters, even if its claims were untrue?
Now, in lieu of drunk driving, imagine instead targeting the darkest aspects of human nature: racism, hate, sexism, the worst extremes of political partisanship. Afraid someone will take away your guns? There’s an ad for that. Worried about your religious liberty? Don’t worry, there’s an ad for that. Hate immigrants or Muslims? There’s a – well, you get the picture.
And it gets even more deeply duplicitous than that. Not only did they target the most vulnerable people on the political fringe, but those targeted ads might link to articles on fake news websites which look eerily similar to real news sites like Fox or MSNBC. The whole idea is to trick visitors into thinking they are viewing an article from a legitimate source. The web address of the page might be “msnbc.com.co,” but most people won’t even notice the extra “.co” at the end. Even the links back to the homepage at the top of the article will likely take visitors back to the real MSNBC website, so that anyone leaving the page will think they’ve just read an article published and endorsed by a legitimate news organization. In this way, innocent people become unwitting conspirators in spreading fake news, and it helps fuel the public’s current distrust of national news sources.
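The “msnbc.com.co” trick works because the extra “.co” makes it an entirely different registered domain, even though the familiar name still appears in the address. A minimal sketch of how one might check whether a link really belongs to a site (the function name and example URLs here are illustrative, not from any real checker):

```python
from urllib.parse import urlparse

def belongs_to(url: str, real_domain: str) -> bool:
    """Naive check: is the link's hostname the real domain itself,
    or a genuine subdomain of it? A lookalike such as msnbc.com.co
    merely *contains* the name and fails this test."""
    host = urlparse(url).hostname or ""
    return host == real_domain or host.endswith("." + real_domain)

print(belongs_to("https://www.msnbc.com/story", "msnbc.com"))  # True
print(belongs_to("https://msnbc.com.co/story", "msnbc.com"))   # False
```

A production checker would also consult the public suffix list, since naive suffix matching can be fooled in other ways; the point here is only that domain names read right-to-left, so everything before the true registered domain is decoration.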
This scandal with Cambridge Analytica has caused an identity crisis for Facebook, too. On the surface, Facebook appears to be a platform designed to facilitate communication, and that is the description promoted by the company itself, but a number of cracks have begun to show through this carefully constructed facade.
The scary truth, which nobody wants to talk about, is that Facebook is a company designed to make money for its creators and stockholders. It does this by encouraging the sharing of personal data by its users, and then making that information available for use by marketers who buy ads on the platform. The more users the platform has, and the more data those users share, the more valuable Facebook is to its investors. Facebook is thus confronted with a dilemma: it must reassure its users that their information is safe, even as its business model is designed to exploit the information of those very same users.
Facebook itself is built to addict its users. The more people using the platform, the more ads that can be shown, and the more money Facebook makes. The constant endorphin-spiking feedback loop of likes, notifications and updates, serves to addict users as surely as any drug. “They’ve created the attention economy and are now engaged in a full-blown arms race to capture and retain human attention, including the attention of kids,” says Tristan Harris, a former Google design ethicist, who now serves as a senior fellow for the nonprofit advocacy group, Common Sense Media.
The internet has changed the face of commerce. But the most important product being purchased on the internet is not the latest toy marketed on Amazon, or the newest video streaming service. In the internet age, the most valuable commodity is you. Your information, your vote, and your efforts in pushing the agenda of those with money, means, and power.
Eric W. Austin lives in China and writes about community issues and technology. He can be reached by email at firstname.lastname@example.org.