I still remember what I was doing the moment I first heard about Facebook’s ties to the now-infamous Cambridge Analytica. I’d just sat down at my desk, ready to face the day with a cup of hot coffee in hand. As I often do before starting work, I checked the news alongside my emails – all the better to keep myself apprised of the goings-on in the tech industry.
At first, I couldn’t believe what I was reading. A company that has, for decades, worked behind the scenes to influence the ideas and behavior of voters across the world. An organization whose activities may have influenced the outcome of several major elections – including one last year in Kenya.
An organization which, using the data of over 87 million Facebook users, may have played a major role in Trump’s 2016 election victory.
I had to double-check to ensure I hadn’t stumbled across some tinfoil hat conspiracy blog. The story read like something you might see on some dark corner of the Internet, dreamed up by someone with more time than sense. No such luck there – I was on the website of The Washington Post.
“In interviews with The Guardian and The New York Times, [Christopher] Wylie, [former research director at Cambridge Analytica], confirmed that his company had taken the data from millions of users without their consent,” writes Sue Halpern of The New Republic. “Cambridge Analytica had used the information to identify Americans’ subconscious biases and craft political messages designed to trigger their anxieties and thereby influence their political decisions…[they] turned this technique sideways, with messaging that exploited people’s vulnerabilities and psychological proclivities.”
The more you think about it, the more unpleasant it gets. Like it or not, we all have psychological blind spots. We all have vulnerabilities, insecurities, and uncertainties.
And there have always been people in both the private and public sector who would use those against us. Advertisers who manipulate people into purchasing their products by chipping away at their self-esteem. App designers who study their target audience to create as addictive an experience as possible.
Firms like Cambridge Analytica who manipulate people into political extremism.
The problem is that, thanks to the wealth of information available about us online, these shady men and women have an easier job than ever before. They can use data science to drill down to our deepest insecurities. They can track our demographic data, browsing habits, and social media activity to build a disturbingly precise picture of just how to make us do what they want.
“Those with authoritarian sympathies might have received messages about gun rights or Trump’s desire to build a border wall,” Halpern continues. “The overly anxious and insecure might have been pitched Facebook ads and emails talking about Hillary Clinton’s support for sanctuary cities and how they harbor undocumented and violent immigrants. Alexander Nix, who served as CEO of Cambridge Analytica until March, had earlier called this method of psychological arousal the data firm’s ‘secret sauce.’”
If your skin crawled a little reading that statement, you’re not alone. It’s not a pleasant concept to confront by any means – the notion that we may be inadvertently dancing on someone else’s strings. That what we see online may be pushing us in subtle ways towards ideas that we might not otherwise consider.
It’s not like this sort of thing is without precedent, either. Facebook has a long history of treating the data of its users with something close to measured contempt. Take a 2012 experiment where the social network manipulated the news feeds of its users to see what impact it would have on their emotional state.
If I were more paranoid, I might even point to that as a predecessor to Cambridge Analytica.
The good news is that with the European Union’s General Data Protection Regulation leading the charge, governments across the world are clamping down on companies that might abuse our privacy. Before long, we’ll move into a future where individual users have greater ownership over how and where their personal data is used. Even so, that’s no guarantee that our data will be safe.
After all, Facebook openly violated its own terms of service by working with Cambridge Analytica. And there will always be businesses that don’t play by the rules – businesses that don’t respect regulatory guidelines. On top of that, it’s going to be some time before these regulations are both widely accepted and fully enforceable.
Regulations aside, a few things are going to need to change in order for us to cut down on the kind of manipulation we’ve described here – and here’s where your business comes in.
- Terms-of-service (TOS) documents that are easier to read and understand. As it stands now, the majority of privacy policies and contracts people are forced to read through are overly convoluted, written more for legal professionals than for end users.
- Greater transparency around data collection and usage, with a focus on end-user consent. This may make certain marketing processes more difficult, but in the long run, giving people greater control over their information will make them more likely to choose your brand over its competitors.
- Better data monitoring and data security tools, including an enterprise file sync and share (EFSS) solution that ensures files passing outside the corporate firewall are still under an organization’s control.
- A better understanding of cognitive liberty – that is to say, the right of people to control their own preferences, choices, beliefs, and political leanings.
We live in strange times. A few years ago, an organization like Cambridge Analytica would have seemed like fiction – like something you’d see in a dystopian fantasy or spy thriller. But that organization, and others like it, very much exist.
Awareness of that is the first step in moving away from the dark side of data science, and towards the positive changes it can bring into the world.