“To achieve their objectives, the planners of psychological warfare campaigns first attempt to gain total knowledge of the beliefs, likes, dislikes, strengths, weaknesses, and vulnerabilities of the target population. According to the CIA, knowing what motivates the target is the key to a successful PSYOP. [An Introduction to Psychological Warfare]”
Think about that the next time you click “Like.”
Big Data companies are not only collecting our likes and dislikes; they are building behavioral profiles of users to determine our vulnerabilities and weaknesses. Knowing what motivates a target audience is the key to a successful PSYOP campaign, but it’s also the key to a successful marketing campaign. What’s in question is the morality of how that information is gathered.
It used to be that marketers paid consumers to participate in focus groups and asked their permission to poll for preferences. Marketers had to work for their insights, get permission to gather them, and pay individuals for them.
Today, all that information is gathered, aggregated, and stored in massive, centralized data centers. Much of it is collected without our permission, and none of us are compensated for it. Worse, the data on each of us is used to build a profile, which becomes our digital identity, which in turn determines what we are served: not just ads, but news and other media.
Psychological warfare is the battle for the “hearts and minds” of a particular group of people to motivate them to behave in a certain way that benefits the manipulator.
“The best minds of my generation are thinking about how to make people click ads, and that really sucks.” (Jeff Hammerbacher, formerly of Facebook)
It’s almost refreshing to watch the impeachment hearings because of their old-school, face-to-face political bullying. A PSYOP on a massive scale, such as the one Cambridge Analytica launched using Facebook data, should be terrifying.
The New York Times and The Observer of London reported that Cambridge Analytica, a political data firm founded by Stephen K. Bannon and Robert Mercer, the wealthy Republican donor, had used the Facebook data to develop methods that it claimed could identify the personalities of individual American voters and influence their behavior. The firm’s so-called psychographic modeling underpinned its work for the Trump campaign in 2016, though many have questioned the effectiveness of its techniques.
We should be questioning the morality of how our data is used. In fact, we should be outright controlling every pixel of it. The EU is taking steps in this direction with the General Data Protection Regulation, or GDPR. Article 25 introduces the concept of “privacy by design and default.” This is a huge shift: rather than having to “opt out” of invasive features, users must explicitly agree to share their data with third parties. What we learned from the Cambridge Analytica overreach is that privacy is personal and consent is sacred. The data of 50 million unsuspecting people was used to manipulate behavior. That should never happen in a democracy.
The right to digital privacy is explored in the documentary The Great Hack. Journalist Ben Kenigsberg had this to say: “This review of ‘The Great Hack’ is the first article that I’ve felt mildly concerned about emailing to my editors. Why am I even using the internet? Why is Twitter open on another tab? Wouldn’t it be smarter to disconnect, move to the woods and live off the land?”
The choice always has been, and always should be, an individual one. And rather than the long rifle of the Kentucky woodsman, our weapon should be the ability to own our digital identity and circumvent the centralized clouds of data that can so easily be used against us. Give me digital liberty, or give me death.