In March 2018, Facebook came under fire once again for allowing too much data mining. This time, data from 50 million Facebook users was obtained by Cambridge Analytica, a former consultant to the Trump campaign, and used to influence elections. The New York Times and The Guardian were the first to report the story, and the news caused Facebook shares to plummet 7% in a single day, the worst one-day decline since 2014.
Even before this turmoil began, researchers had found in 2013 that Facebook likes could automatically and accurately predict very personal attributes of users.
Computers record, analyze, and look for subtle patterns in the online behavior tracked through Facebook likes. Companies typically create third-party apps that users link to their Facebook accounts, which allows those companies to market and sell their products more accurately: cosmetics, appliances, diapers, and other consumer products and services. Netflix is one example—it offers a personalized viewing experience by recommending shows and movies based on how a user's viewing behavior matches that of other users.
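The "recommend what similar users watched" idea can be sketched as user-based collaborative filtering. The sketch below is a minimal, self-contained illustration, not Netflix's actual system: the user names, show names, and ratings are all invented, and real recommenders use far more data and more sophisticated models.

```python
from math import sqrt

# Invented ratings for illustration: three users, a handful of shows.
ratings = {
    "u1": {"drama_a": 5, "drama_b": 4, "scifi_a": 1},
    "u2": {"drama_a": 4, "drama_b": 5, "scifi_a": 2, "drama_c": 5},
    "u3": {"scifi_a": 5, "scifi_b": 4, "drama_a": 1},
}

def similarity(a, b):
    """Cosine similarity over the shows both users rated."""
    shared = ratings[a].keys() & ratings[b].keys()
    if not shared:
        return 0.0
    dot = sum(ratings[a][s] * ratings[b][s] for s in shared)
    na = sqrt(sum(ratings[a][s] ** 2 for s in shared))
    nb = sqrt(sum(ratings[b][s] ** 2 for s in shared))
    return dot / (na * nb)

def recommend(user):
    """Suggest the unseen show best liked by the most similar user."""
    peers = sorted((u for u in ratings if u != user),
                   key=lambda u: similarity(user, u), reverse=True)
    nearest = peers[0]
    unseen = ratings[nearest].keys() - ratings[user].keys()
    return max(unseen, key=lambda s: ratings[nearest][s]) if unseen else None
```

Here `recommend("u1")` finds that u2's tastes track u1's closely and suggests the drama u1 hasn't seen; the same matching logic, applied to likes instead of ratings, is what makes ad targeting possible.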
Cambridge Analytica used the same data-mining techniques, but it wasn't selling juicers or diapers; it used the data for political ad targeting. Facebook said that Cambridge improperly gathered information from 270,000 people who downloaded a personality-test app. These users had agreed to share their data for research purposes, not for political or election manipulation.
Aside from the data people knowingly shared, Cambridge got hold of their friends lists and their friends' likes, enabling the company to analyze 50 million Facebook profiles.
Chris Wylie, who co-founded Cambridge Analytica and left the company in 2014, later became a whistleblower, disclosing that Cambridge gathered user data and built an information cocoon designed to change people's perceptions of what is happening around them.
“This is based on an idea called ‘informational dominance,’ which is the idea that if you can capture every channel of information around a person and then inject content around them, you can change their perception of what’s actually happening,” Wylie said in an interview on NBC’s Today program.
Wylie further claimed that Cambridge Analytica's goal was to "explore mental vulnerabilities of people," and that its political ads were targeted at specific voters. Cambridge made it a point not to let these people know that the messages they were receiving were intended to influence their views.
Trump’s campaign team had denied using Cambridge’s data for the 2016 elections, although President Donald Trump’s campaign chairman and adviser was the firm’s former vice president, Stephen Bannon.
Cambridge admitted to having helped Texas Republican Senator Ted Cruz win the Iowa caucus. Using data gathered by the firm and through Cruz's campaign app, Cambridge found that automated red light cameras were an important issue to residents who disliked government intrusion. Direct messages saying Cruz was against red light cameras were sent to voters who lived near them.
Cambridge also tailored messages regarding gun rights, according to the firm's CEO, Alexander Nix. Personality types guided the targeting: users classified as introverts might have received messages saying guns protect against crime, while people who value tradition might have been told that fathers and grandfathers should be able to keep giving shooting lessons.
US and EU officials investigated Cambridge's improper data collection, but neither Facebook nor Cambridge Analytica was particularly cooperative. This pushed Britain's information commissioner to seek a warrant to access the servers of Cambridge Analytica, a British firm.
Facebook has long made it easy for advertisers to target users, particularly through its mapping of the "social graph" (or open graph, as Facebook calls it), which reflects the real-life connections of everyone on Facebook.
According to Jonathan Albright, research director of the Tow Center for Digital Journalism at Columbia University, advertisers and researchers can look at the variation among a user's friends and identify which friends are influential. Facebook users who bridge different clusters of friends have more influence when they post something, making them the ideal targets.
Facebook and other social media sites have become the primary source of news for the majority of people, and these companies should take responsibility for the day-to-day dissemination of accurate information rather than fake news that can manipulate users.
Date: November 6, 2019 / Categories: Interesting / Author: Disturbinsilence