Surprise! Russia’s fake news campaigns aren’t swaying opinions and elections — you are.  

See, while Facebook’s Mark Zuckerberg is being called to task by Congress to answer for the so-called “data breach” perpetrated by Cambridge Analytica, which harvested and categorized the profiles of 50 million Facebook users without their knowledge, statisticians and psychologists are still out there mapping America’s attitudes and feelings with greater accuracy than ex-lovers.

It happens when you pop online and take a seemingly harmless survey like “What Kind of Condiment Are You?” (pickle, btw). Each answer lets psychological masterminds with powerful servers and complex algorithms capture data about who you are as a person.

And Facebook has made no secret that it serves you the ads and posts it thinks you’ll be most interested in, driving traffic to those pages to amplify the company’s revenue potential. This is common sense, sure.

But it isn’t anything new either, really. Marketers have been studying what makes you buy since the days cavemen were bartering two goats for a mate. Yet it's the accuracy of today’s data that's cause for alarm. “If the goal is to get you to purchase a particular brand of toilet paper or soda, well it could be chalked up to strategic capitalism,” says Angela Bourne, psychology consultant for Colorado Association of Commerce and Industry. “When that data is used to change votes or influence ideology — even when information is based on facts — the result can be catastrophic. It’s how we ended up with someone as unqualified as Donald Trump.”

The study of you has gotten remarkably intricate, detailed and even sinister. For instance, third-party software might notice that you liked Taylor Swift’s page. You did it one evening on the W line because you were bored, drunk and wanted to follow her Twitter feud with Kanye West or maybe see her skimpy Grammy Awards dress.

What you probably didn’t notice was that researchers had written elaborate code to make some creepily accurate profile assumptions about you. For instance, based on your Taylor Swift “like,” you were branded as very extroverted. The data also categorized you as empathetic and concerned about what others think. And you’re probably not as open-minded as most, according to the data.

Years ago, researchers at Cambridge University, some of whom have since moved to Stanford, developed an early Facebook app called myPersonality. The project evolved into a more invasive prediction tool called Apply Magic Sauce. The data collected was mined to determine a person’s openness, conscientiousness and even neuroticism. The 100-question quiz was taken by more than 7.5 million people who willingly authorized researchers to access their Facebook profile data, all of which was allowable under Facebook’s terms of service at the time.

The data gathered from the social media profiles was then correlated with the answers given on the quiz. Within seconds, fast data-analysis servers were drawing up alarmingly accurate snapshots of real people based solely on their “likes,” no quiz necessary.
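To make that concrete, here is a minimal, hypothetical sketch of that kind of pipeline: a big, sparse table of who liked what, compressed and fed to a simple linear model trained against quiz scores, so that afterward likes alone are enough to produce a prediction. The data, the model choices (TruncatedSVD plus ridge regression) and every number below are invented for illustration; this is not the researchers’ actual code.

```python
# Hypothetical sketch: predict a quiz-derived trait from "likes" alone.
# All data and model choices are illustrative, not the original research code.
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# A sparse users-by-pages matrix: 1 means "this user liked this page."
n_users, n_pages = 5_000, 20_000
likes = sparse_random(n_users, n_pages, density=0.002, random_state=0,
                      data_rvs=lambda n: np.ones(n)).tocsr()

# Pretend these are openness scores taken from the 100-question quiz.
openness = rng.normal(50, 10, size=n_users)

X_train, X_test, y_train, y_test = train_test_split(
    likes, openness, test_size=0.2, random_state=0)

# Compress the huge like matrix, then fit a simple linear model to the scores.
model = make_pipeline(TruncatedSVD(n_components=100, random_state=0),
                      Ridge(alpha=1.0))
model.fit(X_train, y_train)

# Once trained, the quiz is no longer needed: likes alone yield a prediction.
predicted = model.predict(X_test)
print(predicted[:5])
```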

The algorithm then began to compare us to one another, making categorical generalizations that, combined with the totality of our unique social media profiles, produced a menu of our traits for sale to anyone willing to pay for the information. Our personalities became an incredibly valuable commodity. For instance, the data revealed that someone who liked Fight Club was far more likely to be open to new experiences than someone who liked American Idol.
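A toy example of how a generalization like that could be computed: average the quiz-derived openness scores of everyone who liked a given page, then compare pages. The users, pages and scores below are made up for illustration.

```python
# Hypothetical illustration: average a quiz-derived trait score per liked page.
from collections import defaultdict

# (user, liked_page, openness_score_from_quiz) -- invented records
records = [
    ("u1", "Fight Club", 82), ("u2", "Fight Club", 75), ("u3", "Fight Club", 90),
    ("u4", "American Idol", 41), ("u5", "American Idol", 56), ("u6", "American Idol", 48),
]

totals = defaultdict(lambda: [0.0, 0])   # page -> [sum of scores, count]
for _user, page, openness in records:
    totals[page][0] += openness
    totals[page][1] += 1

for page, (score_sum, count) in totals.items():
    print(f"{page}: average openness {score_sum / count:.1f}")
# In this toy data, Fight Club likers average far higher openness than
# American Idol likers, the kind of pattern the researchers reported.
```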

It seems arbitrary, but was it accurate? Yes. Yes it was.

To test this accuracy, Stanford professor Michal Kosinski provided a 10-question version of the quiz and asked 32,000 of the original respondents to give that quiz to friends and family. The shorter version asked friends, family, spouses and co-workers specific questions about the respondent.

Measured against the answers provided by real human beings who knew the respondents personally, the model could, with just 10 likes, predict a respondent’s attitudes more accurately than a co-worker could. With 70 likes, it was more accurate than a roommate or best friend; and with 300 likes, it could predict behavior better than a spouse.
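What “more accurate” means here is, roughly, a higher correlation with people’s own self-reported scores. Here is a hedged sketch of that comparison, with invented numbers standing in for the real self-reports, model predictions and co-worker ratings.

```python
# Hedged sketch: "accuracy" as correlation with self-reported trait scores.
# The self-reports, model predictions and co-worker ratings are invented.
import numpy as np

rng = np.random.default_rng(1)

self_reports = rng.normal(50, 10, size=200)                 # "ground truth"
model_predictions = self_reports + rng.normal(0, 6, 200)    # model built from likes
coworker_ratings = self_reports + rng.normal(0, 12, 200)    # a human judge

def accuracy(judgments, truth):
    """Pearson correlation between a judge's ratings and self-reports."""
    return np.corrcoef(judgments, truth)[0, 1]

print("model vs self-report:    ", round(accuracy(model_predictions, self_reports), 2))
print("co-worker vs self-report:", round(accuracy(coworker_ratings, self_reports), 2))
# With enough likes, the model's correlation overtakes the human judge's,
# which is what "more accurate than a co-worker" means in this context.
```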

Kosinski said the model was surprisingly accurate when “predicting life outcomes such as substance abuse, political attitudes and physical health.”

Researchers believe their algorithms can predict your political ideology better than the people with whom you share intimate relationships. Armed with the knowledge of how pliable you are, companies such as Cambridge Analytica can populate your News Feed with stories written in such a way as to sway opinions based on how you process information and how you interact with the world.

“We find your voters and move them to action,” the firm had emblazoned on its website.

It’s easy to blame Russian trolls for swaying the election with misinformation and perpetuating divisiveness today. It’s easy to blame fake news for producing stories that haven’t been verified from sources that haven’t been vetted. It’s easy to casually blame the stupidity of everyone else on social media. But before anyone starts pointing accusatory fingers at others, we ought to think back to our first Facebook profile pictures — and start with the man (or woman) in the mirror.