We’re on a sinking ship, but it’s not settling on the bottom just yet …

Almost everyone in the world now has immediate access to breaking news. The downside? A lot of it is false, and that's steering society's ship into dangerous waters.

According to the Pew Research Center, a majority of U.S. adults (62 percent) get news on social media, and 18 percent do so often. That includes all social media platforms, but mostly Facebook, by far the largest online source for such content. That's frightening, considering the site is now embroiled in controversy over the dissemination of fake news.

This week, the social media giant announced it would begin working with fact-checking outlets in an effort to combat just that.

Organizations that have signed on to tackle the task include Snopes, FactCheck.org, PolitiFact, ABC News and the Associated Press.

But Facebook is also being very clear about its intentions here: it wants to stay as far away as possible from calling itself a ‘news’ media outlet. What goes up on a user’s feed is still completely up to the account’s owner, and it can come from either side of the political aisle.

“We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully,” Facebook said in a statement. “We’ve focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organizations.”

The statement goes on to outline four areas the site plans to focus on, tweaking each along the way for better accuracy.

Easier Reporting – Facebook will “make it easier to report a hoax if you see one, which you can do by clicking the upper right hand corner of a post.”
Flagging Stories as Disputed – “If the fact checking organizations [Facebook is working with] identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why.”
Informed Sharing – Facebook has “found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way. [They are] going to test incorporating this signal into ranking.” (A toy sketch of how such a signal could factor into a ranking score follows this list.)
Disrupting Financial Incentives for Spammers – Facebook is “doing several things to reduce the financial incentives” like tracking down spoof sites and not allowing spam to buy ads.
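
Facebook hasn’t published how that third signal actually works, so the sketch below is purely hypothetical: the field names, the weight, and the formula are all assumptions, meant only to make the idea concrete. The intuition is that a story whose share rate collapses once people actually read it gets its ranking score discounted.

```python
from dataclasses import dataclass


@dataclass
class StoryStats:
    base_score: float          # engagement-based score before any penalty (hypothetical)
    headline_impressions: int  # users who saw the post without opening it
    headline_shares: int       # shares made without clicking through
    reads: int                 # users who clicked through and read the story
    read_shares: int           # shares made after reading


def informed_sharing_score(stats: StoryStats, penalty_weight: float = 0.5) -> float:
    """Discount a story's rank if people stop sharing it once they've read it."""
    if stats.headline_impressions == 0 or stats.reads == 0:
        return stats.base_score  # not enough data to compute the signal
    rate_unread = stats.headline_shares / stats.headline_impressions
    rate_read = stats.read_shares / stats.reads
    drop = max(rate_unread - rate_read, 0.0)  # penalize only a post-read decline
    return stats.base_score * (1.0 - penalty_weight * min(drop, 1.0))


# A hoax that looks viral on the surface but loses sharers after reading
hoax = StoryStats(base_score=100.0, headline_impressions=10_000,
                  headline_shares=2_000, reads=1_000, read_shares=20)
# An honest story people still share after reading it
honest = StoryStats(base_score=100.0, headline_impressions=10_000,
                    headline_shares=500, reads=1_000, read_shares=60)

print(informed_sharing_score(hoax))    # 91.0 -- down-ranked
print(informed_sharing_score(honest))  # 100.0 -- unchanged
```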

The changes began rolling out yesterday (Thursday, Dec. 15), though the site says it will take a few days before every user sees the flags that let them call a story into question.

It’s been over a month since Donald Trump was elected president, and throughout that stretch many have questioned the role Facebook played in dispensing bogus stories. The problem is, the fix isn’t as easy as simply deleting accounts or posts on a user’s behalf.

The company continually walks a fine line between neutral third-party host and driving force in social manipulation, and its actions make clear this is territory Facebook isn’t comfortable occupying.

CEO Mark Zuckerberg even initially shrugged off the claim that Facebook had anything to do with tilting the election in Trump’s favor, stating: “Personally I think the idea that fake news on Facebook, which is a very small amount of the content, influenced the election in any way — I think is a pretty crazy idea.”

He eventually softened his stance, leading the company to pursue these very partnerships with fact-checking sites, all dedicated to rooting out false claims in phony media.

However, ‘fake news’ is an absurdly subjective label. What one person deems a complete representation of reality, another may find nonsensical, even dangerous. It will be relatively easy for Facebook to spot outlets that fabricate stories outright (e.g., nbcnews.com.co), though that likely won’t do much good. Trusted sources get the facts wrong too, and most rarely take any accountability for it when caught.

Take a recent article published by 7 News in Denver, CO: “CBD hemp oil named a Schedule I drug by DEA,” a piece by Deb Stanley written in response to public outcry over a move the Drug Enforcement Administration recently made regarding cannabis extracts.

There are, in fact, three blatant errors in the article* (read more about the situation here for a clearer picture of what’s going on). This is something Facebook’s new plan will never pick up on; worse, the way its algorithms are set, the site may actually help the story go viral, given the source it comes from and its current engagement levels (over a thousand shares).

That’s unfortunate, because smaller media outlets (ourselves included) got it right but don’t necessarily have the reach of the others that made (and continue to make) mistakes. (To its credit, The Cannabist posted a thorough look at the story as well, but it didn’t go up until 4 p.m. MST, well after the misinformation from other outlets had gone fully viral.) In this case, false information is being distributed with ease, and there are no checks and balances to stop it.

Stanley wasn’t alone in spreading the misinformation, either. Many national outlets built stories on the same errors, racking up tens of thousands of shares and frightening parents of children who use CBD oils for seizures. The story ruined a lot of days across America, and it shouldn’t have.

The information was completely false. This is also an example of ‘fake news.’

Zuckerberg and his team have a singularly important problem on their hands right now. For one, they must appease the masses, which is always an uphill battle no matter how it’s framed. For another, more than a billion people use the platform daily and rely on its diligence in surfacing the content that keeps them informed, an unprecedented task with no easy solutions.

We don’t envy them in this position.

But then the question comes into play: Whose job is it to monitor information? Are Zuckerberg and Facebook responsible for the content everyone posts, or does every user have an inherent obligation to check sources and seek out reputable content on their own? With such a powerful communication tool at stake, do the government and the FCC charge in and start playing Big Brother? Would we even let them?

It’s safe to say at this point that no one has any idea of what’s going on. This is a problem that will take years to fix, if it can even be fixed. What’s best is for everyone using social media as a main source of information to be diligent in seeking the truth — not just a narrative that fits personal comfort zones.

We’re on a sinking ship, but it’s not settling on the bottom just yet.

*7 News Article Errors:

– “A cannabidiol hemp oil used by some families of children with seizures has been officially named a Schedule I drug by the DEA.” (Per the DEA, the extract has always been scheduled as such, and anyone who uses it or has used it in the past was breaking federal law.)
– “That puts the oil in the same category as heroin, LSD, ecstasy, meth, peyote and marijuana.” (Methamphetamine is actually a Schedule II drug and is used in a variety of medicines worldwide.)
– “The change explains that the DEA is creating a new Administrative Controlled Substances Code Number for what it calls ‘Marihuana [sic] Extract.’” (The [sic] was unnecessary; the DEA spells marihuana with an ‘h’ for historical reasons.)