“I Share Some Caution on this [Vaccine] Because We Just Don’t Know the Long-Term Side Effects of Basically Modifying People’s DNA and RNA.”
February 16, 2021
Facebook is the foremost platform on which individual users, alongside mainstream and alternative news media, share news and perspectives. Links shared on Facebook will now be promoted or suppressed based on what the social media giant calculates as their “trustworthiness,” or lack thereof. The new program is costing “billions” of dollars, spent on both artificial intelligence and “tens of thousands of human moderators.” Facebook’s efforts are allegedly intended to combat “fake news” and “deliberate propaganda, especially in elections.”
Facebook CEO Mark Zuckerberg said Tuesday that the company has already begun to implement a system that ranks news organizations based on trustworthiness, and promotes or suppresses its content based on that metric.
Zuckerberg said the company has gathered data on how consumers perceive news brands by asking them to identify whether they have heard of various publications and if they trust them.
“We put [that data] into the system, and it is acting as a boost or a suppression, and we’re going to dial up the intensity of that over time,” he said. “We feel like we have a responsibility to further [break] down polarization and find common ground.”
Zuckerberg met with a group of news media executives at the Rosewood Sand Hill hotel in Menlo Park after delivering his keynote speech at Facebook’s annual F8 developer conference Tuesday.
The meeting included representatives from BuzzFeed News, the Information, Quartz, the New York Times, CNN, the Wall Street Journal, NBC, Recode, Univision, Barron’s, the Daily Beast, the Economist, HuffPost, Insider, the Atlantic, the New York Post, and others.
The event, called “OTR” (shorthand for “off the record”), is an annual gathering, now in its second year, at which news media executives talk shop. Zuckerberg’s remarks were initially meant to be, like the name of the conference, off the record, but he agreed to answer questions on the record.
Even if it does not continue to overtly sideline conservative content, Facebook’s AI-guided censorship squad faces an uphill battle with an American public that is deeply divided along political lines, and 54% of whom believe Facebook and Twitter are part of the problem, according to a recent Gallup/Knight Foundation study.
Politico summarizes the paper’s findings thus:
Americans have a negative view of the media, believe coverage is more biased than ever and are sharply divided in their views along partisan lines.
The study’s sponsors and academic observers fault conservatives and supporters of President Trump for creating distrust of corporate media by subscribing to “Trump’s cult of personality credo that anything or anyone critiquing him is not real.” Of course, such a view oversimplifies Trump’s stance toward the unmistakable, across-the-board negative coverage of his campaign and administration from almost all quarters of the major news media.
One might consider how alternative perspectives and conflicting information could be shared on Facebook or similar platforms once such “trustworthiness” guidelines are in play, particularly in the wake of a complex public event such as a mass shooting or terrorist attack. Past instances suggest that major media coverage and commentary are primarily devoted to publicizing the official, state-sponsored narrative. Anomalies and contradictions will be deemed “untrustworthy” and accordingly tossed down the digital memory hole.