Susan Wojcicki, former advertising executive-turned-YouTube CEO, announced on March 13 that the world’s foremost online video-sharing platform will have a new feature called “information cues” on controversial videos that contradict information approved and distributed by government sources and mass media.
YouTube will introduce a new tool to combat online conspiracy theories in the coming weeks, the latest effort from Google’s video site to halt the spread of misinformation.
Videos propagating conspiracy theories about events like the moon landing will now be accompanied by text from Wikipedia providing facts that counter the theory.
“Our goal is to start with a list of conspiracies around the internet where there’s a lot of active discussion,” Wojcicki declared at the South by Southwest conference in Austin, Texas.
Following the announcement of YouTube’s new Cass Sunstein-style paternalism, some have already argued that the feature doesn’t go far enough. Wired, for example, writes that merely linking to official narratives, the government’s own theories seeking to explain complex events, likely won’t solve the problem when concerned citizens take to the internet to look beyond what corporate-controlled news media report:
The recommendation system isn’t designed to ensure you’re informed; its main objective is to keep you consuming YouTube videos for as long as possible. What that entails has mostly been an afterthought. Even if every conspiracy video is served up with a Wikipedia article contradicting the information that it presents, there’s no guarantee that users will choose to read it over the video they’ve already clicked on.
Take, for example, what happens when you search conspiracy theorist Alex Jones’ videos about the Parkland shooting. After watching one, YouTube recommends you then watch another of Jones’ videos, this time about how the Sandy Hook shooting was a hoax. It doesn’t suggest that you watch a factual clip about Parkland or Sandy Hook at all.
Yet what constitutes “factual” today isn’t determined by a body of truly independent fact-checkers, but rather by the officialdom standing behind a podium, often found responding to pre-scripted questions. Matsakis rightly concludes, “Wikipedia in particular can also be edited by anyone, and its own reliability issues of misinformation [sic].”
In fact, YouTube’s effort to provide “information cues” that at once reinforce the legitimacy of official narratives and mass double-think overlooks (perhaps purposefully) the fact that Wikipedia itself is manipulated by powerful political forces who seek to reshape historical and current events in ways that reflect and serve their own interests, as the videos below suggest.