Tech companies are at a stalemate with fake news and conspiracy theories

Alex Jones. Screenshot/Infowars

  • Social media companies have tried several approaches to tackle fake news, but they're not stamping out the problem.

  • YouTube's latest attempt to tackle the misinformation issue is to include text from Wikipedia on conspiracy theory videos.

  • But Wikipedia isn't designed to debunk conspiracy theories, and often has its own accuracy issues.

  • Companies often use "window dressing" to indicate that content is fake rather than actually removing it.



It's no secret that social media sites like Facebook, Twitter, and YouTube have a problem with fake news and conspiracy theories. If you were to rely on them to get your news, you may end up with a distorted view of current events.

Recently, videos accusing Parkland school shooting survivor David Hogg of being an actor were featured prominently on YouTube and Facebook. YouTube featured a false video about Hogg at the top of its trending chart, and Facebook linked to misleading videos about him in its trending module.

Large technology companies have tried countless ways to fight back against fake news and conspiracy theories.

Facebook tried adding a big, red "disputed" tag to stories but ditched that feature after it realised that the tag actually made people click on false stories even more.

Twitter emailed nearly 678,000 users to inform them that they may have interacted with accounts linked to a Russian propaganda factory.

Now it's YouTube's turn to try to fix the problem. The company has come up with a new way to try to stop fake news and conspiracy theories from proliferating on its platform: adding text from Wikipedia below the videos.

Let's say you're on YouTube late at night and stumble upon a video about chemtrails. Take "'Chemtrails' — How They Affect You and What You Can Do." The video claims that planes flying overhead are causing everything from autism to water pollution.

At the moment, there's nothing on the video to warn you that it's an unscientific conspiracy theory. You might even believe it and write a song about chemtrails, as the video suggests.

But YouTube's new plan will likely put text from Wikipedia underneath the video, informing you that chemtrails are a conspiracy theory.

Wikipedia isn't the solution to fake news

YouTube's idea of adding text from Wikipedia underneath conspiracy theory videos seems like a good compromise. The site isn't a partisan news source, so everyone will take its word as truth, right?

Well, no. Wikipedia gets things wrong too. In December, Wikipedia seemed to cause Apple's virtual assistant Siri to tell people that actor John Travolta had died. The site even maintains its own list of hoaxes that started on Wikipedia.