About once a day, while I’m scrolling through TikTok or Instagram in a state of euphoric online dissociation, I’ll come across a video that’s so stupid that it briefly makes me consider putting my phone down and doing some actual work (I never do, but still).
I’m not talking about your run-of-the-mill stupid videos either, like of somebody falling down or doing an embarrassing little dance. I’m talking about those “hack” videos where a person tells you that you should try cooking steak in a toaster, or bedazzling your tongue with sequins.
Those videos that look like something Neil Buchanan would make if he lost his mind, where a person encourages the viewer to do something completely pointless that at best won’t work, and at worst could harm them.
As frustrating as those videos are, it’s that potential for harm that I find the most disturbing about them. Not only is this misinformation often completely unregulated, but it is also actively promoted – and in some cases protected – by the websites that host it.
Last week, a YouTuber by the name of Ann Reardon, whose channel How To Cook That has become popular in recent months for debunking fake viral video trends, found that one of her videos had been removed by YouTube for violating the site’s policy against harmful and dangerous content.
The video in question was titled “Debunking DEADLIEST craft hack, 34 dead”, and in it Reardon discussed the viral DIY phenomenon of fractal wood burning, where a strong electrical current is passed through a damp plank of wood to create unique and visually interesting burn patterns. There are a number of videos online that teach people how to build the device needed to create the patterns, which involves dismantling an old microwave and repurposing the transformer.
According to Reardon, the phenomenon has caused more than 30 reported deaths by electrocution in the US, 30 in the UK, and several in her home country of Australia. As Reardon discussed in her follow-up video, she initially believed that her video had been removed in a blanket effort to remove all mention of fractal wood burning from the site: an action she supported even if it meant losing views on her own channel.
However, she discovered that while her own video warning of the dangers of the practice had been removed, the tutorials that her video was criticising had been allowed to remain. While her video was eventually reinstated, the entire episode highlights a huge issue not just with YouTube, but with online content in general.
Reardon provides a valuable public service with her videos, but it’s a service that shouldn’t need to be provided in the first place. Why does it fall to a YouTuber to hold this content to account, even after it has caused dozens of preventable deaths? You can’t show a boob on TV before a certain time in the UK without Ofcom coming down on you like the Rapture, but kids can access a video that may as well be called “How to kill yourself quickly and effectively” between illegally uploaded Paw Patrol episodes.
I think a big part of the problem with trying to argue this point is that people have begun to conflate censorship and free speech with basic health and safety standards. It’s difficult to argue for the removal of internet content, because there are so many groups with a vested interest in keeping the internet loosely regulated.
We live in an age of unfettered propaganda, and it is an age in which many people are thriving. On an internet with a regulatory body that takes decisive action against misinformation, you wouldn’t have incidents like in 2019, when the Conservative Party press office briefly rebranded as FactCheckUK to trick people into believing it was a neutral fact-checking body during that year’s election. Cambridge Analytica would be much more limited in its involvement with the Brexit referendum. Donald Trump would probably have been kicked off Twitter in 2009, five minutes after creating his account.
Perhaps the biggest thing keeping the misinformation engine running, though, is us. These ridiculous hack videos are dumb, but people seem to keep watching them. Troom Troom, a channel that once made a video for children full of dangerous advice about how to escape a kidnapping, has 23 million followers. 5-Minute Crafts, which once released a video advising people to bleach strawberries and use them as food decoration, has 77 million.
While writing this article, I checked Snapchat twice! I saw a video of a woman straining pasta into a toilet because “that’s how the Italians do it”! What is wrong with me? Am I sick?
The problem is that for the vast majority of us, the internet is a dopamine machine that we plug ourselves into without a second thought for the quality of the content we’re being fed. It’s like the Matrix, except that instead of being controlled by super advanced robots, it’s run by unsupervised 17-year-olds with too much time on their hands. We consume a thousand little hits of information a day, and it’s impossible to fact check all of them, so we move on to the next thing without dwelling on the previous one too much. “Maybe you can make steak in a toaster,” you tell yourself. “What am I, a meat scientist?”
I don’t even think a lot of these videos are being made maliciously. I’m pretty sure they’re just created by people who are competing in an entertainment landscape that never sleeps, and therefore requires constant output. There are only so many interesting things in the world that you can make videos about; maybe we’ve just started to run out, so these content farms have had to resort to making things up instead. That doesn’t excuse harmful content, but it does go some way towards explaining why it exists in the first place.
In a world that refuses to regulate this type of content, maybe it’s on us to be more discerning about what we consume. Maybe you should think twice before sharing that too-good-to-be-true life hack. Maybe you should supervise your kids’ internet use a little more closely. Maybe I shouldn’t have checked Snapchat for a third time before finishing this article.
Above all else, maybe we shouldn’t trust these content creators to hold themselves to account, or worse, to expect individuals like Ann Reardon to step up and hold them to account for us. There is no incentive for them to self-regulate, or to ensure that their content adheres to any kind of standard of quality; it is our job to give them one.