Letting YouTube’s algorithm babysit your kids can be child abuse, argues Sean Sullivan, F-Secure Security Advisor.
“You wouldn’t put your kids in front of a television showing subliminal advertising,” he told me. “You shouldn’t let a bunch of people who only care about maximizing ad dollars babysit your kids. If you think your kids can outsmart the greatest ad-serving company ever to exist, that’s a bet I don’t want to make.”
Sullivan compares YouTube to a “rabbit hole” where kids can quickly stumble onto all sorts of disturbing content. These concerns have been echoed in recent months by security researcher Steve Lord and by writer James Bridle, who was so disturbed by the site’s automated delivery of videos depicting violence and abuse that he’s been forced to “question my own beliefs about the internet, at every level.”
YouTube has responded to the reports documenting abuse of its platform by announcing it would, among other steps, add new moderators in addition to using “cutting-edge machine learning more widely to allow us to quickly and efficiently remove content that violates our guidelines.”
Sean called out a specific line of the message to underline how difficult it will be for YouTube to improve its moderation at the scale required by its goal of taking on television:
Some advertisers definitely care about the content their ads are associated with, but for many others, the cheapest possible clicks are the only concern. And while high-quality content can bring in millions of views, people are willing to stare at just about any car crash, especially children with no parent around to supervise them.
As an ad-driven platform, YouTube has also begun to hit content producers that post offensive material where it hurts most: in ad revenue. “According to estimates, accounts shut down for potential child endangerment content could’ve been netting over $500,000 a month,” Buzzfeed reports. By shutting those accounts down, YouTube also surrenders its own 45 percent cut of that revenue.
But Sean isn’t sure it’s even possible for YouTube to do enough, given the massive scale of the site’s ambitions, which he doubts even fit within the company’s stated mission.
“I can see why an open source project like Android fits with the mission of organizing the world’s information. People need an inexpensive way to access the world’s information.”
YouTube says it wants “to give everyone a voice and show them the world.” And kids are definitely being shown the world. Maybe too much of the world.
“I’m not sure the business model is compatible with kids, especially compared to Amazon Prime and Netflix kids,” he said. “If you’ve got the choice of relatively inexpensive video that’s carefully moderated, why would you go to YouTube?”
Sean does let his son use the video site but only when he’s being supervised.
“To him, YouTube is a huge Minecraft tutorial.”
His son uses the regular (adult) YouTube iPad app with content filtering on and with auto-play and ad suggestions turned off. He also erases the web history frequently to avoid building up a profile for advertisers.
Sean also uses the same settings for himself.
“My parental guidance is to live by the same guidance as my kid,” he said.
And while he trusts himself and other adults to be able to pull themselves out of YouTube’s rabbit hole, he doesn’t want to put his kid in that position.
If you’re interested in ways to monitor your child’s behavior online more like the way you do in the real world, check out our new Family Rules toolset.
January 19, 2018