November 17, 2017
The app’s accidental streaming of inappropriate content exposes holes in the system’s screening process — and leads to calls for human intervention.
Kate Roddy is a contributing writer at Scoop News Group, parent of EdScoop.
YouTube Kids has been a hit with families since its launch in 2015, but the video streaming app is raising new concerns among parents whose children have been exposed to graphic and indecent content on it.
According to recent complaints, inappropriate content is sneaking past the app’s filters, creating an unsafe viewing experience for young children and highlighting the dangers of relying solely on artificial intelligence to screen content.
Staci Burns, the mother of a 3-year-old boy named Isaac, told the New York Times that she became aware of the app’s graphic content after her son approached her and told her that he was scared of the monsters showing up on his iPad.
Burns thought her son had been watching a clip from “PAW Patrol,” a Nickelodeon show popular among young children. In reality, Isaac was watching renderings of the show’s characters in a video titled “PAW Patrol Babies Pretend to Die Suicide by Annabelle Hypnotized,” which depicts a demon-possessed doll from the film “Annabelle” and the beloved characters of PAW Patrol dying in graphic and disturbing scenes.
“My initial response was anger,” Burns told the newspaper. “My poor little innocent boy, he’s the sweetest thing, and then there are these horrible, horrible, evil people out there that just get their kicks off of making stuff like this to torment children.”
Burns says this discovery was particularly troubling because of the app’s otherwise positive impact for Isaac. According to Burns, YouTube Kids helped her son learn colors and letters before other children his age.
Other parents have complained of sexually inappropriate content on the app, saying their children came across videos featuring popular TV and movie characters engaging in lewd behavior and visiting strip clubs. The BBC reported earlier this year on similar fake content.
Malik Ducard, global head of family and learning content at YouTube, responded to parents' outrage by saying that YouTube recognizes the severity of the issue and is striving to make the app completely family-friendly.
Ducard described the screening measures for YouTube Kids' content, explaining that the site uses algorithms to filter videos uploaded to YouTube and determine which are suitable for children. Those videos are then continually monitored in a process that is "multi-layered and uses a lot of machine learning," Ducard said. He added that the company encourages parents to report inappropriate videos, which YouTube employees then manually review.
Over the past month, “less than .005 percent” of the millions of videos viewed in the app were removed for inappropriate content, Ducard told the Times.
Many of the videos most recently flagged by parents were uploaded by anonymous users with channel names such as Kids Channel TV and Super Moon TV. The videos target the app’s 11 million weekly viewers by featuring familiar TV characters and including terms and phrases like “education” and “learn colors” in their titles.
“Algorithms are not a substitute for human intervention, and when it comes to creating a safe environment for children, you need humans,” said Josh Golin, executive director of the Campaign for Commercial-Free Childhood, which filed a complaint with the Federal Trade Commission in 2015 accusing YouTube Kids of deceptively marketing itself to parents, citing videos the group deemed inappropriate for children.
While many parents are abandoning the app altogether, some, like Burns, still see YouTube Kids as a beneficial learning and entertainment tool. They now realize, though, that they will need to supervise their children's use of the app more closely.