YouTube is trying to bridge the gap between its dedicated kids app and regular YouTube for parents with tweens and teens.
YouTube announced Wednesday that it will launch a new “supervised” experience in beta, introducing additional features and settings to regulate the kind of content older kids can access on the platform. Content will be filtered based on the selection of one of three categories: “Explore” will serve up videos suitable for kids ages 9 and up, “Explore More” will bump them up to a category with videos for kids 13 and older, and “Most of YouTube” will show them almost everything except age-restricted content and topics that may be sensitive for non-adult viewers.
YouTube says it will use a combination of machine learning, human review, and user input to vet content, a system that has worked out spectacularly for YouTube in the past. Apparently trying to get ahead of any issues arising from its spotty moderation system, the Alphabet unit stated that YouTube knows “that our systems will make mistakes and continue to evolve over time.”
Clearly, any tool that attempts to filter inappropriate content on YouTube is welcome and necessary. But guardians can’t just rely on YouTube to take the wheel and guide their kids’ experience. We’ve seen how well that has worked out in the past with YouTube’s dedicated Kids app, which is to say, not fantastic.
Part of the problem is that YouTube’s platform, like those of other social media giants, is simply too big to moderate properly. One wrong turn can send your kid down a rabbit hole of conspiracy theories, whether they were looking for them or not. And if we’re being honest, tweens and teens will probably find a way to watch the content they want to see, regardless of whatever parental controls are set up on the home computer.
All that said, it’s nice to have a middle ground between YouTube Kids and the chaos of regular YouTube. Just don’t count on a perfect moderation system. Even YouTube says so.