January 25, 2023
YouTube is disabling comments on most videos that include children and minors, in response to renewed brand-safety concerns from advertisers.
The dramatic changes to the platform’s commenting system came after a number of high-profile advertisers, including Disney, Hasbro and AT&T, pulled all ads from YouTube.
The move followed reports that …
The solution is dead simple: “... Those creators will be required to actively moderate their comments, beyond just using our moderation tools, ...”
If content creators or platform hosts want to offer a "clean" comments section, their only option is to do the required work themselves. Pre-approval by actual human moderators, before any comment can be posted, is by far the best way to help ensure a civil comments section.
Relying on any "bad post" reporting system is almost useless, no matter how quickly the offending post is identified and removed. As the article states, moderation must be active and preemptive, not reactive. If a site doesn't want to spend what's required to moderate its comments, then it has no reason to complain when its comments sections are shut down by the platform owners.
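The pre-approval workflow described above can be sketched as a simple hold-and-review queue: nothing is published on submission, and a human moderator's decision is the only path to publication. The class and method names below are illustrative, not any platform's actual API.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    author: str
    text: str
    approved: bool = False

class PreModerationQueue:
    """Holds every submitted comment until a human moderator reviews it."""

    def __init__(self):
        self._pending = []    # awaiting human review
        self._published = []  # approved and visible

    def submit(self, comment: Comment) -> None:
        # Nothing goes live on submission; it only enters the review queue.
        self._pending.append(comment)

    def review(self, approve) -> None:
        # `approve` stands in for the human moderator's judgment:
        # a callable that returns True to publish, False to reject.
        for comment in self._pending:
            if approve(comment):
                comment.approved = True
                self._published.append(comment)
            # Rejected comments are simply dropped, never published.
        self._pending = []

    def published(self) -> list:
        return list(self._published)
```

The key design point is that `published()` can only ever return comments that passed a review step, so a toxic post is never visible, even briefly, the way it would be under a report-and-remove system.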
There are solutions available now, today, for influencers and brands. Respondology offers a tool that removes toxic comments – hateful, predatory or otherwise – on YouTube and Instagram as protection, and it can also surface who the bad actors are. Filtering technology plus 1,000 U.S.-based moderators make it happen. It's a personal mission of mine, as President of Respondology and father of two boys online. We're here to help. Respondology.com or Erik@respondology.com