Faced with mounting scrutiny over the content on its platform, YouTube has started reporting how widely viewed rule-breaking videos actually are.
The new Violative View Rate (VVR) represents views of rule-breaking videos – whether watched in full or in part – expressed as a percentage of total views on the platform. Violative content includes videos that breach the company's policies on child safety, violent or graphic content, nudity and sexual content, spam, and hate speech.
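The metric itself is simple arithmetic. A minimal sketch (the function name and the sample figures are illustrative, not YouTube's own code) shows how the reported range maps onto view counts:

```python
def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Views of violative videos as a percentage of all views."""
    if total_views <= 0:
        raise ValueError("total_views must be positive")
    return 100 * violative_views / total_views

# YouTube's reported range of 0.16 to 0.18 per cent corresponds to
# 16 to 18 violative views per 10,000 total views.
print(violative_view_rate(16, 10_000))  # 0.16
print(violative_view_rate(18, 10_000))  # 0.18
```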
The VVR, says YouTube, will be the primary metric by which the company measures how responsibly it is policing its platform.
“Other metrics like the turnaround time to remove a violative video are important. But they don’t fully capture the actual impact of violative content on the viewer,” says Jennifer O’Connor, YouTube’s director of trust and safety.
“For example, compare a violative video that got 100 views but stayed on our platform for more than 24 hours with content that reached thousands of views in the first few hours before removal. Which ultimately has more impact?”
Right now, she says, the VVR stands at between 0.16 and 0.18 per cent, meaning that of every 10,000 views on YouTube, 16 to 18 come from violative content. This is down by over 70 per cent compared with the same quarter of 2017, with YouTube crediting machine learning for the improvement.
The company says it’s calculating the rate by sampling videos at random for checking by its content reviewers. And, it points out, every time its content policies are updated, the rate’s likely to rise for a short period while the system works to catch up.
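YouTube hasn't published its exact methodology, but the description above is a standard sampling estimate: reviewers label a random sample of views, and the violative fraction of the sample estimates the true rate. A hedged sketch of that idea (all names and figures here are hypothetical, and the confidence interval is a textbook normal approximation, not anything YouTube has described):

```python
import math
import random

def estimate_vvr(sampled_labels):
    """Estimate the VVR from reviewer verdicts on a uniform sample of views.

    sampled_labels: iterable of booleans, True where the reviewed view
    belonged to a violative video.
    Returns (estimate_pct, margin_pct): the point estimate and a 95%
    normal-approximation margin of error, both as percentages.
    """
    labels = list(sampled_labels)
    n = len(labels)
    p = sum(labels) / n
    margin = 1.96 * math.sqrt(p * (1 - p) / n)
    return 100 * p, 100 * margin

# Simulate reviewing one million sampled views on a platform where
# 0.17 per cent of views land on violative videos (a made-up figure
# chosen to sit inside YouTube's reported range).
rng = random.Random(42)
sample = (rng.random() < 0.0017 for _ in range(1_000_000))
est, moe = estimate_vvr(sample)
print(f"estimated VVR: {est:.3f}% +/- {moe:.3f}%")
```

A sample this size pins the estimate down to within a few thousandths of a percentage point, which is why a rate as small as 0.16 per cent can still be reported with confidence.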
YouTube has come in for particular criticism for the way its recommendation system can lead viewers further and further down the rabbit-hole of extremist content.
But, it says, since investing heavily in machine learning four years ago, it is now able to detect 94 per cent of all violative content through automated flagging, with three quarters removed before receiving even 10 views.
Since the publication of its first Community Guidelines Enforcement Report in 2018, YouTube says it’s removed more than 83 million videos and seven billion comments for violating its policies.
It’s important to note, though, that these figures apply only to the content that YouTube content reviewers deem violative – and many believe that the company is far too lenient.
And with the platform receiving billions of views every day, even a low VVR can mean that millions of people are exposed to content that breaches the rules. Still, with this new move, the company goes some way towards convincing lawmakers that it's doing all it can to police its platform effectively.