One in three video-sharing users comes across hate speech, according to Ofcom.
The regulator has set out new guidance to help video sharing platforms (VSPs) keep users safer, saying that its approach balances protection of users with upholding freedom of expression.
VSPs established in the UK – such as TikTok, Snapchat, Vimeo and Twitch – are required by law to take measures to protect under-18s from potentially harmful video content; and all users from videos likely to incite violence or hatred, as well as certain types of criminal content.
Ofcom research shows that a third of users say they have witnessed or experienced hateful content; a quarter say they have been exposed to violent or disturbing content; and one in five have seen videos or content that encouraged racism.
Unlike in the regulation of broadcasting, Ofcom’s role is not to assess individual videos. Instead, it is working with the platforms so they fully understand their responsibilities. These include setting clear rules for uploading content, in particular prohibiting material relating to terrorism, child sexual abuse or racism, which are criminal offences. Platforms should also provide easy reporting and complaints processes, and implement tools that allow users to flag harmful videos easily.
Restricting access to adult material is also a requirement: VSPs that host pornographic content should have robust age-verification in place to protect under-18s from accessing it.