LONDON: Children in Britain stumble on violent content online, including material promoting self-harm, while still at primary school and say it is an “inevitable part” of using the internet, according to research published on Friday.
The report underlines the challenge facing governments and tech groups worldwide - such as Meta META.O, which owns Facebook, Instagram and WhatsApp, Google's GOOGL.O YouTube, Snap Inc's Snapchat SNAP.N, and ByteDance's TikTok - in enacting safeguarding measures, especially for minors.
Britain passed legislation last October that set tougher rules for social media platforms, including a mandate for them to prevent children from accessing harmful and age-inappropriate content by enforcing age limits and age-checking measures.
The law gave Ofcom, Britain's media regulator, the power to fine tech companies that fail to comply with the new requirements, but the penalties have not yet come into force because the regulator must first produce codes of practice to implement them.
Messaging platforms led by WhatsApp have opposed a provision in the law they say could force them to break end-to-end encryption.
All 247 children, aged 8-17, interviewed for the report - commissioned by Ofcom and carried out between May and November - had come across violent content online, mostly via social media, video-sharing and messaging sites and apps, Ofcom said.