
Most child abuse content uploaded on social media from India

By Hafiz Oneeb Rashid
December 20, 2023

BRUSSELS: Last year, the highest number of images and videos of child abuse on social media was uploaded from India, while three major Islamic countries — Bangladesh, Pakistan and Indonesia — took the third to fifth slots on the list.

This was stated in a report issued by Europol, the European police agency headquartered in The Hague. The report's findings were published by Politico, a global nonpartisan politics and policy news organization launched in Europe in April 2015.

A man uses a smartphone; Facebook, Instagram and WhatsApp can be seen on the screen. — AFP/File

In 2022, 5.67 million images and videos of child sexual abuse were uploaded from India, compared with 2.76 million from the Philippines, 2.15 million from Bangladesh, 2.1 million from Pakistan and 1.8 million from Indonesia.

According to Europol sources, the list covers the 20 largest contributor countries and regions in the world; following the first five are the United States, the European Union, Vietnam, Iraq, Mexico, Algeria, Colombia, Brazil, Saudi Arabia, Thailand, Egypt, the United Kingdom, Venezuela, Turkey and the UAE. Within the European Union, Europol says, France and Poland uploaded the most content containing child abuse.

According to the report, the social media platforms Facebook and Instagram reported 21.17 million and 0.55 million obscene images or pieces of content to national authorities, respectively.

In 2022, social media giant Meta Platforms found and reported 26 million images on Facebook and Instagram. Teenagers’ favorite apps Snapchat and TikTok filed over 550,000 and nearly 290,000 reports, respectively, to the U.S. National Center for Missing and Exploited Children, an organization that acts under U.S. law as a clearing house for the child sexual abuse material (CSAM) that technology firms detect.

The European Commission in December also ordered Meta to explain what it was doing to fight the spread of illegal sexual images taken by minors themselves and shared through Instagram, under the EU’s new content-moderation rulebook, the Digital Services Act (DSA).

The gravity of the pedophilia crisis deepened when someone uploaded pictures of the molestation of a newborn girl. A group of international detectives tried to identify details — a toy, a clothing label, a sound — that would allow them to rescue the girl and arrest those who sexually abused her, recorded it and then shared it on the internet.

Even a tiny hint could help track down the country where the baby girl was assaulted, allowing the case to be transferred to the right police authority for further investigation. Such details matter when police are trying to tackle crimes carried out behind closed doors but disseminated online across the world.

Finding and stopping child sex offenders is gruesome and frustrating most of the time — yet hugely rewarding at others — police officers who are part of the international task force at the EU agency Europol told POLITICO.

Offenders are getting better at covering their digital tracks and law enforcement officials say they don’t have the tools they need to keep up. The increasing use of encrypted communication online makes investigators’ work harder, especially as a pandemic that kept people at home and online ramped up a flood of abuse images and videos.

Politicians across the world are keen to act. In the European Union and the United Kingdom, legislators have drafted laws to dig up more illegal content and extend law enforcement’s powers to crack down on child sexual abuse material.

Europol created a database in 2016 and this system now holds 85 million unique photos and videos of children, many found on pedophile forums on the “dark web” — the part of the internet that isn’t publicly searchable and requires special software to browse.