Didn’t Detect Any Child Sex Abuse Material On Platform: YouTube To Centre

YouTube has submitted its formal response on the issue.

New Delhi:

YouTube on Monday said it did not detect child sexual abuse material on its platform despite multiple investigations, and that it has not received evidence of such content on the video streaming platform from regulators.

The statement from a YouTube spokesperson came after the government issued notices to social media platforms, including YouTube, X (formerly Twitter) and Telegram, earlier this month asking them to take down child sexual abuse material from their platforms in India.

In a statement, a YouTube spokesperson said: “We have a long history of successfully fighting child exploitation on YouTube. Based on multiple thorough investigations, we did not detect CSAM on our platform, nor did we receive examples or evidence of CSAM on YouTube from regulators.” 

The video platform owned by Google further said that “no form of content that endangers minors is allowed on YouTube, and we will continue to heavily invest in the teams and technologies that detect, remove and deter the spread of this content.” 

“We are committed to working with all collaborators in the industry-wide fight to stop the spread of child sexual abuse material (CSAM),” the YouTube spokesperson added in an e-mail statement.


In Q2 2023, YouTube removed over 94,000 channels and over 2.5 million videos for violations of its child safety policies.

According to YouTube, in India, it shows a warning at the top of search results for specific search queries related to CSAM. This warning states child sexual abuse imagery is illegal and links to the national cyber crime reporting portal.

The government, on October 6, said notices have been issued to social media platforms X (formerly Twitter), YouTube and Telegram to remove child sexual abuse material from their platforms in India.

Minister of State for Electronics and IT Rajeev Chandrasekhar had warned that if social media intermediaries do not act swiftly, their safe harbour status under Section 79 of the IT Act would be withdrawn, meaning the platforms could be directly prosecuted under the applicable laws and rules even though the content may not have been uploaded by them.

“Ministry of Electronics and IT has issued notices to social media intermediaries X, YouTube and Telegram, warning them to remove Child Sexual Abuse Material (CSAM) from their platforms on the Indian internet.

“The notices served to these platforms emphasise the importance of prompt and permanent removal or disabling of access to any CSAM on their platforms,” the statement by the government on October 6 had said.

The notices also called for the implementation of proactive measures, such as content moderation algorithms and reporting mechanisms, to prevent the dissemination of CSAM in the future.

(Except for the headline, this story has not been edited by The Hindkeshari staff and is published from a syndicated feed.)
