Meta, the parent company of Facebook and Instagram, took action against over 1 crore pieces of content related to child endangerment, covering nudity, physical abuse and sexual exploitation, in 2023.
According to Meta’s India monthly reports under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, the period from January 2023 to December 2023 saw action against 1,21,80,300 such content pieces.
The company said: “We measure the number of pieces of content (such as posts, photos, videos or comments) and we take action for going against our standards. This metric shows the scale of our enforcement activity. Taking action could include removing a piece of content from Facebook or Instagram or covering photos or videos that may be disturbing to some audiences with a warning.”
From January 1 to December 31, Facebook’s data showed that a total of 46,81,300 pieces of content related to ‘Child Endangerment – Nudity and Physical Abuse’ and ‘Child Endangerment – Sexual Exploitation’ faced action.
Meanwhile, Instagram saw a higher number, with 74,99,000 pieces of content facing action during the same period. A significant portion of these pertained to sexual exploitation, including 4 million instances in January 2023 alone.
According to Meta: “In July 2018, we updated our methodology to clarify how many discrete pieces of content we’ve taken action on for violating our policies, and we will continue to mature and improve our methodology as part of our commitment to providing the most accurate and meaningful metrics. Overall, our intention is to provide an accurate representation of the total number of content items that we take action on for violating our policies.”
Notably, Meta CEO Mark Zuckerberg recently joined executives from X (formerly Twitter) and TikTok in appearing before the US Senate Judiciary Committee, amid rising concern among US lawmakers and parents about the impact of social media on young people.
In India, the central government has made its position clear: the aim is to ensure that the internet in the country is open, safe, trusted and accountable to all Digital Nagriks (digital citizens).
Rule 4(2) of the IT Rules, 2021 requires major social media platforms to assist law enforcement agencies in identifying the originator of information linked to various sensitive matters, including national security, foreign relations, public order, and offences like rape, sexually explicit content, or child sexual abuse material (CSAM).
Minister of State for Electronics and IT Rajeev Chandrasekhar stated in a parliamentary response: “The IT Rules, 2021 cast specific legal obligations on intermediaries, including social media intermediaries and platforms, to ensure their accountability towards safe & trusted Internet including their expeditious action towards the removal of the prohibited information which are obscene, pornographic, paedophilic, invasive of another’s privacy including bodily privacy, etc. including any misinformation, patently false information and deepfakes.”
“In case of failure of the intermediaries to observe the legal obligations as provided in the IT Rules, 2021, they lose their safe harbour protection under section 79 of the IT Act and shall be liable for consequential action or prosecution as provided under any law for the time being in force including the IT Act and the Indian Penal Code such as under sections 292 and 293 of the IPC,” he added.