
Meta: AI caught creating obscene images of Indian and US celebrities; Meta seeks public opinion before taking action

Meta: The Oversight Board of social media giant Meta has sought public opinion in two cases concerning action taken against obscene images created with the help of AI. The cases relate to celebrities in India and the US whose AI-generated, morphed photos were shared on Instagram and other platforms. Before deciding on the matter, the board has invited comments from the public. Here is the complete information in detail...


HARYANA NEWS HUB (Bureau): "This picture resembles a famous personality of India and was created using Artificial Intelligence (AI). The account from which this content was posted shares only AI-generated pictures of Indian women. Most of the users responding to it are from India, where the problem of deepfakes is growing." More details are in the article below...


The board, which rules on content moderation decisions, said one of the two cases involves an AI-generated image of a nude woman posted on Instagram. The image resembles a public figure from India.


The account that posted this content shares only AI-generated images of Indian women. Most of the users who responded have accounts in India, where deepfakes are a growing problem.

Meta seeks public opinion:
Meta has sought public opinion on the issue, even though the Ministry of Electronics and IT has already asked social media firms to remove AI-generated fake images and videos and has issued an advisory directing these platforms to strictly enforce their own rules.


Fake or morphed photos and videos of several Indian actresses, including Rashmika Mandanna and Priyanka Chopra Jonas, have gone viral on multiple social media platforms including Instagram, Facebook and Twitter.

The board said that in the India-related case, a user reported the content to Meta for obscenity but the report was automatically closed because it was not reviewed within 48 hours.


The board said the user then appealed to it. As a result of the board taking up the case, Meta determined that its decision to leave the content up was in error and removed the post for violating its Bullying and Harassment Community Standard.


Action required within 24 hours of a complaint:
According to the IT Rules, 2021, an online platform is required to remove content depicting full or partial nudity within 24 hours of receiving a complaint. The board has also invited public views on a case in the US, where an AI-generated pornographic image of an American celebrity was posted in a Facebook group.



The board said that most of the users who responded to the post have accounts in the US. In this case, the image had already been deemed a violation of Facebook's Community Standards and removed.


The board said the public comment window for these cases is open for 14 days, until April 30. The Oversight Board did not disclose the names or other details of the celebrities involved.
