Meta (formerly Facebook) has announced a new safety tool, expected to launch later this year, aimed at blocking children from receiving nude images, and discouraging them from sending them, even in encrypted chats.
The tool is likely to be optional and available to adults on Instagram and Facebook.
This comes in response to criticism from government officials and police after Meta introduced default encryption for Messenger chats, which raised concerns about its impact on detecting child abuse.
The new feature is designed to protect users, particularly women and teenagers, from the sharing of explicit images.
Meta also revealed that minors will, by default, be unable to receive messages from strangers on Instagram and Messenger.
The company faces scrutiny, with legal filings alleging that 100,000 teenage users of Facebook and Instagram are harassed online daily.
Despite the move to encryption, Meta insists the new feature does not amount to client-side scanning; instead, it will use machine learning to identify nudity, running entirely on the user's device.
The company emphasizes its commitment to child safety, citing over 30 tools introduced to address the issue.
Additionally, parental supervision tools will now allow parents to deny teenagers’ requests to change default safety settings.
BBC