Instagram makes a major change to protect users from receiving unwanted nude photos in their direct messages


The Meta-owned app is releasing a tool called 'nudity protection' that blurs naked photos received via direct messages (DMs).

It uses AI to automatically identify male or female genitals in sent photos before obscuring them, similar to technology that already exists in dating apps.

However, the receiver still has a chance to tap and see the photo properly, even if they're as young as 13 – which one children's charity has called 'entirely insufficient'.

Instagram requires users to be at least 13 years old to create an account – a threshold already criticised by experts and the public alike as far too young.

Meanwhile, sending an unsolicited photo of genitalia constitutes 'intimate image abuse'.

What's more, 'sextortion scammers' who receive intimate images may threaten to share them unless a ransom is paid by the unfortunate sender.

According to an investigation earlier this year, 100 children a day are falling victim to sextortion scams on social media, including one 16-year-old boy who killed himself.

'To help address this, we’ll soon start testing our new nudity protection feature in Instagram DMs, which blurs images detected as containing nudity and encourages people to think twice before sending nude images,' Meta says.

According to Meta, which is led by billionaire Mark Zuckerberg, the new tool will be rolled out in the coming months.

Nudity protection will be turned on by default for 13 to 17-year-old users globally, but Instagram will show a notification to users aged 18 and above encouraging them to turn it on too.

When nudity protection is active and someone receives a nude photo, the photo will be automatically hidden – but the recipient will have the opportunity to tap 'view photo'.

They'll then see the photo heavily blurred with the warning message, 'photo may contain nudity' – and they'll then be able to tap 'see photo' to see it unblurred.

They can also tap to block the sender and report the chat, or tap another button to see further safety tips.
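The recipient-side flow described above – detect, blur by default, then offer 'see photo', block and safety-tip options – can be sketched as simple client logic. This is an illustrative sketch only: `classify_nudity` is a hypothetical stand-in for Meta's unpublished on-device classifier, and the threshold and action names are assumptions, not the real Instagram implementation.

```python
def classify_nudity(image_bytes: bytes) -> float:
    """Hypothetical stand-in for an on-device ML nudity classifier.

    Returns a confidence score in [0, 1]. A real system would run a
    trained image model here; this placeholder just keys off a marker
    prefix so the sketch is runnable.
    """
    return 0.95 if image_bytes.startswith(b"NUDE") else 0.05


def deliver_photo(image_bytes: bytes, threshold: float = 0.8) -> dict:
    """Decide how the DM client should render an incoming photo.

    Flagged photos are hidden behind a blur with a warning and a set of
    safety actions; everything else renders normally.
    """
    score = classify_nudity(image_bytes)
    if score >= threshold:
        return {
            "render": "blurred",
            "warning": "Photo may contain nudity",
            "actions": ["see photo", "block sender", "report chat", "safety tips"],
        }
    return {"render": "normal", "warning": None, "actions": []}


# Example: a flagged image is blurred; an ordinary image is not.
print(deliver_photo(b"NUDE...")["render"])
print(deliver_photo(b"cat photo")["render"])
```

The key design point, as the article describes it, is that the blur is applied client-side by default and the final choice to view stays with the recipient.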

And if they try to forward a nude image they’ve received, they'll see a message encouraging them to reconsider.

For the person sending the nude, the process will be different.

The photo will be blurred for them too in the chat thanks to the automatic AI nude detection technology, but the sender will get a different pop-up message.

The message will read: 'Take care when sharing sensitive photos... others can screenshot or forward your photos without you knowing.'

It will also let them unsend the nude photo if they’ve changed their mind, but will tell them 'there's a chance others have already seen it'.

They'll also be given the option to see the safety tips, which have been developed with guidance from experts, Meta said.

Instagram DMs use end-to-end encryption, which ensures only the two participants of the chat can see messages – making it easier for pedophiles to go undetected by law enforcement agencies, according to some experts.

UK children's charity the NSPCC said end-to-end encryption on Instagram and other Meta platforms like WhatsApp and Messenger means the new tool is 'entirely insufficient'.

'More than 33,000 child sexual abuse image crimes were recorded by UK police last year, with over a quarter taking place through Meta’s services,' said Rani Govender, senior policy officer at NSPCC.

'Meta has long argued that disrupting child sexual abuse in end-to-end encrypted environments would weaken privacy but these measures show that a balance between safety and privacy can be found.

'However, the new measures will be entirely insufficient to protect children from harm and must go much further to tackle child abuse.

'It is now inconceivable for Meta to continue down the route of willfully losing the ability to identify and disrupt abuse through their decision to implement end-to-end encryption without safeguards for children in place.'

'A real sign of their commitment to protecting children would be to pause end-to-end encryption until they can share with Ofcom and the public their plans to use technology to combat the prolific child sexual abuse taking place on their platforms.'

Instagram has a strict policy against posting naked photos on the main element of the app – the photo feed – although this policy doesn't seem to apply to DMs.

This includes a ban on photos and videos showing female nipples, with exceptions for women who are 'actively breastfeeding'.

Unfortunately, several OnlyFans stars have exploited this loophole by posting videos of dolls held up to their exposed breasts, pretending to breastfeed.

Users slammed Instagram for not noticing the issue, with one saying it makes a 'complete mockery of mothers everywhere'.  

Earlier this year, Instagram finally started hiding posts that can pose serious harm to children, including those related to suicide, self-harm and eating disorders.

It follows serious concerns about how teens are affected by social apps, including 14-year-old Instagram user Molly Russell who killed herself in 2017.

Social media giants like Instagram, Facebook, X and TikTok made an estimated $11 billion a year from users under 18

The giants of social media generated $10.7 billion in US advertising revenue in 2022 from users aged 17 or younger, a study shows.

Harvard researchers found that about 20 percent ($2.1 billion) of revenue from Facebook, Instagram, Snapchat, X and YouTube came from children 12 and under.

The other $8.6 billion came from users aged 13 to 17.

The study, published Wednesday, claims to provide the first estimate of how much money this handful of companies makes from young internet users.

source: Daily Mail UK