Police Uncover 'Suicide' Instagram Group Created by Teenage Girls as Young as 12
Police have uncovered a secret Instagram group named 'Suicide'. It involved 12 girls, aged between 12 and 16, across southern England.
Within the group, the girls reportedly made plans that led to "suicidal crises" and "serious self-harm".
The investigation reportedly uncovered the online group after three of the girls went missing and were later found in London, having travelled there separately by rail to meet. They were discovered in a serious condition in the street and taken by ambulance to hospital for emergency treatment.
According to a briefing published on 25th March, one of the girls said they had initially met online, where they had been discussing suicide.
Police officers then examined digital devices to identify the online group and its other members.
Seven of the twelve girls had self-harmed before police traced them. Children's social care services from seven local authorities have taken part in safeguarding the children identified as group members.
Police said in a statement that "peer-to-peer influence escalated suicidal ideation among the children involved, in many cases to the point of suicidal crises and serious self-harm."
Instagram has claimed it found no evidence of its platform rules being broken; the company uses Artificial Intelligence (AI) to detect and remove self-harm posts and groups.
Suicide Instagram group created by teenage girls: members also met on other social media platforms
Some of the children had also met on other social media platforms, but were members of a closed Instagram group, a direct-message thread whose title included words such as "suicide" and "missing".
Facebook, the owner of Instagram, admitted that the name of the closed group referenced "suicide". However, it said the group had not been removed from the platform because the content of the messages did not break any of its rules.
In a statement, an Instagram spokesperson said the company was working closely with the police:
"We have reviewed the reports and, so far, have found no evidence that the people involved broke our rules around suicide and self-harm.

"We do not allow graphic content, or content that promotes or encourages suicide or self-harm, and we remove it when we find it.

"We will continue to support law enforcement agencies and remain available to respond to any valid legal request for further information."
Facebook and other social media platforms need to step up their systems for spotting this kind of harmful content, as many users appear to have found ways to evade AI detection. More human involvement would not be a bad idea.
What’s your suggestion?