The Blue Whale Challenge has spread its reach to India as well. So far it has claimed seven lives across different states of the country. The online suicide game's toll in India began when a Class 10 student in Mumbai jumped to his death from a multi-storey building. On Thursday, Fortis Healthcare announced a 24x7 helpline for those experiencing stress and anxiety due to the Blue Whale Challenge. Now Facebook has stepped in to curb the deaths linked to this online suicide game. It is working with suicide prevention partners to track phrases, hashtags and group names associated with online challenges encouraging self-harm or suicide.
"We offer resources to people that search for these terms on Facebook," the social media giant said.
The Blue Whale Challenge provokes and psychologically manipulates the player into self-harm through 50 tasks that stretch over a span of 50 days. The game concludes when the player ends his own life.
Facebook said it also removes content that violates its Community Standards, which do not allow the promotion of self-injury or suicide. Starting on World Suicide Prevention Day on September 10, Facebook said it would also connect people in India with information about supportive groups and suicide prevention tools in News Feed.
"Facebook is a place where people connect and share, and one of the things we have learnt from the mental health partners and academics we have worked with on this issue, is that being connected is a protective factor in suicide prevention," said Ankhi Das, Director of Public Policy for Facebook in India, South and Central Asia.
Additional resources about suicide prevention and online well-being will also be added to its Safety Center, Facebook said.
With these resources, people can access tools to resolve conflict online, help a friend who is expressing suicidal thoughts or get resources if they are going through a difficult time.
"We care deeply about the safety and millions of people in India who use Facebook to connect with the people who matter to them, and recognise there's an opportunity with these tools and resources to connect someone who is struggling with a person they already have a relationship with," Das said.
"We have teams working around the world, 24/7, who review reports that come in and prioritise the most serious reports like suicide. For those who reach out to us, we provide suggested text to make it easier for people to start a conversation with their friend in need," Facebook said.
"We provide the friend who has expressed suicidal thoughts information about local help lines, along with other tips and resources," it added.
Facebook's Safety Center offers guidance to parents, teenagers, educators and law enforcement agencies on starting a conversation about online safety, with localised resources and videos available online. Netizens can also reach out to Facebook if they see anything that makes them worried about a friend's well-being.
(With IANS Inputs)