LONDON -- Instagram has agreed to ban graphic images of self-harm after an outcry in Britain over the suicide of a teenager whose father said the photo-sharing platform had contributed to her decision to take her own life.
Instagram chief Adam Mosseri said Thursday evening the platform is making a series of changes to its content rules.
He said: "We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable in our community."
Mosseri said further changes will follow.
"I have a responsibility to get this right," he said. "We will get better and we are committed to finding and removing this content at scale, and working with experts and the wider industry to find ways to support people when they're most in need."
The call for changes was backed by the British government after the family of 14-year-old Molly Russell, who died in 2017, found material related to depression and suicide on her Instagram account.
Her father, Ian Russell, said he believes the content Molly viewed on Instagram played a contributing role in her death, a charge that received wide attention in the British press.
The changes were announced after Instagram and other tech firms, including Facebook, Snapchat and Twitter, met with British Health Secretary Matt Hancock and representatives from the Samaritans, a mental health charity that works to prevent suicide.
Instagram is also removing non-graphic images of self-harm from searches.
Facebook, which owns Instagram, said in a statement that independent experts advise that Facebook should "allow people to share admissions of self-harm and suicidal thoughts but should not allow people to share content promoting it."