We can't always stop people saying nasty things, says Instagram boss
By Jamie Harris, Press Association Science Technology Reporter
Instagram's boss has said it cannot tackle the issue of bullying alone, as the platform continues to grapple with balancing what people say and share.
The Facebook-owned app has been under the spotlight over cyberbullying and images of self-harm on the social network. The latter were banned in February following the death of teenager Molly Russell, whose family found material relating to depression and suicide on her Instagram account after she took her own life in 2017.
Adam Mosseri, head of Instagram, told BBC Radio 1's Newsbeat that bullying is an issue broader than its platform, but admitted that the public criticism the company receives is "healthy" in tackling it.
"We definitely think about what Instagram might be like for the average man or woman," he said.
"We don't want people getting depressed on our platform but we can't stop people from saying mean or nasty things sometimes, so it's a balance.
"Generally, we're trying to figure out how to do more to nurture the positive uses."
Mr Mosseri also responded to comments from US singer Selena Gomez, who recently said she does not keep Instagram on her own phone because she started to feel "depressed" from it.
"I wouldn't extend Selena Gomez's experience of the platform to what your experience or my experience might be like," he continued.
"She has over 100 million followers, it's a whole other world."
Instagram is currently exploring whether to hide the number of likes content has publicly, in a bid to ease some of the pressure users feel on the platform.
"We generally want Instagram to be less of a pressurised environment, we don't want people to compete," Mr Mosseri added.
"We need to make sure that creators like her are getting value out of the platform, that they don't get depressed by the platform.
"But the tools that we need to develop for a 15-year-old boy or 14-year-old girl are very different."
Social networks have come under increased scrutiny about their effects on young people and the vulnerable, with the Government putting forward plans to address a host of online harms under the watch of a regulator.
A White Paper published earlier this year proposed strict new rules requiring firms to take responsibility for the safety of their users, as well as for the content that appears on their services.
Child sexual abuse and exploitation, harassment, cyberstalking, and hate crime are among a list of areas the Government wants to be legally overseen by an independent regulator, after deciding that social networks and web giants can no longer be relied on to self-regulate.