Half of teenagers in the Midlands exposed to harmful content on social media


Almost half of teenagers aged 13-17 who use social media have seen posts that they believe should not be allowed, new research published this week by the Chartered Institute of Marketing (CIM) has revealed.

The survey of over 2,500 adults and teenagers, published ahead of the close of the Government’s consultation on online harms, shows that 95 per cent of young people aged 13-17 have a social media account, with the most popular being YouTube (79 per cent), followed by Instagram (73 per cent), Snapchat (66 per cent) and Facebook (45 per cent).

The study shows that despite many children coming across potentially harmful posts on social media platforms, very few are doing anything about them.

Almost two thirds of teenagers who have seen content they think they shouldn’t have say they either rarely or never report these posts. Only seven per cent say they always do.

Seeing this type of content does appear to discourage youngsters from engaging online, though not from using social media altogether.

Almost half agree that they would be put off from engaging in discussions and conversations online, but very few are prepared to give up their accounts; two in three said that seeing posts on social media that should not be allowed would not make them want to delete their account, while more than half (52 per cent) said it would not put them off signing up for an account in the first place.

Who is responsible?

When it comes to who should be protecting children from harmful or inappropriate content on social media, the public place responsibility on parents and social media companies.

Three quarters of people over 18 say it is the responsibility of parents/guardians and social media companies to protect children on social media.

Across the Midlands, 28 per cent of adults say they have seen harmful content online in the last six months, with 47 per cent never or rarely reporting it.

However, most people believe strongly that social media companies should be removing harmful content from social media.

Revenue from marketing and advertising is the main source of income for most social media companies, and the Chartered Institute of Marketing believes more must be done to protect users if UK businesses are to continue spending their marketing budgets reaching customers through social media platforms.

Dr Paul Connor, Midlands Chair at CIM, said: “We are calling on the government to launch a social media marketing campaign to educate people on the importance of reporting harmful content and arming them with the tools to do so.

“The number of children who have seen inappropriate posts on social media and failed to report them is nothing short of alarming, and while more adults do report harmful content, it is concerning that only one in five adults frequently do so.

“Social media companies will soon have a legal responsibility to act once harmful content has been flagged, but we don’t believe we should wait for the regulations to take effect. If we take the simple action of hitting the report button, we could all make a huge difference now.”

The research also demonstrates the prevalence and impact of harmful content being seen by adults on social media:

- Younger adults are much more likely to recall seeing harmful content than older generations; 46 per cent of 18-24 year olds say they have seen it in the last six months, compared with only 16 per cent of those aged 55 and over.