Three in four girls have been sent sexual images via apps, report finds

Photograph: Nick David/Getty Images

Study finds over half of teenagers sent non-consensual sexual images via social media apps did not report it


Schools and parents should do more to support students who are being sexually harassed through platforms such as Snapchat and Instagram, while the tech companies need to clamp down on non-consensual sexual images being sent to young people, according to new research released on Monday.

The study by academics at University College London and the University of Kent found that just over 50% of teenagers who had been sent unsolicited sexually explicit images via social media apps say they have not reported the offences to either their parents, authorities or the companies involved.

The report highlights the lack of accountability and identity-checking on platforms such as Instagram, and criticises app reporting functions as “useless”, meaning that young people are more likely simply to block offenders than to report the abuse.

Children’s safety groups have warned that the UK data watchdog must introduce age verification for commercial pornography sites or face a high court challenge over any failure to act.


Asked why they didn’t report incidents involving sexual images, about a third of the young people surveyed by the researchers answered: “I don’t think reporting works.” Just 17% of those who received unwanted sexual content reported it to the platforms involved.

Prof Jessica Ringrose of the UCL Institute of Education, one of the report’s authors, said: “Young people in the UK are facing a crisis of online sexual violence. Despite these young people, in particular girls, saying they felt disgusted, embarrassed and confused about the sending and receiving of non-consensual images, they rarely want to talk about their online experiences for fear of victim-blaming and worry that reporting will make matters worse.

“We hope this report allows all of us to better identify when and how image-sharing becomes digital sexual harassment and abuse and spread the message that, although the non-consensual sending and sharing of sexual images may be common and feel normal, it is extremely harmful.”

The study surveyed 480 young people aged 12 to 18 from across the UK, including 144 who participated in focus groups. Over half of those who had received unwanted sexual content or had their image shared without their consent said they did nothing about it. A quarter told a friend, but only 5% told their parents and just 2% told their schools.

Of the 88 girls who took part in the focus groups, three-quarters said they had received images of male genitals. They said that close to half of the harassment had come from what appeared to be adult men, including adults who had created false identities. They also received online harassment and abuse from boys in their age range and peer groups.

A spokesperson for Meta, the holding company formerly known as Facebook which operates Instagram, said the safety of young people using its apps was its “top priority”. “If anyone is sent an unsolicited explicit image, we strongly encourage them to report it to us and the police,” the spokesperson said.

A spokesperson for Snapchat said: “There will always be people who try to evade our systems, but we provide easy in-app reporting tools and have teams dedicated to building more features, including new parental tools, to keep our community safe.”
