Instagram announced a new Schools Partnership program designed to help educators report instances of potential cyberbullying, a move in response to rising criticism about the negative impact social media is having on the health of young people.
Cyberbullying concerns from educators at participating schools will automatically be prioritized for review, and educators who report the problems will be notified if the company acts on their requests. Schools in the program—which was announced March 25—will also receive resources on how to help students navigate the internet safely.
Bullying is a major concern for parents as well as educators. A Pew Research Center survey of parents conducted in the fall of 2022 found that nearly three-quarters of parents said they were either very or somewhat concerned about their child being bullied, up from 60 percent in 2015.
Sameer Hinduja, the co-director of the Cyberbullying Research Center, a nonprofit organization, noted that only about 1 of every 6 teenagers who experience cyberbullying report it to their schools. That is the case even though they are often targeted by classmates.
One reason: Educators aren’t in a position to do much about cyberbullying. Instagram’s partnership program changes that, Hinduja said.
“This system will now fast-track solutions that students and educators want to see, so that everyone can get back to living and learning as soon as possible,” Hinduja said in a statement.
But Merve Lapus, the vice president of Common Sense Media, a research and advocacy organization focused on youth and technology, said the move isn’t necessarily going to suddenly make Instagram, which is owned by Meta, a child-friendly place.
Giving schools a streamlined process for reporting bullying incidents doesn’t mean that the platform will suddenly shift its business model from one developed primarily to keep users on the site to one “designed for kids with their development in mind. That’s not what this is,” Lapus said.
California law may have prompted Instagram to act, expert suggests
The development of the program may be, at least in part, about moving toward compliance with a new California law, Lapus said. That measure, enacted last year, requires social media platforms to acknowledge cyberbullying concerns raised by parents, guardians, or school administrators within 36 hours. Platforms must also take action within a certain timeframe, typically 30 days.
The announcement “does show that state-level legislation can drive some meaningful policy change [by social media companies],” Lapus said.
Platforms like Instagram can take steps that would be more beneficial to students, such as changing their algorithms so that tweens and teens can see posts they are actually interested in, as opposed to content that might be damaging to their mental health, Lapus added.
Documents released in 2021 through a whistleblower revealed that Meta—which owns Facebook and Instagram—had conducted extensive research showing that its platforms harmed children’s mental health and contributed to the spread of false information, but that the company largely failed to act on those findings.
“Young boys who want to be more fit and look up exercise shouldn’t fall deep into the deep negative side of the manosphere,” said Lapus, referring to websites that promote misogyny and attack feminism. “Young girls who are interested in video creation shouldn’t be prompted into things like makeup and beauty, if that’s not what they’re looking for.”