Social Media Algorithms: A Guide for Parents and Schools

 

At Yipiyap, we've always believed in the power of young people helping young people. For over 12 years, we've seen first-hand how peer support can transform lives. But we've also seen how the digital world our students navigate is becoming increasingly complex – and sometimes concerning. A recent BBC investigation brought this into sharp focus. When they created social media accounts for fictional teenage boys, something troubling happened: within just hours, these profiles were bombarded with violent content. Street fights and aggressive videos began flooding their feeds. As an organisation that works closely with thousands of young people, this hit close to home.

It's why we're more passionate than ever about creating safe, positive spaces for young people to learn and grow. Through our new app Peerscroller, we're bringing our peer-to-peer approach into the digital age. Instead of letting algorithms decide what content young people should see, our brilliant peer mentors create TikTok-style videos about everything from sex and relationships to study skills and digital wellbeing – all fact-checked and designed to inform, not overwhelm.

Although Peerscroller can help, we know that one app isn't the whole solution – supporting our young people takes a combined effort. In this guide, we'll break down how social media algorithms work (in plain English!) and share practical steps that parents and schools can take to keep our children safe in the digital world. Because when it comes to online safety, knowledge and support are the best tools we can give our young people.

How do social media algorithms work?

Think of social media algorithms as incredibly enthusiastic (but not always wise) friends who are desperate to show you more of what they think you like. Every time you pause on a video, like a post, or share something with friends, these algorithms are taking notes. "Aha!" they think, "they spent an extra second looking at that football clip – let's show them lots more sports content!"

According to Meta, their algorithms track every interaction – from the posts you like to how long you watch videos – to determine what appears in your feed.

It's a bit like a super-powered recommendation system. Here's what happens behind the scenes:

  1. They watch your every move: Not in a creepy way, but they do notice everything you do on the platform. How long you watch videos for, what you comment on, what you share, and even what makes you scroll quickly past. Every tiny interaction helps build a picture of what content you might want to see more of.

  2. They play matchmaker: Once they think they know what you like, algorithms start serving up similar content. Watched a few gaming videos? Get ready for lots more gaming content. Paused on a few workout videos? Prepare for your feed to become very fitness-focused!

  3. They work at lightning speed: The BBC's investigation highlighted that it took just hours for new accounts to be flooded with specific types of content. That's because these algorithms work incredibly quickly to build a profile of what they think you want to see. (For the curious, there's a small illustrative sketch of this loop just after this list.)
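
To make that loop a little more concrete, here's a deliberately over-simplified sketch in Python. The interaction data, category names and scoring weights are entirely made up by us for illustration – real platforms use far more signals and large machine-learning models – but the basic pattern of "watch everything, build a profile, serve more of the same" is the one described above.

```python
# A toy model of an engagement-driven feed. Illustrative only: the
# categories, weights and videos below are invented for this guide.

from collections import defaultdict

# 1. "They watch your every move": every interaction nudges a score for
#    the category of content the user engaged with.
def update_profile(profile, category, watch_seconds, liked, shared):
    profile[category] += watch_seconds       # longer watch = stronger signal
    if liked:
        profile[category] += 5               # likes count extra
    if shared:
        profile[category] += 10              # shares count even more

# 2. "They play matchmaker": rank candidate videos by how well each one
#    matches the categories the profile scores highest.
def rank_feed(profile, candidates):
    return sorted(candidates, key=lambda v: profile[v["category"]], reverse=True)

profile = defaultdict(float)

# 3. "They work at lightning speed": a couple of interactions is enough
#    to tilt the whole feed.
update_profile(profile, "street fights", watch_seconds=12, liked=True, shared=False)
update_profile(profile, "football", watch_seconds=3, liked=False, shared=False)

candidates = [
    {"title": "Homework hacks", "category": "study skills"},
    {"title": "Knockout compilation", "category": "street fights"},
    {"title": "Match highlights", "category": "football"},
]

for video in rank_feed(profile, candidates):
    print(video["title"])
# After just two interactions, the fight video outranks everything else.
```

Run it and the fight clip comes first – much like the pattern the BBC's fictional teenage accounts experienced, just at a vastly bigger scale and speed.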

The Attention Economy

According to the Youth Endowment Fund, at the heart of this issue lies the 'attention economy' – a system where user engagement is the primary currency. Social media platforms are designed to keep users scrolling, and unfortunately, violent content often proves to be highly engaging. This creates an incentive for algorithms to promote such material, especially to demographics deemed more likely to interact with it.
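
To see why that incentive matters, here's another small, entirely hypothetical Python example. If a feed is ranked purely by how long people tend to keep watching, whatever holds attention wins, regardless of what it is; the "safety-adjusted" ranking below is our own invention (not any platform's real policy), included only to show how differently the same videos rank once engagement isn't the only thing being measured.

```python
# Illustrative only: the watch-time figures and the "safety penalty"
# are invented for this guide, not taken from any real platform.

videos = [
    {"title": "Revision tips",     "avg_watch_seconds": 9,  "flagged_violent": False},
    {"title": "Street fight clip", "avg_watch_seconds": 27, "flagged_violent": True},
    {"title": "Skate trick",       "avg_watch_seconds": 14, "flagged_violent": False},
]

# Pure attention economy: rank only by how long people keep watching.
by_engagement = sorted(videos, key=lambda v: v["avg_watch_seconds"], reverse=True)

# A hypothetical adjusted ranking that heavily discounts flagged content.
def adjusted_score(v):
    penalty = 0.25 if v["flagged_violent"] else 1.0
    return v["avg_watch_seconds"] * penalty

by_adjusted = sorted(videos, key=adjusted_score, reverse=True)

print([v["title"] for v in by_engagement])  # the fight clip wins on raw engagement
print([v["title"] for v in by_adjusted])    # it drops once safety is weighed in
```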

Why are boys targeted with violent content?

According to TikTok, their algorithm is not informed of the gender of the person using the app. But when you sign up to the app, you might see a screen that asks ‘What are you interested in?’, with icons popping up like ‘Fashion’, ‘Funny videos’, ‘Makeup’, ‘Dancing’, ‘Cars’ or ‘Gaming’.

When you pick your interests, those choices can end up dividing along gender lines anyway. So if someone picks fashion, for example, the app will show them a flood of videos about fashion.

Pick hobbies or interests that are popular with boys, and you'll be shown the videos that other users of the same age and with the same interests are watching and engaging with.
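
As a rough picture of how those sign-up choices shape a brand-new feed, here's one more tiny, hypothetical Python sketch. The interest names and "trending" lists are invented, but the idea is the one described above: picking interests pre-fills your profile before you've watched a single video, so two new accounts with different picks start from completely different places.

```python
# Toy illustration: interest picks at sign-up pre-fill a profile before
# any videos have been watched. Categories and titles are invented.

def seed_profile(selected_interests):
    # Every picked interest starts with the same weight; everything else is zero.
    return {interest: 1.0 for interest in selected_interests}

def first_feed(profile, trending_by_category):
    # Show whatever is trending among other users in the picked categories.
    feed = []
    for category in profile:
        feed.extend(trending_by_category.get(category, []))
    return feed

trending_by_category = {
    "Gaming":  ["Speedrun fails", "New release review"],
    "Cars":    ["Drift compilation", "Engine swap build"],
    "Fashion": ["Outfit ideas", "Thrift haul"],
}

print(first_feed(seed_profile(["Gaming", "Cars"]), trending_by_category))
print(first_feed(seed_profile(["Fashion"]), trending_by_category))
# Two brand-new accounts already see completely different feeds.
```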

The targeting of boys with violent content isn't random. It's a reflection of deeply ingrained societal expectations and gender norms. Research has shown that boys, on average, tend to have lower levels of empathy compared to girls. This predisposition, combined with exposure to violent media, can create a dangerous feedback loop.

The Consequences

According to the UK Council for Internet Safety (UKCIS), the content young people regularly see online plays a significant role in shaping their attitudes and behaviours¹. Think about it like this: if you're constantly being shown videos of fights and aggressive behaviour, it starts to feel... normal.

Here's what research tells us:

The "Is This Normal?” Effect: The American Academy of Pediatrics found that repeated exposure to violent content can lead to young people seeing aggression as a standard way to solve problems. We've had students tell us they thought everyone was seeing the same content they were – but that's not true at all!

The Pressure Cooker: Research from the Oxford Internet Institute at the University of Oxford shows that social media algorithms can create 'echo chambers' that reinforce certain behaviours and attitudes³. For example, when algorithms push violent content, they can create the idea that being "strong" means being aggressive.

The Empathy Gap: A study in the Journal of Educational Psychology found that excessive exposure to violent content can affect young people's ability to empathise with others. We've seen how peer support helps build emotional understanding – we can imagine it's hard to do that when your feed is telling you the opposite!


What can schools do?

PSHE (personal, social, health and economic education) has become a critical tool for fostering digital wellbeing, helping students understand and manage the unique challenges of the online world. Through lessons in digital literacy, cyber safety, and resilience, PSHE can empower students to make informed choices, protect their mental health, and use the internet in a safe and positive way.

1. Digital Wellbeing

According to a study published in the Journal of Media Literacy Education, students who receive structured media literacy education are better equipped to question the validity of online content, including content pushed by violence-promoting algorithms.

One of Peerscroller’s key categories is Digital Wellbeing, with videos exploring the validity of online content and how algorithms work, which can provide a starting point for classroom engagement. Facilitating discussions around these videos can encourage students to share their own experiences, ask questions, and develop a shared understanding of the challenges and opportunities presented by the digital world.

2. Health and Wellbeing

Research published by The Collaborative for Academic, Social, and Emotional Learning (CASEL) found that social and emotional learning (SEL) can reduce aggression and improve emotional resilience in young people. PSHE can help students develop emotional regulation and improve their mental wellbeing, skills that are essential for navigating the challenging digital landscape.

Our Health and Wellbeing category provides appropriate strategies for emotional regulation: we have videos that guide students through techniques like breathing exercises, mindfulness practices, and journaling, empowering them to manage their emotional responses to online challenges.

Schools can also:

  • Teach strategies for resolving conflicts without resorting to aggression.

  • Use role-playing and group activities to foster empathy and understanding.

  • Provide access to counsellors and mental health resources for students struggling with emotional challenges related to violent content.

3. Encourage open dialogue

Creating an environment where students feel comfortable discussing their experiences can be very helpful! It helps students process their feelings and promotes understanding among peers. Research from Common Sense Media emphasises the importance of fostering open discussions with students about what they see online, enabling them to critically reflect on the violence they encounter.

  • Allow students to voice their concerns or experiences anonymously, fostering an environment of trust.

  • Host workshops for parents to discuss the impact of social media and how they can support their children at home.


What can parents do?

As a parent, you can play a crucial role in shaping your child's understanding and interaction with social media. By taking proactive steps, you can help your children navigate the digital landscape more safely and responsibly. Here’s our advice:

1. Understand your child’s digital world

Familiarising yourself with the social media platforms your children use, including how these platforms operate and what content is shown, can be a great start.

  • Research the potential dangers of social media and understand how you can support your child. The NSPCC and the UK Safer Internet Centre are two organisations offering free webinars and resources on social media for parents.

  • Understand privacy settings and parental controls.

  • Speak to staff at school and see what resources they have available on this topic. Working together is key!

2. Establish open communication

Creating a safe space for your child to communicate honestly with you is vital. Encourage your child to share what they encounter, including any violent or distressing content.

  • Regularly ask your child to talk about their online activities and feelings.

  • Listen without judgement when they share their thoughts or experiences.

  • Instead of yes-or-no questions, ask open-ended questions that encourage deeper conversation.

3. Set clear boundaries and guidelines

You can create and communicate clear guidelines for social media use, including time limits, acceptable content, and privacy settings. This can especially help boys understand expectations and navigate social media with more responsibility.

4. Encourage healthy ‘offline’ activities

And finally, encourage young people to engage in offline activities such as sports, hobbies, and family time to create a balance between their online and offline worlds. These activities can provide positive outlets for energy and emotions, reducing the impact of violent content.

  • Plan regular family activities like hiking, visiting museums, or playing sports together.

  • Encourage them to join clubs or pick up hobbies like art, music, or coding.

  • Promote participation in sports or outdoor activities to release pent-up energy.


A Call to Action

Protecting young people from violent content driven by algorithms is challenging. However, by educating ourselves and them, we can reduce the impact. This needs cooperation from educators, parents, lawmakers, and the tech industry. By raising awareness, encouraging critical thinking, and promoting responsible technology, we can make the online space safer for our children. Let's work together to ensure that the algorithms shaping our young people’s digital worlds promote growth, empathy, and positive masculinity – not violence.

Our sources:

https://saferinternet.org.uk/blog/safer-internet-day-press-release-2021

https://www.bbc.co.uk/news/articles/c4gdqzxypdzo

https://www.ons.gov.uk/peoplepopulationandcommunity/wellbeing/bulletins/youngpeopleswellbeingintheuk/2020

https://youthendowmentfund.org.uk/violence-on-social-media-the-online-fight-for-our-childrens-attention/

https://www.bmj.com/content/310/6975/273

https://www.commonsensemedia.org/

https://pubmed.ncbi.nlm.nih.gov/11694708/

https://www.oii.ox.ac.uk/news-events/how-social-media-echo-chambers-emerge-and-why-all-your-friends-think-trump-will-lose/

https://www.apa.org/pubs/journals/edu

https://www.fosi.org/good-digital-parenting/how-to-talk-to-your-kids-about-social-media-algorithms

https://support.google.com/googleplay/answer/1075738?hl=en-GB

https://digitalcommons.uri.edu/jmle/vol13/iss3/8/

https://casel.org/fundamentals-of-sel/what-does-the-research-say/

https://www.commonsensemedia.org/sites/default/files/research/report/media-and-violence-research-brief-2013.pdf

https://www.nea.org/professional-excellence/just-equitable-schools/core-values/preventing-violence-bullying

https://www.internetmatters.org/parental-controls/social-media/

https://saferinternet.org.uk/blog/free-internet-safety-resources-for-parents