Exploring the Impact of Social Media on Adolescents: Insights from Netflix's 'Adolescence'

The Netflix series Adolescence has ignited crucial discussions about the profound influence of social media on today's youth. This article highlights the urgent need for responsible, psychologically informed content moderation. Drawing on real-world data and on theories from social and emotional psychology, it shows how advanced moderation technologies, like those built into our social media solution, can help prevent harm by identifying nuanced patterns of toxicity, decoding hidden language, and supporting healthier digital environments.

A scene from the series Adolescence (Episode 3) with Jamie standing in front of Briony Ariston, the child psychologist.
Written by Amanda Brooks, Head of Operations; Maria Tramontano, Advisory Psychologist; Dr. Suvajit Majumder, Principal AI Engineer; and Alessandra Balliana, Head of Strategy & Design

The role of social media in 'Adolescence'

The story centres on 13-year-old Jamie Miller (Owen Cooper) and his arrest in connection with the tragic murder of his classmate, Katie Leonard. Jamie's descent into harmful online subcultures highlights the potential dangers within the digital realm. This fictional storyline mirrors real-world concerns and portrays how social media can become a breeding ground for detrimental ideologies — including those propagated within the manosphere — where misogyny and toxic masculinity are normalised, profoundly impacting impressionable minds. Jamie’s radicalisation emphasises the urgent need for awareness and intervention in the digital lives of young individuals.​

Real-world implications: social media's impact on UK adolescents

The concerns raised in Adolescence are not confined to fiction. In the UK, the prevalence of social media among high school-aged children has been linked to various issues, including cyberbullying and associated crimes. According to the Anti-Bullying Alliance, four in ten children aged 8-17 (39%) have experienced bullying, either online or offline. Among these, bullying was more likely to occur on a device (84%) than face-to-face (61%). ​

Furthermore, the Office for National Statistics reported that in the year ending March 2023, 67.5% of children who experienced in-person bullying and 64.7% who experienced online bullying indicated that the perpetrators were from their school. This data highlights the pervasive nature of bullying within educational environments and its extension into the digital sphere. ​  

Psychological perspectives on social media and adolescence

Adolescence is a period of profound and delicate change, in which the individual leaves childhood behind and begins the journey toward building an adult identity. It is a phase of exploration in which the bond with the family loosens to make space for the influence of peers, who become a fundamental mirror through which adolescents can recognise and define themselves.

A group of teenagers sitting on wooden stairs

This is a demanding task for the adolescent brain, which undergoes intense restructuring involving three key areas:

  • The limbic system – the centre of emotions – is particularly active, which explains the emotional intensity, reactivity, and impulsiveness typical of this stage of life.
  • The prefrontal cortex – responsible for control, judgement, and planning – is still developing and does not yet allow for a full capacity to assess the consequences of one's actions.
  • The reward system – hypersensitive to dopamine – makes adolescents more exposed to the pursuit of novelty, immediate gratification, and risky behaviours.

The disconnect between emotion and reason makes adolescents particularly vulnerable to external influences. Today, social networks are one of the primary spaces where identity is formed. Adolescents share their inner world online, seek validation from peers, and receive constant feedback that helps shape their sense of self.  

Young people have developed their own language on social media and share a common culture that makes these environments highly meaningful in their lives. However, social platforms also represent spaces where it is easier to disengage from personal responsibility, where communication is mediated by a screen and no longer face-to-face.

This leads to the perception that one's actions have fewer negative consequences, a phenomenon psychologists describe as the online disinhibition effect.

In an unmoderated digital environment, constant exposure to hate, aggressive comments, and unrealistic models can have serious consequences. Receiving insults about one's body or identity can generate deep-seated feelings of inadequacy and damage self-esteem, increasing the risk of social withdrawal or contributing to the development of eating disorders, depression, or social anxiety.

Spending time in toxic virtual environments may normalise aggression and push adolescents to conform to harmful models simply to feel accepted, resulting in dysfunctional affiliations driven by imitation and the need for belonging.

In this context, social media moderation plays a crucial role. It is not merely a technical function, but a collective act of care. A toxic online environment can increase the risk of harm, especially when combined with other personal or environmental challenges. At the same time, social media can play a supportive role in promoting adolescent mental health.

Those who design and manage digital spaces hold educational responsibility. Providing clear boundaries, emotional containment, and protection means making a concrete contribution to the psychological development of a generation still in formation.

This objective makes the integration of solid psychological foundations into moderation systems essential. Online behaviour is not separate from real life: it reflects emotions, relationships, social dynamics, and individual vulnerabilities. For this reason, an effective moderation system must consider not only words, but also context, implicit meaning, and the psychological effects that content can produce.

In our content moderation solutions, the integration of validated psychological models within the moderation system is a strategic choice. We rely on recognized psychological theories, including emotion psychology and social psychology, to define metrics, build datasets, and validate labelling.

From a technical standpoint, this approach translates into concrete benefits:

  • Improved classification performance, thanks to more granular and psychologically grounded categories that go beyond keyword detection.
  • Reduction of false positives and false negatives, through combined analysis of content, tone, context, and intent, supported by up-to-date psychological frameworks.
  • Advanced contextual toxicity detection, capable of distinguishing between sarcasm, venting, real threats, or emotional distress.
  • Cross-context adaptability—psychological knowledge enables the system to be calibrated for different age groups, at-risk populations, or specific digital environments (e.g., social platforms, gaming, corporate settings).
  • Predictive capabilities, through the analysis of behavioral patterns over time which helps detect toxic dynamics before they escalate.
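To make the first two points more concrete, the sketch below is a purely illustrative toy, not our production system: the axis names, lexicon entries, and weights are all invented for this example. It shows how scoring a message on several psychologically grounded axes, rather than emitting a single "toxic" flag, gives downstream policy something more granular to act on:

```python
from dataclasses import dataclass

# Hypothetical, psychologically grounded label taxonomy (illustrative only):
# each message is scored on separate axes instead of one "toxic" flag.
AXES = ("insult", "threat", "self_harm_risk", "exclusion")

@dataclass
class Signal:
    axis: str
    weight: float

# Toy lexicon mapping a phrase to one or more axis signals. A production
# system would use trained models; this only illustrates the output shape.
LEXICON = {
    "nobody likes you": [Signal("exclusion", 0.9), Signal("insult", 0.4)],
    "watch your back": [Signal("threat", 0.8)],
}

def score(message: str) -> dict:
    """Return a per-axis score for the message (max over matched signals)."""
    scores = {axis: 0.0 for axis in AXES}
    lowered = message.lower()
    for pattern, signals in LEXICON.items():
        if pattern in lowered:
            for s in signals:
                scores[s.axis] = max(scores[s.axis], s.weight)
    return scores
```

Because the same phrase can contribute to several axes at once, a policy layer can respond differently to high "exclusion" scores (peer support prompts) than to high "threat" scores (escalation to human review).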

Thanks to this approach, moderation becomes a preventive, adaptive, and user-centered strategy. The system does not merely "contain harm" but actively contributes to the creation of healthier, more respectful, and psychologically sustainable digital environments. Only through real collaboration between technology and psychological science can we effectively address the complexity of human behavior online, protecting individuals, both adolescents and adults and fostering a digital landscape that is truly developmental, inclusive, and safe. In doing so, we can develop solutions aligned with the principles of ethics, prevention, and social responsibility.

Decoding emojis and slang: the hidden language of social media

The series also sheds light on the use of emojis and slang as covert communication tools among teenagers. In Adolescence, characters employ encoded language to convey messages, sometimes with harmful intentions. This mirrors real-life situations where adolescents use slang, emojis, or shorthand to obscure meaning, making it increasingly difficult for parents and educators to interpret online interactions.  


A group of emojis used by adolescents to send encoded messages


Freedom2hear’s text and multimedia moderation systems are designed to adapt to this fast-evolving digital language, using AI to build and update a contextual understanding of both text and multimedia in real-time.
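As a minimal illustration of what context-aware decoding means, the sketch below flags entries from a coded-language lexicon and escalates severity when a message targets someone directly. Every lexicon entry and severity label here is invented for the example; real coded meanings are community-specific and shift quickly, which is why a production system learns them from moderated data rather than hard-coding them:

```python
import re

# Invented, illustrative lexicon: token -> analyst note. Not a real mapping.
CODED_TERMS = {
    "🫘": "possible coded reference, meaning depends on context",
    "💯": "possible intensifier in coded exchanges",
}

def annotate(message: str) -> list[str]:
    """Flag lexicon hits, escalating severity when the message addresses
    a person directly (a crude stand-in for contextual understanding)."""
    targeted = bool(re.search(r"\byou(r)?\b", message, re.IGNORECASE))
    flags = []
    for token, note in CODED_TERMS.items():
        if token in message:
            severity = "review" if targeted else "monitor"
            flags.append(f"{token}: {note} [{severity}]")
    return flags
```

The key design point is that the same emoji yields different handling depending on its surroundings, which is what separates contextual moderation from simple keyword blocking.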

AI-based anomaly detection and pattern recognition are key to identifying and tracking emerging risks in social media communications. At Freedom2hear, we leverage state-of-the-art advancements in AI technologies and tools to develop our own foundational AI systems and deliver the best accuracy for our clients. We achieve this by constantly monitoring and iterating on our existing models and pipelines. With a feedback-driven approach, we ensure our models evolve alongside both technology and emerging social behaviour.
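One simple way to picture "detecting toxic dynamics before they escalate" is to compare a user's recent toxicity scores against their own rolling baseline. The sketch below is a toy illustration under assumed parameters (window size, z-score threshold), not Freedom2hear's actual detection pipeline:

```python
from collections import deque
from statistics import mean, stdev

class EscalationDetector:
    """Flags when recent toxicity scores drift well above a user's own
    historical baseline (a toy stand-in for pattern recognition over time)."""

    def __init__(self, window: int = 20, recent: int = 5, z: float = 2.0):
        self.history = deque(maxlen=window)  # rolling score history
        self.recent_n = recent               # size of the "recent" slice
        self.z = z                           # escalation threshold (z-score)

    def observe(self, score: float) -> bool:
        """Record a new score; return True if escalation is detected."""
        self.history.append(score)
        if len(self.history) < self.history.maxlen:
            return False  # not enough history to form a baseline yet
        baseline = list(self.history)[:-self.recent_n]
        recent = list(self.history)[-self.recent_n:]
        mu, sigma = mean(baseline), stdev(baseline)
        return sigma > 0 and (mean(recent) - mu) / sigma > self.z
```

Because the baseline is per-user, the same absolute score can be routine for one account and an anomaly for another, which is the essence of behavioural rather than content-only detection.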

Conclusion

Adolescence serves as a stark reminder of the complexities and potential dangers inherent in the digital landscape for not just today's youth, but for all users. By fostering open dialogues, promoting digital literacy, and implementing advanced content moderation technologies like those developed on our Freedom2hear Social solution, we can work towards creating a safer, more inclusive online environment. It is imperative that we continue to address these challenges collaboratively, ensuring that social media remains a space for positive connection and growth for all.

Sources

Prevalence of online bullying, Anti-Bullying Alliance

Bullying and online experiences among children in England and Wales: year ending March 2023, Office for National Statistics

Dustin Wahlstrom, Tonya White, Monica Luciana, 2021, “Neurobehavioral evidence for changes in dopamine system activity during adolescence”, National Library of Medicine

Ilaria Cataldo, Bruno Lepri, Michelle Jin Yee Neoh, Gianluca Esposito, 2021, “Social Media Usage and Development of Psychiatric Disorders in Childhood and Adolescence: A Review”, Frontiers
