My Thoughts on Netflix's Adolescence and Social Media

Adolescence is the British crime drama everyone’s talking about. It explores toxic masculinity, incel culture, and online radicalisation, sparking urgent discussions about social media’s role in shaping modern boyhood. Can we strike a balance between online freedom and safeguarding against harm?

If you're not already immersed in the world of Netflix, Adolescence is the show everyone is talking about at the moment. For those who don't know, it's a British crime drama miniseries that follows 13-year-old Jamie Miller, who is arrested for the murder of his classmate, Katie Leonard.

The show delves into the influences of toxic masculinity and online subcultures on youth, highlighting themes of incel (involuntary celibate) culture and gender-based violence. As a result, the series has sparked discussions about societal issues related to modern boyhood and the effects of online radicalisation.

Now, I'm not one for reviewing TV shows on this blog, but there was something about this show that compelled me to write this post. Whilst watching, the one thought (shared, I'm sure, by many parents) that kept racing through my mind was, "What if this was my son?"

My son isn't 13 (he's only a toddler), but it got me thinking more and more about what kind of world we are raising him in and the impact of role models, education, and social media, the latter of which I'm very familiar with.

Social media plays a leading role here, as platforms such as Instagram (the culprit platform in the series) foster a complex, highly emotive environment in which teens, particularly teenage boys, are exposed to extremist content and negative influencers such as Andrew Tate.

I read a very interesting article by Dr Justin Coulson of Happy Families, which went around the family group chat discussing the series. Dr Coulson made an insightful comment, stating:

“The tech companies are accumulating the greatest fortunes ever built in the history of the world while denying responsibility for the toxic spaces they’ve created.”

I would like to take this a step further by pointing out that this is a multifaceted problem. While platforms are ultimately responsible (and should be held to account), several components need to be considered. These include community dynamics (the people and demographics using the platform), platform features (e.g. the ability to repost and comment, the “algorithm”, etc.), and affordances (how users make use of these features).

All these factors need to be carefully considered before pointing the finger solely at the platform. While social media companies have a duty to mitigate harm, the reality is that this issue is deeply entrenched in a web of societal, technological, and individual factors. Addressing the problem effectively requires a holistic approach that considers not only how platforms function but also how users interact with them and the broader cultural forces at play.

I’ve mentioned on numerous occasions my concerns about platforms such as X and the rise of decentralised social media. The promise of decentralisation is that it can remove the monopolistic control of tech giants and promote freedom of expression. However, with that freedom comes significant challenges—chief among them, the difficulty of moderation and the potential for harmful content to spread unchecked. That said, decentralisation also presents opportunities for more community-driven moderation, where smaller, self-governing groups can implement stricter content policies suited to their audiences. By distributing moderation responsibilities, it may be possible to limit the spread of extremist content more effectively than relying on a single centralised authority.

This leads us to the critical question: Can we implement the necessary safeguards to ensure that social media does not continue to be a breeding ground for extremism and radicalisation? What measures can be taken to reduce the risks while preserving the core benefits of these platforms?

Striking the right balance will be key. Effective solutions may include improved content moderation (perhaps with AI assistance, but always with human oversight), stronger digital literacy education for young users, and better parental guidance tools. Social media companies must take greater responsibility, but parents, educators, and policymakers also play a crucial role in shaping the digital environment that young people navigate daily.

Ultimately, if we want a healthier future for social media, it will require a concerted effort from all stakeholders. Encouraging critical thinking, fostering positive online communities, and ensuring that tech companies uphold their ethical responsibilities could help pave the way towards a more responsible and safer digital landscape for the next generation.