- Louise Holly, policy and research coordinator
- @louise__holly
Across the globe, countries are grappling with how to shield children from the dangers of the digital world. One idea is to prevent children from accessing the most harmful digital spaces. Australia has just introduced a bill to parliament proposing to ban children under 16 from using social media.1 If passed, it would make Australia the first country in the world to have such strict legislation. Mitigating the physical and mental health risks posed by digital platforms is an urgent public health priority that governments need to engage with, but limiting children’s online participation will not tackle the root causes of these harms.
Australia’s proposed law would make it illegal for anyone under 16 to have a social media account, with no exemptions for parental consent. Companies such as Meta, ByteDance, and Google would be responsible for checking that users are not under 16 and could face penalties if they fail to comply.2 If the law is passed, it wouldn’t take effect for over a year, giving regulators and companies time to implement age verification systems and resolve concerns about privacy and data protection.
Other countries are also looking to take a stand against social media. France, for example, is considering limiting social media use for children under 15.3 The UK is considering social media restrictions on under 16s after campaigning by parents’ groups.4 In India, some are calling for the minimum age to be raised to 18.5
This wave of proposed legislation signals a growing consensus that children’s digital lives need more thoughtful guidance and safeguards. The covid-19 pandemic highlighted the importance of online connection, but as children spend more time in front of screens the negative effects on their health and wellbeing are becoming harder to ignore.
Excessive use of smartphones, gaming consoles, and other tech devices disrupts children’s sleep, reduces physical activity, and leads to physical health problems such as eye strain.6 7 8 The idealised nature of online content and the pull of 24/7 social interactions also contribute to a decline in mental health.9 Platforms that focus on images and videos place immense pressure on young people to meet unrealistic beauty standards, seek constant validation, and stay connected at all times.
Children’s exposure to online bullying, exploitation, abuse, and misinformation is rising.10 Influencers and unregulated digital marketing also promote unhealthy foods, harmful products, and gambling to younger audiences.11 While adults are exposed to these dangers too, children’s developing brains are particularly vulnerable to the impulsive behaviours and social rewards encouraged by these platforms through features such as infinite scrolling, notifications, and tailored content feeds.12
Distraction from the bigger picture
Social media is being singled out by policy makers because it is where children spend much of their time online. In the UK, for example, half of children under 12 use social media.13 Teens spend an average of 4.8 hours on social media a day.14 Exerting greater control over the social media giants is also an appealing move for politicians who wish to rein in the power held by tech chief executives and their companies. In principle, government intervention to limit children’s access to social media is necessary and long overdue; social media is not designed for children, and protective measures introduced by the platforms have not gone far enough. Government action has been crucial in curbing other threats to public health—such as tobacco use, road safety, and obesity—and the same bold approach is needed here.15
However, the proposal to simply ban children from social media will not automatically make these spaces safer. Children who find ways to bypass these restrictions (which they will) won’t be protected. Neither will young people who are exposed to harm while gaming or browsing other parts of the internet. A narrow focus on social media bans distracts policy makers from confronting the addictive features, algorithm driven content, and commercial incentives that dictate what children see, keep them hooked, and expose them to harmful content. Broader legislation, such as the UK’s Online Safety Act and the EU’s plans to tackle addictive and harmful design features, offers more promising approaches to resolving these underlying problems.16 17
Children’s access to and use of digital technologies is a determinant of their health and wellbeing.18 As with other public health issues, a range of interventions targeted at individuals, communities, and populations is needed to encourage healthy digital practices and reduce risks. Limiting screen time and delaying children’s access to digital technologies are important, but they must be part of a larger strategy that includes investing in alternative, health promoting activities for young people, such as playgrounds, sporting facilities, and youth clubs.
To enable children to make informed decisions about social media and to use digital tools in ways that enhance their wellbeing, we have to equip them with the necessary knowledge and skills.19 We also need to create a supportive environment where healthy tech habits are modelled by adults. Children learn by example, and if they see the adults around them constantly on their devices, they are less likely to follow advice on healthy technology use.
Children have a right to benefit from digital advancements. Instead of focusing solely on restricting children’s access, we need to make digital spaces safer and healthier. This involves fundamentally redesigning how social media and other digital platforms operate—so that they can serve the needs of young users without putting them at risk.
Footnotes
- Competing interests: None declared.
- Provenance and peer review: Commissioned; not externally peer reviewed.
- AI use: LH used ChatGPT to proofread the draft and suggest some cuts but not to write anything.