
LEGAL STRATEGIES FOR PROTECTING INDIAN MINORS AGAINST HARMFUL SOCIAL MEDIA CONTENT


INTRODUCTION

Social media apps play a vital role in shaping the minds of children, so it is essential to provide children with high-quality content that supports their long-term development and eases mental stress. A survey by the Child Mind Institute found that children who use social media apps such as Instagram, Facebook or Snapchat report increased feelings of depression, anxiety, poor body image and loneliness.[1] In India, 60% of children between the ages of 9 and 17 spend over three hours daily on social media.[2] With the well-being and welfare of children in mind, India should take strict steps to ensure the digital protection of children, as the United Kingdom and Australia have done.

AUSTRALIAN LEGAL PROTECTIONS FOR MINORS AGAINST HARMFUL ONLINE CONTENT 

Under the Online Safety Amendment (Social Media Minimum Age) Act 2024 of Australia,[3] children below sixteen years of age are legally restricted from creating or maintaining accounts on social media apps. Under Sections 63D and 4 of the Act, the legal burden of enforcing this restriction falls entirely on social media platforms, not on the parents or guardians of the child.[4] Companies are required to follow guidelines issued by the eSafety Commissioner to keep children under 16 off their platforms. They have to install a live selfie feature to estimate users' ages and to verify government-issued or banking ID, and they can also install behavioural tracking features, meaning the app uses AI to analyse signals such as typing speed or user interests to detect whether a user is lying about their age. A company that fails to comply with these rules is liable to pay substantial civil penalties.[5]

UK STATUTORY PROTECTIONS FOR MINORS AGAINST HARMFUL ONLINE CONTENT 

Under the Online Safety Act 2023, whose child safety duties have been enforced since July 2025,[6] social media platforms have a duty to protect children from harmful and toxic content such as violence, self-harm, pornography or content encouraging eating disorders. To ensure this, companies are required to use effective modes of age verification, such as facial age estimation through AI, credit card checks or photo ID, and Ofcom investigates non-compliance by companies under Sections 12 and 26 of the Act.[7] Every major platform is required to publish a children's risk assessment determining the level of risk that users face from various kinds of illegal content on its app, and must also outline the actions taken to mitigate those risks, as per Section 9(2) of the Act.[8]

INDIAN LEGAL FRAMEWORK FOR MINORS AGAINST HARMFUL ONLINE CONTENT

According to the Digital Personal Data Protection Act 2023 (DPDP Act), social media platforms must secure verifiable parental or guardian consent before processing any data of children. The Act restricts the tracking of children's online behaviour and the monitoring of their activities, and it also bars third parties, such as companies or other entities, from serving targeted advertisements to children, thereby strengthening protections for minors' data against manipulative online practices. Platforms that fail to comply with these rules are liable for a fine of up to ₹200 crore under Section 9 of the Act.[9]

LEGAL REFORMS TO PROTECT MINORS AGAINST HARMFUL ONLINE CONTENT IN INDIA

Section 9 of the DPDP Act 2023 emphasises parental consent rather than imposing a duty of care on platforms for online content. Yet granting consent does not ensure that parents remain informed about the content their child consumes, especially when children use social media through their parents’ accounts in their parents’ absence.

The Section also does not explain the categories of harmful and toxic content, nor does it properly define what specific kinds of content constitute unhealthy material.

India should also make it mandatory for social media apps to install a live selfie feature for AI-based age estimation at the start of the app, as Australia and the UK have done, so that children cannot use social media platforms through their parents’ accounts in their absence.

Social media platforms should install a behaviour-tracking system that analyses signals such as typing speed and user interests on every user account, to ensure that minors cannot use social media apps through their parents’ accounts or bypass facial verification using fake photos or other technology.

India should also find a way to prohibit children from using VPNs to bypass restrictions.

Following the example of the UK,[10] India should also make it a legal requirement for social media platforms to publish children’s risk assessments. These reports must determine the level of risk users face from unhealthy or harmful content on social media apps and outline the specific actions taken to mitigate those risks.

CONCLUSION

India should learn from Australia and the United Kingdom and improve its legal framework to protect minors against harmful social media content. Unhealthy online content can adversely affect the minds of children, causing poor body image, depression and loneliness, all of which harm their mental health. Because children naturally try to imitate the actions of others, exposing them to adult or violent content can rob them of their childhood or foster aggressive behaviour. Children are our future generation; exposing them to educational and scientific content increases their productivity and fosters their development far more effectively than exposing them to adult or violent content. These children are our future lawyers, doctors, scientists, civil servants, voters, government officers and lawmakers; therefore, we should do our best to protect their childhood and give them a bright future.

Author(s) Name:  Dhani Devaangan (Hidayatullah National Law University, Raipur)

References:

[1] Rachel Ehmke, ‘How using social media affects teenagers’ (Child Mind Institute, 1 December 2025) <https://childmind.org/article/how-using-social-media-affects-teenagers/> accessed 24 December 2025

[2] The survey team, ‘60% children spend 3 hours a day on social media: study’ (The Times of India, 23 September 2023) <https://timesofindia.indiatimes.com/city/mumbai/60-children-spend-3-hours-a-day-on-social-media-study/articleshow/103878956.cms> accessed 25 December 2025

[3] Online Safety Amendment (Social Media Minimum Age) Act 2024

[4] ibid, ss 63D and 4

[5] eSafety Commissioner, ‘Regulatory guidelines’ (Australian Government, 10 December 2025) <https://www.esafety.gov.au/industry/regulatory-guidance> accessed 25 December 2025

[6] Online Safety Act 2023 (UK)

[7] Online Safety Act 2023, ss 12 and 26

[8] Online Safety Act 2023, s 9(2)

[9] Digital Personal Data Protection Act 2023 (DPDP), s 9

[10] The Rt Hon Peter Kyle MP and Department for Science, Innovation and Technology, ‘Keeping children safe online: Changes to the Online Safety Act explained’ (GOV.UK, 1 August 2025) <https://www.gov.uk/government/news/keeping-children-safe-online-changes-to-the-online-safety-act-explained> accessed 25 December 2025