Singapore Passes Landmark Online Safety Law Targeting Deepfakes, Doxxing, and Digital Harassment

Singapore has taken a decisive step toward regulating digital harm and platform accountability with the passage of the Online Safety (Relief and Accountability) Act on November 5, 2025. The new law, described by policymakers as one of the world’s most stringent frameworks for online safety, seeks to combat deepfakes, doxxing, online harassment, and other forms of digital abuse that have proliferated across the internet.

The Act establishes a new Online Safety Commission, an independent body vested with powers to order the removal of harmful content, compel disclosure of anonymous offenders, and mandate takedowns within 24 hours of a complaint being verified. Victims will now be able to report violations directly to the Commission, which can direct online platforms to take immediate action.

“Online harm is no longer an abstract concern—it affects real lives, reputations, and communities,” said Singapore’s Minister for Digital Development and Information, Josephine Teo, while presenting the Bill. “This law provides both a clear deterrent and a swift remedy for victims of online abuse.”

A Shift Toward Platform Accountability

At its core, the Online Safety (Relief and Accountability) Act marks a paradigm shift in how governments view social media regulation. Rather than relying solely on criminal penalties or after-the-fact civil remedies, Singapore has introduced a preventive and remedial mechanism that directly targets the infrastructure of harm—digital platforms themselves.

The new Online Safety Commission will have the authority to:

  • Order the removal or restriction of harmful content within 24 hours.

  • Direct online platforms to reveal the identities of anonymous perpetrators.

  • Issue compliance orders against platforms that repeatedly fail to act.

  • Provide a recourse mechanism for victims to seek swift takedown of abusive content.

This model effectively creates a quasi-judicial regulatory structure for the internet—one that balances victim relief, public accountability, and procedural oversight.

By placing the burden of rapid compliance on platforms, Singapore signals a new phase of regulatory assertiveness that many governments, including India, have been debating but hesitating to implement.

Why This Matters in the Global Context

Globally, the Act is being watched as a potential template for modern internet governance. It echoes the regulatory ambitions seen in the European Union’s Digital Services Act (DSA) and the United Kingdom’s Online Safety Act, but goes a step further by integrating real-time response obligations.

The Act’s focus on deepfakes and doxxing also reflects an evolving understanding of online harm—where manipulation and exposure of personal information can cause damage as severe as direct threats or defamation.

Countries across Asia are grappling with similar concerns. India, for instance, has been discussing amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, to expand intermediary liability and introduce user grievance mechanisms. However, Singapore’s approach—combining swift victim recourse with state oversight—is notably more centralized and interventionist.

How the Law Defines and Targets Digital Harm

The legislation adopts a broad but detailed definition of digital harm, encompassing:

  • Deepfakes: Digitally altered videos or images intended to mislead or defame.

  • Doxxing: Publishing private information without consent, leading to harassment or intimidation.

  • Online harassment and bullying: Content that causes emotional distress or public humiliation.

  • Hate speech and incitement: Communications promoting violence or discrimination.

Each category of harm has corresponding response protocols—from takedown orders to identity disclosure requests.

Platforms will be required to maintain internal compliance units, designate local points of contact, and submit annual transparency reports to the Commission detailing the volume of complaints, actions taken, and pending investigations.

The Enforcement Architecture: Commission and Compliance

Under the new law, the Online Safety Commission will serve as the central enforcement body. It will operate independently but report annually to Parliament.

Key powers include:

  • Direct Takedown Orders: Mandating removal of specific content identified as harmful within 24 hours.

  • Identity Disclosure Orders: Compelling platforms to provide information about users responsible for online abuse, even if pseudonymous or anonymous.

  • Platform Accountability Reviews: Conducting audits of major platforms to assess risk management and policy compliance.

Failure to comply can result in financial penalties and, in serious cases, temporary access restrictions on platforms in Singapore.

The Commission’s decisions can be appealed to the High Court, ensuring judicial oversight and procedural fairness.

Balancing Free Expression and Safety

One of the most debated aspects of the Act during parliamentary discussion was the balance between online safety and freedom of expression.

Civil society groups raised concerns that the broad discretion granted to the Commission could lead to overreach or censorship, particularly in cases involving satire, activism, or political commentary.

Minister Teo, however, emphasized that the law includes “strict safeguards and review mechanisms” to prevent misuse. Orders must be based on specific harm criteria, and the Commission’s proceedings are subject to appeal.

Legal experts note that the framework appears to lean toward victim protection over platform neutrality, a choice that reflects Singapore’s longstanding philosophy of prioritizing public order and social harmony in its digital policies.

Implications for Tech Companies

For major tech platforms operating in Singapore—such as Meta, Google, and TikTok—the new Act represents both a compliance challenge and an operational shift.

Companies will need to establish:

  • Dedicated legal and policy teams in Singapore for rapid response.

  • Real-time content moderation systems that can process government takedown requests.

  • Mechanisms for user verification and identity tracing, consistent with privacy laws.

Failure to comply could expose them to significant monetary fines and public compliance audits.

The Act also sends a clear message: passive moderation policies are no longer sufficient. Platforms will be held to a higher standard of proactive responsibility.

Lessons and Reflections for India

Singapore’s Online Safety Act arrives at a critical moment for India, which is currently reviewing its proposed Digital India Act, intended to replace the 25-year-old Information Technology Act, 2000.

While India’s regulatory focus has so far centered on data protection and intermediary liability, the Singaporean model could inspire a more victim-centric and enforcement-driven approach to online harm.

Three key lessons stand out for India’s policymakers:

  1. Centralized Oversight: The creation of a specialized Commission ensures consistency and expertise in enforcement, something India’s fragmented grievance mechanisms often lack.

  2. Swift Relief for Victims: The 24-hour takedown mandate prioritizes victim protection and minimizes prolonged exposure to harm.

  3. Transparency Obligations: Annual reporting and disclosure requirements create accountability not just for users, but also for platforms.

However, India’s scale and diversity present unique challenges. A centralized model could risk bureaucratic overload or selective enforcement unless backed by robust procedural safeguards and digital literacy initiatives.

A New Era of Digital Responsibility

The passage of the Online Safety (Relief and Accountability) Act marks a turning point in the global conversation on online harm. Singapore has moved beyond debate to implementation, setting an example of regulatory decisiveness in a domain where most governments are still playing catch-up.

Its model—combining swift enforcement, transparency, and victim empowerment—may influence upcoming reforms in jurisdictions from India to Australia.

 
 
 
