The Online Safety Act 2025 Is Finally Here: 10 Key Takeaways for Companies

With the new year upon us, and against the backdrop of rapidly advancing AI tools and their increasing deployment across social media platforms and websites, it is especially timely that the long-awaited Online Safety Act 2025 (“OSA 2025”) officially comes into force on 1 January 2026.
The commencement of the OSA 2025 is both timely and necessary. At the very start of the year, the digital ecosystem was confronted with one of the most disturbing and deeply unsettling online trends to date: random users were able to prompt an AI chatbot on one of the world’s largest social media platforms to digitally “undress” individuals using nothing more than a publicly available photograph. In many cases, victims were made to appear as if they were wearing tiny bikinis; in even more egregious instances, their images were digitally manipulated into explicit sexual positions.
Within hours, the trend went viral, and users of that specific social media platform moved rapidly from one account to another, indiscriminately targeting individuals whose photographs were accessible on the platform. With a single prompt, the AI system digitally generated altered images that stripped subjects of their dignity, autonomy, and humanity. Unsurprisingly, and profoundly troublingly, the overwhelming majority of victims were women and minors.
This is precisely the moment where the OSA 2025 is most needed, and where strong and decisive regulatory enforcement is critical to curb, deter, and address such dehumanising and outrageous conduct and behaviour in the digital space.
Given the relative newness of the legislation, and recognising that many organisations are still familiarising themselves with its scope and practical implications, this article sets out 10 key takeaways to assist organisations, particularly applications service providers and content applications service providers, in understanding how the OSA 2025 will affect their operations, governance frameworks, and compliance priorities.
Key Takeaway 1: Understanding the Legal Status and Architecture of the OSA 2025
The OSA 2025 officially comes into force on 1 January 2026. It is the principal legislation governing online safety in Malaysia and is supported by several pieces of subsidiary legislation, namely the Online Safety (Period) Regulations 2025, the Online Safety (Fees) Regulations 2025, the Online Safety (Form of Undertaking) Regulations 2025, and the Online Safety (Online Safety Appeal Tribunal) Regulations 2025.
These regulations form the core legislative framework at this stage. Given the rapidly growing importance of online safety in today’s digital environment, however, it is reasonable to expect that additional regulations, guidelines, standards, and related instruments will be introduced over time to complement, strengthen, and complete the broader online safety ecosystem.
Key Takeaway 2: Understanding Who the OSA 2025 Really Applies To
At its core, the OSA 2025 is primarily targeted at three categories of entities: (i) applications service providers, (ii) content applications service providers, and (iii) network service providers.
However, a careful reading of the OSA 2025 in its entirety reveals that the bulk of its provisions are principally concerned with applications service providers and content applications service providers, with comparatively limited application to network service providers.
Therefore, if your organisation falls within any of the above three categories, the OSA 2025 will undoubtedly become a key regulatory focus throughout 2026, particularly from an implementation, compliance, and risk-management perspective.
Key Takeaway 3: Understanding the Extra-Territorial Application of the OSA 2025
The third key takeaway is understanding the extra-territorial application of the OSA 2025. The OSA 2025 has a broad extra-territorial effect, as it applies both within and outside Malaysia. In practical terms, this means that the OSA 2025 applies even to persons outside Malaysia, so long as such persons provide any applications service, content applications service, or network service in a place within Malaysia and are licensees under the Communications and Multimedia Act 1998.
This addresses a long-standing concern in this space: a significant number of applications services and content applications services are, in fact, provided by companies operating outside Malaysia. The OSA 2025 squarely confronts this reality. With this extra-territorial reach, even where service providers are located outside Malaysia, compliance with the OSA 2025 remains mandatory if their services are made available within Malaysia and they fall within the relevant licensing framework.
Key Takeaway 4: Understanding Whether There Is a Duty to Implement Measures to Mitigate the Risk of Exposure to Harmful Content
The fourth key takeaway is the statutory duty to implement measures to mitigate the risk of exposure to harmful content.
The OSA 2025 is explicit that a licensed applications service provider and a licensed content applications service provider shall implement the specified measures to mitigate the risk of users being exposed to harmful content.
At the same time, the OSA 2025 provides a degree of flexibility: a licensed applications service provider or licensed content applications service provider may implement alternative measures, other than those specified, if it can prove to the satisfaction of the Commission (the Malaysian Communications and Multimedia Commission) that such alternative measures will better mitigate the risk of users being exposed to harmful content.
This makes it abundantly clear that, regardless of whether the licensed applications service provider and licensed content applications service provider choose to implement the specified measures or adopt alternative measures, the fundamental and overriding objective remains the same, which is to mitigate the risk of users being exposed to harmful content.
Key Takeaway 5: Understanding What Falls within the Category of Harmful Content
Following the fourth key takeaway, the natural next question is, of course, what would fall within the category of harmful content, and whether harmful content is even defined under the OSA 2025.
The answer is yes. The OSA 2025 provides a clear and explicit list of content categories that fall within the definition of harmful content. This clarity is particularly valuable in an era where societal norms and moral boundaries are increasingly subjective and contested. By articulating a defined scope, the legislation significantly reduces ambiguity and compliance uncertainty.
For ease of reference, the full list of harmful content extracted from the OSA 2025 is as follows (a short illustrative sketch follows the list):
1. Content on child sexual abuse material as provided for under section 4 of the Sexual Offences against Children Act 2017
2. Content on financial fraud
3. Obscene content, including content that may give rise to a feeling of disgust due to lewd portrayal which may offend a person’s sense of decency and modesty
4. Indecent content, including content which is profane in nature, improper, and against generally accepted behaviour or culture
5. Content that may cause harassment, distress, fear, or alarm by way of threatening, abusive, or insulting words, communication, or acts
6. Content that may incite violence or terrorism
7. Content that may induce a child to cause harm to himself
8. Content that may promote feelings of ill-will or hostility amongst the public at large or disturb public tranquillity
9. Content that promotes the use or sale of dangerous drugs
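For platforms building moderation workflows around these categories, it can help to encode the statutory list as a fixed taxonomy so that every flagging or takedown decision is traceable to a specific limb of the Act. The TypeScript sketch below is purely illustrative; the identifier names and the decision structure are our assumptions, not labels drawn from the OSA 2025.

```typescript
// Hypothetical sketch: the nine OSA 2025 harmful-content categories encoded
// as a fixed taxonomy. Identifier names are illustrative, not statutory labels.
enum HarmfulContentCategory {
  ChildSexualAbuseMaterial = "CSAM",           // item 1
  FinancialFraud = "FINANCIAL_FRAUD",          // item 2
  Obscene = "OBSCENE",                         // item 3
  Indecent = "INDECENT",                       // item 4
  HarassmentDistressFearAlarm = "HARASSMENT",  // item 5
  IncitementViolenceTerrorism = "INCITEMENT",  // item 6
  ChildSelfHarmInducement = "CHILD_SELF_HARM", // item 7
  IllWillOrPublicTranquillity = "ILL_WILL",    // item 8
  DangerousDrugs = "DANGEROUS_DRUGS",          // item 9
}

// A moderation decision that always cites the statutory category relied upon,
// which supports the audit trail a regulator may later ask for.
interface ModerationDecision {
  contentId: string;
  category: HarmfulContentCategory;
  action: "remove" | "restrict" | "escalate";
  decidedAt: Date;
}
```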
Key Takeaway 6: Whether There Is a Duty to Publish Guidelines to Users
The sixth key takeaway concerns transparency and user communication.
The OSA 2025 is explicit that the responsibilities of licensed applications service providers and licensed content applications service providers do not end with implementing measures to mitigate the risk of exposure to harmful content. They are also required to issue guidelines to users, which must include (i) a description of the measures implemented, and (ii) the terms of use as a guide to users when using the provider’s services. The OSA 2025 further makes clear that such guidelines must be clear, easily accessible, easily understood, and regularly updated.
This requirement is particularly important as it promotes transparency and, more critically, accountability. It ensures that organisations do not simply assert, behind a corporate wall, that measures have been implemented to mitigate the risk of exposure to harmful content, but are instead required to clearly explain the measures adopted, thereby allowing for meaningful public understanding, regulatory scrutiny, and corporate accountability.
Key Takeaway 7: Whether Users May Manage Their Own Online Safety
The seventh key takeaway focuses on giving control back to the user. One of the standout aspects of the OSA 2025 is its recognition that online safety is not solely the responsibility of service providers; users themselves must also be equipped with the necessary tools to manage their own safety.
Hence, the OSA 2025 is explicit that a licensed applications service provider and licensed content applications service provider shall make available on their services sufficient tools and settings to enable users to manage their online safety. The OSA 2025 further specifies that such tools and settings shall include tools that allow users to prevent or limit other users from identifying, locating, or communicating with them on the services of the licensed applications service provider and licensed content applications service provider.
This is not only important but commendable: while it remains crucial for organisations to implement risk-mitigation measures themselves, the OSA 2025 recognises that empowering users with the necessary tools and settings is equally vital to allowing them to actively manage their own online safety.
To meet this requirement, companies could consider the following as a baseline:
• Privacy and communication settings dashboards – allowing users to control who can view their profile, send messages, or find them via search or location-based features.
• Customisable blocking and reporting tools – enabling users to block or restrict interactions from selected accounts, and to manage the level of exposure their personal data receives.
These tools provide tangible mechanisms for users to assert control, complementing broader organisational safety measures.
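To make the baseline concrete, a provider might model these controls as a per-user settings object with safe defaults. The TypeScript sketch below is a minimal illustration under our own assumptions; the field names and default values are not prescribed by the OSA 2025.

```typescript
// Hypothetical per-user safety settings. Field names and the "safe by
// default" choices are illustrative assumptions, not statutory requirements.
interface UserSafetySettings {
  profileVisibility: "everyone" | "contacts" | "nobody";  // who can view the profile
  allowMessagesFrom: "everyone" | "contacts" | "nobody";  // who can initiate contact
  discoverableViaSearch: boolean;    // whether the account appears in search
  shareApproximateLocation: boolean; // location-based discovery features
  blockedAccountIds: Set<string>;    // customisable blocking
}

// Defaults err on the side of safety; users can deliberately relax them.
function defaultSafetySettings(): UserSafetySettings {
  return {
    profileVisibility: "contacts",
    allowMessagesFrom: "contacts",
    discoverableViaSearch: false,
    shareApproximateLocation: false,
    blockedAccountIds: new Set(),
  };
}

// Example gate applied before delivering a message, enforcing both the
// blocking list and the recipient's chosen communication setting.
function mayContact(senderId: string, recipient: UserSafetySettings, isContact: boolean): boolean {
  if (recipient.blockedAccountIds.has(senderId)) return false;
  if (recipient.allowMessagesFrom === "nobody") return false;
  if (recipient.allowMessagesFrom === "contacts") return isContact;
  return true;
}
```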
Key Takeaway 8: Whether There Is a Reporting Mechanism for Harmful Content
The eighth key takeaway highlights the obligation to provide reporting mechanisms for harmful content.
Another crucial aspect of the OSA 2025 is that it is explicit that a licensed applications service provider and licensed content applications service provider shall make available on their services a mechanism that enables users to report any harmful content to the licensed applications service provider or licensed content applications service provider.
At first glance, this may appear to be a standard reporting function. In the broader regulatory context, however, it represents a critical additional layer of protection. The OSA 2025 already imposes a duty on licensed applications service providers and licensed content applications service providers to implement measures to mitigate the risk of exposure to harmful content; ideally, harmful content should not be present on the platform at all. Where such content does arise, the reporting mechanism allows users to flag the issue and enables the provider to take appropriate and timely action.
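Without pre-empting the mechanics of the reporting process (explored in the next article), a minimal intake record might look like the TypeScript sketch below. The schema is our assumption; the OSA 2025 prescribes the existence of the mechanism, not its data model.

```typescript
// Hypothetical minimal shape for a user report of harmful content.
// Field names are illustrative assumptions, not statutory requirements.
interface HarmfulContentReport {
  reportId: string;
  contentId: string;    // the item being reported
  reporterId?: string;  // optional, in case anonymous reporting is supported
  category: string;     // e.g. one of the statutory categories listed earlier
  description: string;  // free-text context from the reporter
  submittedAt: Date;
  status: "received" | "under_review" | "actioned" | "dismissed";
}

// Acknowledge receipt immediately and record a timestamp, so the provider
// can later demonstrate timely handling of the report.
function receiveReport(
  input: Omit<HarmfulContentReport, "reportId" | "submittedAt" | "status">
): HarmfulContentReport {
  return {
    ...input,
    reportId: crypto.randomUUID(), // global in modern browsers and Node 19+
    submittedAt: new Date(),
    status: "received",
  };
}
```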
Given the length of this article and in the interest of keeping matters concise without overwhelming the reader, the next article will explore in greater depth the mechanics of submitting such reports, the expected response process, and the actions that should be taken thereafter.
Key Takeaway 9: Whether There Is Availability of User Assistance
The ninth key takeaway addresses the need for accessible and responsive user assistance. Most of us have, at one point or another, experienced the frustration of inadequate user assistance on digital platforms. That frustration is only amplified when the issue relates to something as critical as online safety, which warrants timely and responsive support.
The OSA 2025 makes it clear that a licensed applications service provider and licensed content applications service provider shall make available on their services a user assistance mechanism that is easily accessible to all types of users and responsive at all times. At a minimum, such a mechanism must allow users: (i) to raise concerns relating to online safety; (ii) to obtain information on the available online safety mechanisms; and (iii) to make general inquiries.
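As a hedged illustration of how these three statutory minimums might be operationalised, the TypeScript sketch below triages incoming assistance requests so that online-safety concerns are handled first. The labels and priority ordering are our own assumptions.

```typescript
// Hypothetical triage covering the three statutory minimums of a user
// assistance mechanism. Labels and priorities are illustrative assumptions.
type AssistanceKind = "safety_concern" | "safety_info" | "general_inquiry";

interface AssistanceRequest {
  userId: string;
  kind: AssistanceKind;
  message: string;
  receivedAt: Date;
}

// Safety concerns warrant the fastest response; requests for information on
// safety mechanisms come next; general inquiries are handled last.
function priorityOf(req: AssistanceRequest): number {
  switch (req.kind) {
    case "safety_concern": return 0;
    case "safety_info": return 1;
    case "general_inquiry": return 2;
  }
}

// Order the queue so online-safety matters are always addressed first.
function triage(queue: AssistanceRequest[]): AssistanceRequest[] {
  return [...queue].sort((a, b) => priorityOf(a) - priorityOf(b));
}
```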
With the development and deployment of AI and other advanced technologies, there is simply no excuse for the absence of such user assistance mechanisms. What ultimately matters is not just availability but quality: user assistance must meaningfully address users’ concerns and meet reasonable expectations, rather than deliver generic, unhelpful, or perfunctory responses that do little to resolve the underlying issue.
Key Takeaway 10: Whether There Are Additional Child Protection Measures for Child Users
The tenth and final key takeaway focuses on child protection, an area of heightened importance in today’s increasingly digital and accessible landscape.
The OSA 2025 is explicit that a licensed applications service provider and licensed content applications service provider shall implement the specified measures to ensure the safe use of their services by child users. At the same time, the law provides regulatory flexibility, allowing licensed applications service providers and licensed content applications service providers to implement alternative measures, provided they can demonstrate to the satisfaction of the Commission that such alternative measures will better ensure the safe use of their services by child users.
Fundamentally, what is critical is that the measures implemented ensure the safe design and operation of the service, including measures (a brief implementation sketch follows the list):
• to prevent access by a user identified to be a child to content suspected to be harmful content;
• to limit the ability of a user identified to be an adult to communicate with a user identified to be a child;
• to limit features that increase, sustain, or extend the use of the service by a user identified to be a child;
• to prevent a user identified to be an adult from viewing the personal information of a user identified to be a child that is available on the service; and
• to control personalised recommendation systems in a manner suitable for child users.
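To make the list concrete, the TypeScript sketch below maps each statutory measure to a hypothetical policy check. The names are ours, and how a user is “identified to be a child” (the age-assurance method) is deliberately left abstract in this illustration.

```typescript
// Hypothetical mapping of the five child-safety measures to policy checks.
// How a user is identified to be a child (age assurance) is left abstract.
interface PlatformUser {
  id: string;
  identifiedAsChild: boolean;
}

const childSafetyPolicy = {
  // (1) prevent child access to content suspected to be harmful
  canView(user: PlatformUser, suspectedHarmful: boolean): boolean {
    return !(user.identifiedAsChild && suspectedHarmful);
  },
  // (2) limit adult-to-child communication
  canMessage(sender: PlatformUser, recipient: PlatformUser): boolean {
    return !(!sender.identifiedAsChild && recipient.identifiedAsChild);
  },
  // (3) limit engagement-extending features (e.g. autoplay, streaks) for children
  engagementFeaturesEnabled(user: PlatformUser): boolean {
    return !user.identifiedAsChild;
  },
  // (4) prevent adults from viewing a child's personal information
  canViewPersonalInfo(viewer: PlatformUser, subject: PlatformUser): boolean {
    return !(!viewer.identifiedAsChild && subject.identifiedAsChild);
  },
  // (5) age-appropriate personalised recommendations
  recommenderProfile(user: PlatformUser): "standard" | "child_safe" {
    return user.identifiedAsChild ? "child_safe" : "standard";
  },
};
```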
With the continued advancement of technology, children are gaining access to digital platforms at increasingly younger ages. Against this backdrop, these measures are not merely important; they are essential to safeguarding child users in today’s digital environment.
Conclusion
In conclusion, the OSA 2025 represents a significant and far-reaching shift in the regulatory landscape for digital services in Malaysia. Its commencement on 1 January 2026 establishes a comprehensive framework that directly impacts applications service providers, content applications service providers, and, to a lesser extent, network service providers. It not only imposes clear statutory duties to mitigate harmful content and safeguard users, particularly children, but also empowers users with the tools and mechanisms to manage their own online safety.
For companies falling under the purview of the OSA 2025, the implications are immediate and material. In-house legal teams and compliance functions should recognise that familiarisation with the law is no longer sufficient, and proactive measures are now required to ensure that governance frameworks, operational processes, and user-facing mechanisms are fully aligned with the legislation.
The Technology Practice Group of Halim Hong & Quek continues to be recognised by leading legal directories and industry benchmarks. Recent accolades include FinTech Law Firm of the Year at the ALB Malaysia Law Awards (2024 and 2025), Law Firm of the Year for Technology, Media and Telecommunications by the In-House Community, FinTech Law Firm of the Year by the Asia Business Law Journal, a Band 2 ranking for FinTech by Chambers and Partners, and a Tier 3 ranking by Legal 500.
If you have any questions on the Online Safety Act 2025, please feel free to reach out to the partners at the Technology Practice Group, Ong Johnson and Lo Khai Yi, for consultation.
About the authors
Ong Johnson
Partner, Head of Technology Practice Group
Fintech, Data Protection, Technology, Media & Telecommunications (“TMT”), IP and Competition Law
[email protected]
Lo Khai Yi
Partner, Co-Head of Technology Practice Group
Technology, Media & Telecommunications (“TMT”), Technology Acquisition and Outsourcing, Telecommunication Licensing and Acquisition, Cybersecurity
[email protected]