Insights

Digital Platforms and Child Protection: Implementation of Minister of Communication and Digital Regulation No. 9 of 2026

By Nathaniel Alexander Putra Salindeho / 14 April 2026


Understand the obligations of digital platforms to protect children under Minister of Communication and Digital Regulation No. 9 of 2026, including age verification, risk assessment, and sanctions for non-compliance.

Key Points

  1. Digital platforms are required to conduct a self-assessment and report it to the government within three months of the regulation's enactment. Reports that are inaccurate or misleading may result in administrative sanctions and may be referred to law enforcement authorities.
  2. Social media services are automatically categorized as high-risk and are therefore required to deactivate accounts belonging to children under the age of 16 and to provide parental control features.
  3. These obligations apply not only to social media, but to all digital platforms that can or may be accessed by children, including gaming applications, streaming services, marketplaces, and educational platforms.

Background

Protection of children in the digital space has become an important concern in Indonesia’s regulatory framework. In February 2025, the government issued Government Regulation No. 17 of 2025 on the Governance of Electronic System Operations for Child Protection, which sets out the obligations of digital platforms toward child users. However, the provisions of the regulation remain general in nature and require more technical implementing rules.

As a follow-up, Minister of Communication and Digital Regulation No. 9 of 2026 was issued to provide more detailed rules on the obligations of digital platform operators, including user age verification, child risk assessments, as well as supervision and enforcement mechanisms. With the enactment of this regulation, the protection of children in the digital space has become an operational standard that must be complied with and can be legally enforced. Within this regulatory framework, a child is defined as any individual under 18 years, which in principle aligns with the definition of a child under the Child Protection Law.

Who is affected?

Anyone who provides, manages, or operates a digital platform accessible via the internet must comply with this regulation, including social media, streaming services, online games, marketplaces, and other applications that can be accessed by children.

User Age Verification

Platforms must publicly disclose the minimum age required to access their services and must categorize child users into five legally defined age segments: 3–5 years, 6–9 years, 10–12 years, 13–15 years, and 16–17 years.

This classification serves as the basis for restrictions on account ownership, as follows:

  • Children under the age of 13 may only hold accounts on platforms that are specifically designed for children and classified as low-risk, with parental consent.
  • Children aged 13 up to under 16 may only hold accounts on low-risk platforms, also with parental consent.
  • Children aged 16 up to under 18 may hold accounts on any platform, but parental consent is still required.

Platforms must also communicate age restrictions clearly and transparently. Information about age requirements must be presented in language that is understandable both to children and to parents or guardians, and it must remain accessible throughout the entire lifecycle of the product or service.
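As a rough engineering sketch, the age segmentation and account-ownership rules summarized above could be encoded as follows. All names here (such as `may_hold_account` and its parameters) are hypothetical illustrations of the article's summary, not terms from the regulation itself:

```python
from typing import Optional

# The five legally defined age segments (inclusive bounds), per the article.
SEGMENTS = [(3, 5), (6, 9), (10, 12), (13, 15), (16, 17)]

def age_segment(age: int) -> Optional[str]:
    """Return the age-segment label for a child, or None if outside all bands."""
    for lo, hi in SEGMENTS:
        if lo <= age <= hi:
            return f"{lo}-{hi}"
    return None

def may_hold_account(age: int, low_risk: bool,
                     child_specific: bool, parental_consent: bool) -> bool:
    """Apply the account-ownership rules by age band.

    Every band requires parental consent; ages 18 and above fall outside
    these child-specific rules, so this sketch returns False for them.
    """
    if age >= 18 or not parental_consent:
        return False
    if age < 13:
        # Under 13: only child-specific platforms classified as low-risk.
        return child_specific and low_risk
    if age < 16:
        # 13 up to under 16: only low-risk platforms.
        return low_risk
    # 16 up to under 18: any platform, consent still required.
    return True
```

For example, `may_hold_account(14, low_risk=True, child_specific=False, parental_consent=True)` returns `True`, while the same 14-year-old on a high-risk platform is refused.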

Risk Assessment: High or Low?

Each product, service, or feature must be assessed for its level of risk to children. There are seven aspects that form the basis of this self-assessment:

  • Contact with strangers;
  • Exposure to harmful content (pornography, violence, etc.);
  • Exploitation of children as consumers;
  • Threats to children’s personal data;
  • Potential for addiction;
  • Disruption to psychological health;
  • Physiological disturbances (physical health).
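Purely as an illustration, the seven aspects can be modeled as a compliance checklist. The aggregation rule below (any flagged aspect makes the service high-risk) is an assumption made for this sketch, not the regulation's actual assessment methodology:

```python
# Internal labels for the seven assessment aspects (names are illustrative).
RISK_ASPECTS = [
    "contact_with_strangers",
    "harmful_content_exposure",
    "consumer_exploitation",
    "personal_data_threats",
    "addiction_potential",
    "psychological_harm",
    "physiological_harm",
]

def classify_risk(flags: dict) -> str:
    """Assumed aggregation rule: flagging any aspect yields 'high', else 'low'."""
    unknown = set(flags) - set(RISK_ASPECTS)
    if unknown:
        raise ValueError(f"unknown aspect(s): {sorted(unknown)}")
    return "high" if any(flags.get(a, False) for a in RISK_ASPECTS) else "low"
```

Note that, per the regulation, social networking and social media services are classified as high-risk by default, regardless of how such a checklist comes out.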

Social Media Automatically Classified as High-Risk & Parental Controls

Social networking services and social media platforms are by default classified as high-risk platforms. A service is categorized as a social networking or social media service if it enables social interaction between users, allows users to connect with one another, and/or allows users to upload content to the platform. Consequently, any provider that meets these criteria must disable accounts belonging to children under the age of 16, unless it can demonstrate a low-risk profile through the official assessment process.

For platforms that allow children to interact with other users who are not known to them, parental control features must be available. Parents must be able to grant or withdraw consent for their child’s activities on the platform.

Reporting Obligations to the Government

Digital platforms are required to conduct a self-assessment and report the results to the Minister through the Director General responsible for the supervision of the digital space and personal data protection. This report must be accurate and legally accountable. If the submitted data is proven to be inaccurate or misleading, the platform may face administrative sanctions and may even be reported to law enforcement authorities.

What happens in case of violation?

Supervision is carried out by the Ministry of Communication and Digital. If an alleged violation is identified, whether through government monitoring, public reports, or user complaints, the platform will undergo an examination process. Sanctions that may be imposed include written warnings, administrative fines, temporary suspension of services, and access termination.

Platforms that disagree with the imposed sanctions may file a formal objection within 21 working days. If they remain dissatisfied with the outcome, they may file a lawsuit with the State Administrative Court.

What Are the Implications for Businesses?

For businesses operating digital platforms in Indonesia, this regulation introduces several concrete obligations that must be prepared immediately:

  • Conducting an internal audit of products and features that may be accessed by children;
  • Preparing self-assessment documentation before products can be accessed by children;
  • Implementing reliable age verification technology;
  • Developing privacy policies and parental control mechanisms in accordance with the regulation.

The deadline for submitting the first self-assessment report is three months from the regulation's enactment (6 June 2026), so digital platforms must act quickly.

Key Contacts

Please get in touch with the designated key contacts via phone or email if you have any inquiries or would like to learn about the potential impact on your business.

Ivor I. Pasaribu

Managing Partner
+62 21 2276 1962

Even Alex Chandra

Partner
+62 21 2276 1962



Office Address

Sovereign Plaza 6th Floor, Unit C
Jl. TB Simatupang Kav. 36
Jakarta Selatan 12430, Indonesia

Telephone : +62 21 2276 1962
Facsimile : +62 21 2276 1963
Email : info@ignoslaw.com

About Us

IGNOS is a top-notch Indonesian full-service law firm that values genuine alliances with clients.

Our team of experienced Lawyers in Jakarta, Indonesia, is here to assist you. Contact our reputable law firm for legal advice and representation.