- UAE issues landmark Federal Decree‑Law on child digital safety, establishing a comprehensive legal framework to protect children online and promote responsible digital use.
- The law introduces the Child Digital Safety Council and mandates digital platforms to implement child-focused privacy settings, age verification, content filtering, and restrictions on targeted advertising.
- Caregivers, service providers, and platforms share responsibility for shielding children from harmful content, gambling, and online exploitation, reinforcing the UAE’s commitment to child well‑being.
The United Arab Emirates government has issued a landmark Federal Decree‑Law on child digital safety, establishing a comprehensive legal framework to protect young users from online risks and promote the safe and responsible use of digital technologies. The legislation, introduced as the UAE prepares to celebrate 2026 as the Year of the Family, reflects the country’s broader vision of safeguarding children’s well‑being across all environments, including the rapidly evolving digital space.
The new decree‑law aims to shield children from harmful digital content and practices that could negatively affect their physical, psychological or moral development. Recognising how pervasive digital platforms have become in everyday life, the legislation applies not only to internet service providers and digital platforms operating within the UAE, but also to those outside the country that target users within its borders. Covered platforms include websites, search engines, mobile and messaging applications, social media, forums, live streaming services, podcasts, online gaming, video‑on‑demand services, and e‑commerce sites.
A key component of the decree‑law is the establishment of the Child Digital Safety Council, chaired by the Minister of Family. This council will serve as a central advisory and coordinating body to align federal and local entities as well as private sector stakeholders in efforts to enhance digital safety for children. Its functions include proposing relevant policies, legislation and national strategies, recommending awareness campaigns, and conducting research to identify emerging digital risks linked to technological advancements.
Under the new law, digital platforms will be required to implement robust measures to protect minors. These include default privacy settings tailored for children, age verification systems, tools to enforce age-appropriate access, content filtering and blocking mechanisms, age-rating systems, and restrictions on targeted advertising aimed at children. The decree also takes a strong stance on data protection, prohibiting the collection, processing, publication or sharing of personal data of children under the age of 13, unless explicitly permitted under strict conditions for educational or health-related purposes.
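The decree sets out these obligations at the policy level rather than prescribing how platforms must build them. Purely as an illustrative sketch, the snippet below shows what a platform-side check for child-focused defaults and the under-13 data rule could look like; every name, threshold, and the assumption that only educational or health purposes qualify as permitted exceptions are hypothetical and are not drawn from the text of the law.

```python
from dataclasses import dataclass

# Illustrative sketch only: the decree defines obligations, not an API.
# All names, thresholds, and the assumed "education"/"health" allowlist
# are hypothetical, used solely to show what a compliance check could look like.

UNDER_13 = 13  # age below which personal-data processing is prohibited by default


@dataclass
class UserProfile:
    verified_age: int   # outcome of an age-verification step
    is_child: bool      # treated as a minor for child-safety defaults


def default_settings_for(user: UserProfile) -> dict:
    """Apply child-focused defaults: private profile, no targeted ads,
    age-appropriate content filtering."""
    if user.is_child:
        return {
            "profile_visibility": "private",
            "targeted_ads": False,
            "content_filter": "age_appropriate",
        }
    return {
        "profile_visibility": "public",
        "targeted_ads": True,
        "content_filter": "standard",
    }


def may_collect_personal_data(user: UserProfile, purpose: str) -> bool:
    """Block processing of personal data for users under 13 unless the
    purpose falls under the (assumed) permitted exceptions."""
    if user.verified_age < UNDER_13:
        return purpose in {"education", "health"}
    return True
```

A real deployment would hinge on how the implementing regulations define age verification and the permitted exceptions; the sketch only makes the shape of the obligation concrete.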
In addition, the legislation bans children from accessing or participating in online commercial games that involve gambling or betting activities. Internet service providers are similarly bound by obligations to implement content filtering and ensure safe and supervised internet usage. They are also expected to obtain parental consent for the integration of appropriate parental control tools.
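Again purely illustrative: the category labels and blocklist below are assumptions rather than definitions from the decree, intended only to show how an ISP or platform might screen restricted categories such as gambling or betting games for child accounts.

```python
# Hypothetical category taxonomy; the decree does not specify one.
RESTRICTED_FOR_CHILDREN = {"gambling", "betting", "adult"}


def is_accessible(content_categories: set[str], is_child_account: bool) -> bool:
    """Return True if the requested content may be served to this account."""
    if is_child_account and content_categories & RESTRICTED_FOR_CHILDREN:
        return False
    return True


# Example: a game tagged as involving betting is blocked for a child account,
# while an untagged puzzle game is allowed.
assert is_accessible({"game", "betting"}, is_child_account=True) is False
assert is_accessible({"game", "puzzle"}, is_child_account=True) is True
```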
The decree clarifies the responsibilities of child caregivers, who are expected to actively monitor children’s online activities, deploy parental controls, and ensure that children under their care only access platforms that meet enhanced safety standards. The Ministry of Family and relevant local authorities are tasked with developing mechanisms to support and enforce these caregiver obligations, as well as establishing clear procedures for reporting harmful digital content and ensuring swift action against online abuse or exploitation of minors.