Brazil Issues Implementing Regulations for the Children and Adolescents Online Safety Act (Digital ECA)
In summary
On March 19, 2026, the Federal government published the implementing decree of the Children and Adolescents Online Safety Act (Digital ECA – Law No. 15,211/2025). The Decree also establishes the National Policy for the Promotion and Protection of the Rights of Children and Adolescents in the Digital Environment and creates the National Notification Screening Center, operated by the Federal Police, to centrally receive reports of digital crimes against minors. On the same date, a decree was also published to approve the new structure of the Brazilian National Data Protection Authority (ANPD), consolidating its functional, technical, decision-making, administrative, and financial autonomy as a regulatory agency.
Key points of the Decrees
Two days after the Digital ECA entered into force, the Federal government issued decrees that complement and operationalize the new law. Decree No. 12,880/26 is the main regulatory instrument: with 54 articles, it details the obligations of digital service providers regarding prevention, protection, age assessment, inappropriate and prohibited content, and parental supervision, in addition to establishing the National Policy for the Promotion and Protection of the Rights of Children and Adolescents in the Digital Environment and creating the National Notification Screening Center within the Federal Police. Decree No. 12,881/26, in turn, approves the new structure of the ANPD, which will operate with full institutional autonomy as the sector regulator, effective as of April 8, 2026.
Among the main points of Decree No. 12,880/26, we highlight:
- Creation of the National Policy for the Promotion and Protection of the Rights of Children and Adolescents in the Digital Environment, intended to ensure, as a priority, the full protection of children and adolescents in the digital environment and to develop guidelines and recommendations, among other measures;
- Obligation for providers of digital products or services aimed at children or adolescents, or likely to be accessed by them, to implement mechanisms to prevent excessive, problematic, or compulsive use, prohibiting features such as hiding natural stopping points or automatically triggering new content without a user request;
- Prohibition on adopting manipulative, deceptive, or coercive practices in digital products or services aimed at, or likely to be accessed by, children and adolescents, such as making it difficult to cancel services or exploiting cognitive vulnerabilities by creating a false sense of urgency, among others;
- For providers of generative artificial intelligence capable of interacting with users based on natural-language instructions, the Decree creates transparency obligations, duties to prevent behavioral manipulation, and requirements to assess algorithmic risk and to implement safeguards that protect the development of children and adolescents;
- Changes to the criteria for age ratings of electronic games and applications;
- Differentiation between content that is prohibited and content that is inappropriate for children and adolescents, with different obligations depending on the type of content. The Decree sets out a list of prohibited content, including weapons, ammunition and explosives, alcoholic beverages, smoking products, gambling, loot boxes, and pornographic content, among others;
- For services with editorial control or with licensed music or literary content, the Decree waives age assessment provided the service offers children’s accounts with suitable content and implements parental supervision with a blocking system. Providers of journalistic and sports content with editorial control are also exempt from age assessment;
- With respect to electronic games with loot boxes, providers must implement effective user age-verification mechanisms and prevent children and adolescents from accessing games with loot boxes, or alternatively make available to children and adolescents versions of such games without loot boxes. If the game fully restricts access to loot box functionality by default, the Decree waives age verification;
- In addition, Decree No. 12,880/26 regulates in fairly generic terms the age-assessment and age-verification regime applicable to digital products and services aimed at children and adolescents, leaving detailed regulation to the ANPD. The Decree sets criteria for age-assessment and age-verification mechanisms, such as proportionality to risk, data minimization, security, non-discrimination, transparency, and prohibition on using personal data for other purposes. It also provides that app stores and operating systems must provide, free of charge, limited age signals about users to providers of digital products or services—without disclosing the exact date of birth or user profiling data;
- Decree No. 12,880/26 also provides that advertising that exploits a child’s lack of judgment is deemed abusive, and that providers of digital products or services that offer advertising to children and adolescents must not use profiling, emotional analysis, or augmented, extended, or virtual reality techniques. The ANPD will further regulate mechanisms for providers to prevent or mitigate children’s and adolescents’ access or exposure to prohibited content;
- Regarding content produced by children and adolescents, the Decree provides that providers of digital products and services must require judicial authorization when the content is monetized or boosted and exploits the image or routine of children and adolescents;
- With respect to combating digital crimes against minors, Decree No. 12,880/26 creates the National Notification Screening Center (Screening Center), operated by the Federal Police, which will centralize the receipt, screening, and forwarding of reports of criminal content sent by platforms, covering sexual exploitation, abuse, grooming, and kidnapping of minors. Under the new Decree, platforms must remove such content immediately upon identifying it and notify the Screening Center, without the need for a prior court order, while preserving the data for investigative purposes;
- The Decree also details reporting and transparency obligations for providers of digital products and services, with specific obligations for platforms with more than 1 million users under 18. The Decree requires providers of digital products or services aimed at children and adolescents, or likely to be accessed by them, to carry out an impact assessment on children’s safety and health and to publish a summarized version of the report;
- Finally, it is important to note that Decree No. 12,880/26 leaves several points open for further regulation by the ANPD, including the technical requirements for age assessment, the parameters for parental supervision, and a gradual implementation schedule for the obligations;
- Decree No. 12,881/26, effective as of April 8, 2026, approves the new internal structure of the ANPD. The Authority will fully exercise its powers, with functional, technical, decision-making, administrative, and financial autonomy. Its powers expressly related to the Digital ECA include regulating and overseeing Law No. 15,211/2025; establishing minimum requirements for parental supervision mechanisms; defining age-verification standards; and applying the warning and fine sanctions provided for in the Digital ECA.
