In historic ruling, companies may now face damages if they fail to remove criminal posts
06/29/2025
The Supreme Federal Court (STF) ruled on Thursday (June 26), by 8 votes to 3, to expand the legal liability of social media platforms for content posted by users. Companies will be required to act proactively to remove posts containing “serious crimes” and could face civil penalties if they fail to do so.
The court declared partially unconstitutional Article 19 of the Internet Civil Framework, which stated that tech companies could only be held civilly liable if they disobeyed a court order to remove content. Going forward, this rule will apply only to crimes against honor, such as libel, slander, and defamation.
Voting in favor of expanding platform liability were Justices Dias Toffoli and Luiz Fux, who served as the case rapporteurs, along with Luís Roberto Barroso, Flávio Dino, Cristiano Zanin, Gilmar Mendes, Alexandre de Moraes, and Cármen Lúcia. Dissenting were Justices André Mendonça, Edson Fachin, and Nunes Marques, the last to cast his vote.
Instead of Article 19, Article 21 of the Internet Civil Framework now applies to criminal content. This provision requires platforms to remove posts after being notified. If they fail to do so and a court deems the content criminal, the platform may be held liable. Originally, Article 21 applied only to non-consensual nudity; the STF has now expanded it to cover any type of crime or unlawful act.
The court also ruled that platforms may be held liable, even without notification or a court order, if they host or promote paid ads featuring unlawful content or maintain fake bot accounts.
Additionally, the STF established that platforms must fulfill a “duty of care” concerning “serious crimes.” If they fail to remove such posts on their own, they may be subject to civil penalties—even without prior notice or a judicial ruling.
Listed as “serious crimes” are: anti-democratic acts; terrorism or preparatory acts; incitement, assistance, or encouragement of suicide or self-harm; incitement to discrimination based on race, color, ethnicity, religion, national origin, sexuality, or gender identity; crimes against women; sexual offenses against vulnerable individuals; child pornography; crimes against children and adolescents; and human trafficking.
According to the ruling, platforms will not be penalized for isolated instances of serious crimes unless they are notified and fail to act. Civil liability will apply when there is a high volume of such content and the platform takes no action. In practice, the decision forces companies to adopt active monitoring practices to curb criminal behavior on their networks.
Justices also required platforms to adopt self-regulation policies that include “notification systems, due process mechanisms, and annual transparency reports on extrajudicial takedown requests, ads, and content promotion.” Companies must also provide dedicated user-support channels.
The court determined that companies operating in Brazil must establish and maintain a local headquarters and legal representative. Marketplaces will be held liable under the Consumer Protection Code. Messaging services such as WhatsApp remain subject to Article 19, but only in relation to interpersonal communications.
Fake profile case triggered ruling
The STF analyzed two cases. Justice Dias Toffoli was the rapporteur in a case examining the constitutionality of Article 19, which involved a fake Facebook profile. Justice Luiz Fux was the rapporteur in a case concerning platform liability for third-party content, which involved a ruling ordering Google to remove a community from the now-defunct Orkut.
Mr. Toffoli, the first to vote, proposed in December of last year that Article 21 apply instead of Article 19. Mr. Fux supported holding platforms liable when they have “clear knowledge of unlawful acts” and fail to act. He cited hate speech, racism, pedophilia, incitement to violence, advocacy of the violent overthrow of the democratic state, and support for coups as examples.
Chief Justice Luís Roberto Barroso proposed that Article 19 be retained only for honor-related offenses. For all other crimes, he argued Article 21 should apply. His position prevailed.
The dissenters—Justices André Mendonça, Edson Fachin, and Nunes Marques—argued that Article 19 should be fully preserved.
Tech firms raise concerns
In a statement, Attorney General Jorge Messias called the ruling “historic.” “It is a civilizational milestone and aligns with measures adopted by other democratic countries aimed at better protecting society from crimes, fraud, and hate speech that threaten citizens and democracy in the digital environment,” he said.
Google, a party in the case reported by Mr. Fux, said it is still reviewing the decision. “Google has expressed concerns about changes that may impact free speech and the digital economy. We are evaluating the approved ruling, especially the expansion of takedown obligations under Article 21, and their impact on our products. We remain open to dialogue.”
Meta, parent company of Facebook and Instagram and a party in the case under Mr. Toffoli, expressed concern in a statement. “Weakening Article 19 of the Internet Civil Framework introduces legal uncertainty and will have consequences for free expression, innovation, and digital economic growth, significantly increasing the risk of doing business in Brazil.”