Murray Advogados
Murray News

Supreme Court expands big tech liability for user content

In historic ruling, companies may now face damages if they fail to remove criminal posts

06/29/2025

The Supreme Federal Court (STF) ruled on Thursday (26), by 8 votes to 3, to expand the legal liability of social media platforms for content posted by users. Companies will be required to act proactively to remove posts containing “serious crimes” and could face civil penalties if they fail to do so.

The court declared partially unconstitutional Article 19 of the Internet Civil Framework, which stated that tech companies could only be held civilly liable if they disobeyed a court order to remove content. Going forward, this rule will apply only to crimes against honor, such as libel, slander, and defamation.

Voting in favor of expanding platform liability were Justices Dias Toffoli and Luiz Fux, who served as the case rapporteurs, along with Luís Roberto Barroso, Flávio Dino, Cristiano Zanin, Gilmar Mendes, Alexandre de Moraes, and Cármen Lúcia. Dissenting were Justices André Mendonça, Edson Fachin, and Nunes Marques, the last to cast his vote.

Instead of Article 19, Article 21 of the Internet Civil Framework now applies to criminal content. This provision requires platforms to remove posts after being notified. If they fail to do so and a court deems the content criminal, the platform may be held liable. Originally, Article 21 applied only to non-consensual nudity; the STF has now expanded it to cover any type of crime or unlawful act.

The court also ruled that platforms may be held liable, even without notification or a court order, if they host or promote paid ads featuring unlawful content or maintain fake bot accounts.

Additionally, the STF established that platforms must fulfill a “duty of care” concerning “serious crimes.” If they fail to remove such posts on their own, they may be subject to civil penalties—even without prior notice or a judicial ruling.

Listed as “serious crimes” are: anti-democratic acts; terrorism or preparatory acts; incitement, assistance, or encouragement of suicide or self-harm; incitement to discrimination based on race, color, ethnicity, religion, national origin, sexuality, or gender identity; crimes against women; sexual offenses against vulnerable individuals; child pornography; crimes against children and adolescents; and human trafficking.

According to the ruling, platforms will not be penalized for isolated serious crimes unless they are notified and fail to act. Civil liability will apply when there is a high volume of such content and the platform takes no action. In practice, the decision forces companies to adopt active monitoring practices to curb criminal behavior on their networks.

Justices also required platforms to adopt self-regulation policies that include “notification systems, due process mechanisms, and annual transparency reports on extrajudicial takedown requests, ads, and content promotion.” Companies must also provide dedicated user-support channels.

The court determined that companies operating in Brazil must establish and maintain a local headquarters and legal representative. Marketplaces will be held liable under the Consumer Protection Code. Messaging services such as WhatsApp remain subject to Article 19, but only in relation to interpersonal communications.

Fake profile case triggered ruling

The STF analyzed two cases. Justice Dias Toffoli was the rapporteur in a case examining the constitutionality of Article 19, which involved a fake Facebook profile. Justice Luiz Fux reported on a case concerning platform liability for third-party content; it involved a ruling ordering Google to remove a community from the now-defunct Orkut.

Mr. Toffoli, the first to vote, proposed applying Article 21 instead of Article 19 in December of last year. Mr. Fux supported holding platforms liable when they have “clear knowledge of unlawful acts” and fail to act. He cited hate speech, racism, pedophilia, incitement to violence, advocacy of the violent overthrow of the democratic state, and support for coups as examples.

Chief Justice Luís Roberto Barroso proposed that Article 19 be retained only for honor-related offenses. For all other crimes, he argued Article 21 should apply. His position prevailed.

The dissenters—Justices André Mendonça, Edson Fachin, and Nunes Marques—argued that Article 19 should be fully preserved.

Tech firms raise concerns

In a statement, Attorney General Jorge Messias called the ruling “historic.” “It is a civilizational milestone and aligns with measures adopted by other democratic countries aimed at better protecting society from crimes, fraud, and hate speech that threaten citizens and democracy in the digital environment,” he said.

Google, a party in the case reported by Mr. Fux, said it is still reviewing the decision. “Google has expressed concerns about changes that may impact free speech and the digital economy. We are evaluating the approved ruling, especially the expansion of takedown obligations under Article 21, and their impact on our products. We remain open to dialogue.”

Meta, parent company of Facebook and Instagram and a party in the case under Mr. Toffoli, expressed concern in a statement. “Weakening Article 19 of the Internet Civil Framework introduces legal uncertainty and will have consequences for free expression, innovation, and digital economic growth, significantly increasing the risk of doing business in Brazil.”

By Tiago Angelo and Isadora Peron, Valor — Brasília
Source: Valor International
https://valorinternational.globo.com/
June 29, 2025 / by Gelcy Bueno
Tags: big tech liability for user content, Supreme Court