The debate over social media regulation in Brazil is at the center of legal, political, and social discussions, especially in light of Article 19 of the Brazilian Internet Bill of Rights (Law 12.965/2014). The issue involves fundamental questions about freedom of expression, the responsibility of digital platforms, and the boundaries between regulation and censorship.

Article 19 of Law 12.965/2014 establishes that internet application providers, such as social networks, can only be held civilly liable for damages caused by third-party content if, after a specific court order, they fail to take action to remove the offensive content. In other words, the general rule is that there is no obligation to remove content without a judicial decision, except in legal exceptions (such as child pornography and copyright infringement).

The purpose of the article is to protect freedom of expression and prevent censorship, avoiding situations where platforms remove content due to extrajudicial pressure or subjective criteria, which could restrict public debate and the circulation of ideas.

Supporters of Article 19 emphasize that requiring a court order for content removal is a fundamental safeguard for freedom of expression in the digital environment, serving as an effective counterbalance against arbitrary censorship practices, whether by the state or private entities.

Some critics argue that the Supreme Federal Court (STF), by revisiting the liability regime for digital platforms, may exceed its institutional limits, potentially creating a censorship-prone environment. This risk becomes more pronounced with the possibility of relaxing Article 19 of the Internet Bill of Rights, which would allow content removal without a mandatory court order.

The fear is that excessive regulation could lead to automatic, preventive moderation of legitimate expression, with platforms adopting overly restrictive policies to shield themselves from potential sanctions.

The STF is currently ruling on Extraordinary Appeal (RE) 1037396, which addresses the constitutionality of Article 19 of Law 12.965/2014 (Internet Bill of Rights), the provision requiring a prior and specific court order before internet providers, websites, and social media application managers can be held civilly liable for damages caused by unlawful third-party content. Below is a summary of the votes already cast by the justices:

Justice Flávio Dino argued that platforms should be directly liable in cases of child pornography, incitement to suicide, crimes against the Democratic Rule of Law, and human trafficking. He proposed that the Attorney General’s Office act as a supervisory body until new legislation is enacted and suggested liability only in cases of “systemic failure,” meaning repeated tolerance of illegal content, not isolated incidents.

Justice Cristiano Zanin considered Article 19 insufficient to protect fundamental rights and advocated its partial unconstitutionality. For Zanin, platforms should be liable for “manifestly illegal” content, especially when they engage in algorithmic curation. He proposed liability after extrajudicial notification for social networks and, for neutral platforms, only after a court order.

Justice Dias Toffoli voted for the unconstitutionality of Article 19, arguing that platforms should remove illegal content upon extrajudicial notification, without a court decision. In his view, the current rule does not adequately protect fundamental rights given today’s digital risks.

Justice Luiz Fux, rapporteur of the companion appeal judged jointly (RE 1057258), agreed with Toffoli, declared Article 19 unconstitutional, and went further: he argued that a simple notification from users or victims should create an immediate obligation for platforms to remove content, without a court order. He advocated immediate removal of content deemed illegal—such as hate speech, racism, pedophilia, and advocacy of coups—upon victim notification, without judicial intervention. Fux emphasized the platforms’ “duty of care” and called for efficient complaint channels.

Chief Justice Luís Roberto Barroso adopted an intermediate position, voting for partial unconstitutionality of Article 19: he maintained the requirement of a court order only for crimes against honor (libel, slander) but supported direct platform liability in serious cases such as child pornography, incitement to suicide, human trafficking, terrorism, and attacks on the Democratic Rule of Law.

Justice André Mendonça cast the only dissenting vote, defending the full maintenance of Article 19. He argued that the rule is essential to freedom of expression and that any changes should be made by Congress. He also opposed the blocking of user profiles by unilateral platform decision.

The majority of justices believe that, given legislative inaction, the Supreme Court must reconcile the Internet Bill of Rights with the Constitution, establishing a new liability standard for platforms until Congress legislates on the matter. Chief Justice Barroso stressed that the Court is not legislating but deciding concrete cases. However, this stance has been criticized by legal scholars who argue that Congress already deliberated by approving the Internet Bill of Rights and that not amending the law is also a legitimate decision.

With this majority, a new regulatory framework has emerged that could expand platform liability even without new legislation from Congress. The decision runs counter to the position of the Chamber of Deputies, which in 2023 chose not to vote on Bill 2630/2020, known as the “Fake News Bill” (or, to its critics, the “Censorship Bill”), thereby leaving the Internet Bill of Rights unchanged.

Ultimately, changes to this regulatory framework—whether through judicial relaxation of the rule or overly restrictive legislative provisions—could open the door to censorship by both the state and digital platforms.

The real challenge lies in building a regulatory model that safeguards fundamental rights, ensures due process, and simultaneously prevents state and private censorship, fostering a digital environment characterized by democracy and pluralism of voices.

DOES REGULATING SOCIAL MEDIA CONSTITUTE CENSORSHIP?

Author: Alfredo Chagas Chebel • Email: alfredo.chebel@ernestoborges.com.br