Removing Article 19 of the Civil Rights Framework for the Internet Would Threaten Small Digital Communities and Reinforce the Monopoly of Big Tech

Image by James Wainscoat / Unsplash

The Supreme Federal Court (STF) of Brazil will rule in the coming days on the constitutionality of Article 19 of the Civil Rights Framework for the Internet, a topic that has generated intense debate in society. Four lawsuits question the validity of the provision, including a Direct Action of Unconstitutionality (ADI) and an Allegation of Breach of Fundamental Precept (ADPF).

What does the law say?

Article 19 of the Civil Rights Framework for the Internet establishes that:

Art. 19. Com o intuito de assegurar a liberdade de expressão e impedir a censura, o provedor de aplicações de internet somente poderá ser responsabilizado civilmente por danos decorrentes de conteúdo gerado por terceiros se, após ordem judicial específica, não tomar as providências para, no âmbito e nos limites técnicos do seu serviço e dentro do prazo assinalado, tornar indisponível o conteúdo apontado como infringente, ressalvadas as disposições legais em contrário.

Translation: Art. 19. In order to ensure freedom of expression and prevent censorship, internet application providers may only be held civilly liable for damages resulting from third-party content if, after a specific court order, they do not take the necessary measures to, within the scope and technical limits of their service and within the specified timeframe, make the content identified as infringing unavailable, except as otherwise provided by law.

Source: Law 12.965, of April 23, 2014

The problem of content moderation

Article 19 has recently come under public scrutiny because of insufficient content moderation by the large “platforms” (a term I consider inappropriate, but which has become common usage). They demonstrably fail, to a considerable degree, to remove criminal content and disinformation posted on their networks. One reason is that the teams responsible for content moderation at these companies have been prime targets of the mass layoffs carried out since last year, greatly reducing their capacity to act.

Thus, the argument that these platforms need to be regulated is perfectly valid, given the damage that the widespread dissemination of disinformation and criminal content has caused to society and democracy.

Nevertheless, it would be a mistake to think that content moderation is a simple task, or one that can be done with little effort. The reader is invited to play the Moderator Mayhem game, which runs directly in the browser, with no installation required. The player takes on the role of a platform content moderator and faces increasingly difficult situations in which they must decide which content to keep and which to suppress. The choices are not always clear or easy. To anyone who believes they are, I suggest playing the game and then telling me whether you still hold the same opinion.

The conception of the Civil Rights Framework

The Civil Rights Framework for the Internet in Brazil was conceived through a widely participatory process that listened to the voices and arguments of society at large, starting from the draft bill stage. An open platform allowed any person to participate:

The draft bill of the Civil Rights Framework for the Internet was prepared in an innovative way, using the CulturaDigital.Br platform of the Ministry of Culture. The use of an existing platform facilitated the work of the SAL and was essential for the creation of the draft bill as it was done.

Source: Wikipedia

The process was so innovative and participatory that it became the subject of a master’s dissertation.

Article 19 of the MCI was elaborated from an intense multi-sectoral debate, which included spaces for broad participation of civil society, government and the private sector. (…)

In other words, Article 19 determined that the final word on what is or is not lawful on the platforms always belongs to the judiciary, since these companies can only be held liable for third-party content if they fail to comply with a court order to remove it. They are free to adopt their own rules and content moderation operations, but they will not be required to pay compensation for failing to meet an extrajudicial demand from a user.

Source: João Pedro Favaretto Salvador and Tatiane Guimarães in an article for the Getúlio Vargas Foundation

The intense debates also continued during the legislative process of the Bill in the houses that make up the National Congress.

After it was sanctioned, the Civil Rights Framework for the Internet became a model for internet regulation worldwide, inspiring other countries as they discussed how to develop their own legislation.

What would happen without Article 19?

If the STF decides to annul Article 19 of the Civil Rights Framework for the Internet, anyone who maintains an internet service where third parties can post content could, in theory, be held civilly liable for damages caused by that content. This applies both to the big platforms of the Big Tech companies (or, as the “Tech Won’t Save Us” podcast calls them, the data vampires) and to small forums and websites.

In the case of Big Tech, they could absorb the increased costs by rehiring and expanding their moderation teams (known as “Trust and Safety”). They would likely configure their automated moderation algorithms to be more restrictive, which could amplify something that already occurs: the unjustified removal of perfectly legal content, the so-called false positive. This problem has received little attention lately, given the social impact of false negatives (illegal content that remains on the platform even after being reported). In addition, these companies have large teams of well-paid lawyers to handle any litigation over third-party content.

In the end, however, Big Tech comes out on top: despite the higher costs, a legal environment without protection against liability for third-party content would prevent smaller, innovative competitors with fewer financial resources from emerging to take their place. That would consolidate the oligopoly of the few existing giants and further concentrate the market. It is no coincidence that Big Tech has shown little real innovation for a long time. Instead, these companies seem more focused on lobbying for a very specific type of state regulation, one that would further consolidate their position and prevent the emergence of new competitors.

This has also been a recurring concern in other countries. In the European Union, the Digital Services Act, enacted in 2022, sets clearly distinct rules for very large online platforms. In the US, Mike Masnick made a similar argument about the possible repeal of Section 230 of the Communications Decency Act that has been raised in Congress:

“Big Tech” is absolutely willing to compromise on Section 230, because they know that all it does is play into their hands. It’s all the other sites that get screwed because of litigation and liability. Meta and Google and the other big tech companies have buildings full of lawyers. Removing Section 230 may harm them at the margins, but they’ll make up for it by having all the smaller competition wiped out.

Source: Techdirt, May 2024

For small sites and forums, on the other hand, the situation is very different. Often they generate no income and are only a source of expense, effort and work for the person who maintains them. The mere possibility of bearing the risk of a lawsuit over a third-party post could lead many to consider shutting down, given their lack of resources both to quickly monitor everything written by third parties and to defend themselves in legal proceedings, as well as the non-commercial nature of these spaces.

Therefore, it is essential that Article 19 of the Civil Rights Framework for the Internet not be completely revoked, but rather that specific provisions be added for situations applicable only to large platforms. Beyond the European regulation already in force, something similar has been defended here in Brazil by experts such as Ronaldo Lemos, who argued during a public hearing at the STF in 2023:

Regarding Article 19, my personal view is that instead of revoking it due to unconstitutionality, the best path is actually to modulate its application, providing for specific situations different from its general rule, again through the National Congress.

Source: Voices of Regulation - Public Hearing on 3/28/2023

The future of small digital communities

If Article 19 completely loses its validity, the risks of maintaining a small digital community or non-profit forum that can host third-party content will become enormous. It would be regrettable to lose all the content and accumulated knowledge in these spaces. In the face of liability for third-party content, the sensible alternative that preserves this content base might be to freeze the community or forum, disallowing new posts and leaving it only as a reference source, in the hope of still helping those seeking information. The community would, however, unfortunately lose its interactive character and the possibility of one person directly helping another, for example by answering questions.

Despite the challenges, it is essential to find a balanced solution that preserves freedom of expression and innovation on the internet while establishing clear, specific rules for large platforms to mitigate the harm caused by the spread of harmful content, without harming small websites. Any regulation that is adopted must apply exclusively to large platforms, as the European Union has already done.