For at least a decade, human rights institutions have acknowledged the internet's potential to enable the realization of a range of human rights. Digital technologies have been transformative tools, allowing people to speak out against arbitrary acts of public and private powers, empowering the expression of historically vulnerable, marginalized, and silenced groups, catalyzing civic organization and participation, and facilitating innovative ways to collectively build and share knowledge. Throughout this period, the right to seek, receive, and impart information has enabled the exercise of other rights and strengthened the internet ecosystem, though not without backlashes and critical challenges.

The current discussion about platform regulation in Brazil, both in the draft bill known as “PL 2630” and in constitutional cases pending before the country’s Supreme Court, shows that much effort is going into addressing these challenges, but also that proper responses are not simple to craft. These responses must be tailored to safeguard the positive potential of digital technologies and the essential role that freedom of expression, including access to information, plays in preserving democratic societies.

Quick Background

PL 2630, also known as the “Fake News Bill”, was first introduced in the Brazilian Senate in 2020. The push from civil society organizations and coalitions, such as Coalizão Direitos na Rede, to improve the text, and their work with the bill’s rapporteur in the Chamber of Deputies, were critical in neutralizing threats like the traceability mandate for end-to-end encrypted messages. At the time, Brazilian digital rights groups also stressed that the regulation should focus on content moderation processes (e.g. transparency and due process rules) rather than on restricting certain types of content. After the release of a new draft text in early 2022, the bill remained stalled in Brazil’s Chamber of Deputies until the beginning of 2023.

Following the far right's failed attempt earlier this year to overthrow the new administration of President Lula da Silva, and a spike in violent attacks in Brazilian schools, PL 2630 has consolidated its position as the legislative path for addressing broader concerns about the use of digital technologies in contexts of social unrest. To that end, the Executive branch proposed a new text to the bill’s rapporteur that introduced several changes, drawing on laws like the German NetzDG and the EU Digital Services Act (DSA), and on draft legislation such as the controversial UK Online Safety Bill.

The latest published version of the bill incorporates some of these proposals, such as risk assessment rules, duty of care obligations, and new exceptions to Brazil’s general rule on online intermediary liability. According to Article 2 of the bill, it applies to social networks, search engines, and instant messaging services constituted as legal entities and with more than ten million monthly users in Brazil. Although the DSA is often cited as an inspiration and a democratic precedent grounding the new proposal, the revamped bill differs from it in important ways and still fails to ensure checks and balances suited to the Brazilian context and institutional framework.

In parallel, the country's Supreme Court has pending cases about online intermediary liability (general repercussion issues 533 and 987) and the blocking of websites and applications by judicial authorities (ADI 5527 and ADPF 403). Currently, the general online intermediary liability regime in Brazil is set by Article 19 of Law n. 12.965/2014, known as Marco Civil da Internet. Under Article 19, internet applications can be held liable for user content only when they fail to comply with a judicial decision ordering the removal of infringing content. There are exceptions under which an extrajudicial notice can make platforms liable for third-party content: copyright infringement, unauthorized disclosure of private images containing nudity or sexual activity, and content involving child sexual abuse.

Some Supreme Court justices have expressed the opinion that Marco Civil's general regime needs an update to stiffen online intermediary liability rules, and the pending constitutional cases may be a way to do so if Congress does not address the issue in a timely manner. While the increasingly powerful role of major internet applications has prompted debates and initiatives to review current intermediary liability regimes across geographies, there are key questions we must ask, tools we should consider, and lessons learned to build on before introducing changes that can seriously impact protected expression and people's ability to strengthen their voices and rights using digital technologies.

In turn, the Supreme Court's ruling on the blocking of websites and applications has been on hold since 2020, when Justice Alexandre de Moraes requested the file for review, returning it only in March this year. These cases refer to the WhatsApp blockings in Brazil in 2015 and 2016 and raise the question of whether authorities can require an internet application to undermine its privacy and security features by design, i.e., end-to-end encryption, in order to disclose user communications data within a criminal investigation. The ruling began in 2020 with key Justices' votes supporting privacy and security protections inscribed in digital systems' architecture and rejecting interpretations of Brazilian law that would allow state-ordered blocking aimed at impairing such protections. Unfortunately, the possible outcome of resuming this ruling in the current context is unpredictable. Following its pioneering role in recognizing personal data protection as a fundamental right in Brazil's Constitution, it is crucial that the Supreme Court endorse the votes of Justices Rosa Weber and Edson Fachin in favor of robust privacy and security by design.

Despite moves from the Executive branch and the Supreme Court to change Brazil's current legal framework, political actors have agreed, at least for now, that Congress is the proper venue for a democratic debate on platform regulation. We agree. It's relevant, then, to look into the draft law under discussion. While it contains positive elements, we must also highlight points that have yet to be improved.

Important Points of Concern

PL 2630 purports to strengthen users' rights in the face of the power of large internet applications, like Facebook, YouTube, and Twitter. Yet there are crucial points of concern that Brazil’s regulation debate and PL 2630 should carefully tackle. Other groups in the region, like Derechos Digitales, have raised similar points of attention. As we further elaborate in this piece, there is a set of issues that stakeholders must consider and address before passing a new law. The most relevant are:

  • Neutralize risks of abuse of content-based regulations by dropping duty of care obligations, focusing on systemic impact assessments, and making it explicit that platform accountability doesn’t mean general monitoring and filtering of user content.
  • Ensure robust checks, balances, and due process safeguards for the application of specific rules to situations of conflict and imminent risk.
  • Carefully design and ensure adequate means to establish a proper independent, autonomous, participative, and multi-stakeholder oversight structure for the upcoming regulation.
  • Establish clear safeguards against increasing surveillance and related security risks. 
  • Refrain from giving special speech protections to government officials, who bear special responsibilities under human rights standards. 
  • Ensure sanctions in accordance with human rights standards and due process guarantees, particularly when it involves blocking online applications.

The last point refers to the administrative penalties that may apply when internet applications within the scope of the bill fail to comply with its rules. The "temporary suspension of activities" is among this list of penalties. In practice, this means that a government administrative authority would have the power to block an entire website or app. Website blocking in Brazil generally happens following a judicial order, although the Ministry of Justice has recently stated that consumer protection administrative bodies would have this authority under traditional suspension penalties set in consumer law. Human rights standards indicate that blocking entire websites and applications is an extreme measure with technical challenges, great risks of abuse, and significant impacts on fundamental rights. In 2021, the UN Human Rights Council reiterated, in a resolution, its unequivocal condemnation of the use of internet shutdowns and online censorship, including social-media shutdowns, to arbitrarily prevent or disrupt access to or the dissemination of information online. We highlighted such concerns in the context of PL 2630. And while previous versions of the bill allowed only an absolute majority of a judicial collegiate body to apply this blocking penalty, the current draft gives this power to an unspecified administrative authority. Brazilian lawmakers should acknowledge the dangers of the arbitrary use of online blocking and step back.

Additionally, the legitimate enforcement of possible sanctions is closely tied to the bill's set of rules and oversight structure. The other points of concern we mention above highlight relevant remaining gaps on this front. We elaborate on them in the next section.

From 2011 to 2023: Addressing Current Challenges by Building on Existing Principles and Safeguards

Since the 2011 Joint Declaration on Freedom of Expression and the Internet by the Special Rapporteurs for Freedom of Expression, human rights institutions have underscored that government initiatives seeking to regulate online communications should preserve and adapt to the unique characteristics of the internet, so that these initiatives are both effective and respectful of the internet features that enable fundamental rights and freedoms. Any restrictions must follow the "three-part test": they must be clearly set by law, and strictly necessary and proportionate to achieve a legitimate aim in a democratic society. Important concerns around internet fragmentation, collateral censorship, over-removal of legitimate expression, and, more recently, the inherent intricacies of content moderation at scale have led experts throughout the years to caution against content-specific regulations. The risks of arbitrary application and interpretation of content-restriction rules in nondemocratic or conflictive settings add further layers to this set of concerns.

We detail our remaining points of concern in the sections below (or read the full PDF here):

Concerning Duty of Care Obligations

Repel Rules and Interpretations That Can Lead to Content Monitoring Obligations

Robust Checks, Balances, and Due Process Safeguards for Exceptional Measures in Crisis Situations

Proper Independent and Participative Oversight Structure

Clear Safeguards Against Increasing Surveillance and Related Security Risks

Review Problematic Immunity for Public Officials

Conclusion

Any laws seeking to strengthen users' rights in the face of dominant internet applications should build on these principles and safeguards instead of ruling them out. We will not be able to offer responses to challenges arising from the constant but ever-changing interrelation between digital technologies and society if we disregard relevant settled bases, grounded in human rights standards, at each step of the way. Empowering users against dominant internet platforms' huge corporate power also involves more structural and economic measures that are largely missing from the current debate, such as fostering interoperability of social networks. We hope the concerns and principles we articulate here can contribute to the debate currently underway in Brazil.