Presented in early May by Senator Alessandro Vieira (CIDADANIA / SE), bill 2630 proposes the establishment of the Brazilian Law on Freedom, Responsibility and Transparency on the Internet, one of whose main objectives is to combat the mass distribution of disinformation.
According to the initial text, national and international social networking platforms with an audience greater than two million users would need to comply with new practices to prevent the dissemination of false news, in addition to publishing weekly information such as the number of profiles deleted and the justification for each removal. Companies that break these rules could face warnings, fines, or even a ban on operating the service.
The bill, however, has drawn criticism from organizations tied to the web environment, both for suggesting changes to the Marco Civil da Internet and because the lack of clear, detailed criteria for what counts as disinformation could lead platforms to delete legitimate posts to avoid the risk of breaking the law.
This Monday (1st), the Mixed Parliamentary Front for the Digital Economy and Citizenship, also called the Digital Front, held an online event to debate the transparency and accountability obligations of intermediaries should the bill, due to be voted on by Tuesday (2nd), be approved.
Haste can be the enemy of good intentions
Mediated by journalist Pedro Dória, the panel brought together Senator Alessandro Vieira (CIDADANIA / SE), author of PL 2630; federal deputy Vinicius Poit (NOVO / SP); Bia Barbosa (Coalizão Direitos na Rede); Laura Moraes (Avaaz); and Monica Rosina (Facebook).
As the bill's author, Vieira says the measure was proposed to bring more transparency to the processes that classify content as false and to provide a possibility of defense. "There is already an active moderation policy on the platforms, but it is unilateral and does not guarantee prior notification or the right of defense."
For Bia Barbosa, of Coalizão Direitos na Rede, the platforms' responsibility for checking and possibly removing content needs to revolve around three central issues:
- Ensuring clear and conspicuous mechanisms for receiving reports of misinformation: the interface should make it obvious to users that they can file a report, rather than "hiding" the option and making the classification work harder;
- Ensuring that the person whose content was reported has the right to contest the categorization;
- And guaranteeing transparency in the complaint process.
While agreeing that the way social networking platforms manage disinformation needs adjustment, Barbosa believes that approving the bill in its current state could harm the discussion.
"We are very concerned that tomorrow we will vote on a text we do not know (the bill is being revised by its rapporteur, Senator Angelo Coronel (PSD / BA)), and we do not know whether these contributions will be accepted by the rapporteur," the panelist explains. "It is a very serious problem from the point of view of suppressing democratic debate."
The discussion needs to be longer, but it also has to be efficient
Vinicius Poit (NOVO), a federal deputy for São Paulo, also argues that the debate over the law should be longer and deeper than the current agenda allows. "Even though some countries, such as Germany and France, are still reviewing their codes of conduct (for social platforms), I think all the effort toward dialogue and building something with a bit of everyone's opinion is valid at this moment."
Monica Rosina, from Facebook, said the topic is treated as highly relevant within the company. "Facebook recognizes the problem of fake news as one of its most important internal issues, and has been working to prevent abuse."
Rosina reported that, between January and March this year, the platform eliminated 1.7 billion fake accounts. Regarding the spread of misinformation, she explained that the company acts on two pillars: removal, when the content violates the community rules established by the company, and reduction, in which the system cuts the reach of content classified as false by up to 80%, in addition to flagging it in the post itself.
"Facebook does not want to position itself as the owner of the truth, so it reduces the reach but does not remove the content," she explains.
Laura Moraes, from Avaaz, believes that social media companies also need to do more educational work to raise public awareness of the topic. "We welcome the progress made in creating these mechanisms, but it is still not fast or efficient enough."
Moraes reinforces the need to pass a law on the matter, so that the issue is structured cohesively across society.
"Even the European Union's code of conduct, made in partnership with the platforms, is being revised. But it is not because Europe has not finished (its legislation) that we cannot act. We can't wait five years (the time it took to draw up the Marco Civil da Internet) to see what happens."