Facebook Ignores Brazil Election Misinformation Ads


Facebook failed to detect blatant election-related disinformation in ads ahead of Brazil's 2022 election, according to a recent analysis from Global Witness, continuing what the organization calls an "alarming" pattern of failing to catch content that violates its own standards.

The ads promoted the wrong election date, described incorrect voting methods, and cast doubt on the fairness of the vote, including Brazil's electronic voting system, according to The Associated Press, which obtained the advertisements.

The London-based group has now tested Meta's capacity to catch flagrant breaches of the rules governing its most popular social media platform four times, and Facebook has failed every time. In the three earlier cases, Global Witness checked whether Facebook's controls, whether human reviewers or artificial intelligence, would catch ads containing violent hate speech. They did not.

“Facebook has identified Brazil as one of its priority countries where it’s investing special resources specifically to tackle election-related disinformation,” said Jon Lloyd, senior advisor at Global Witness. “So we wanted to really test out their systems with enough time for them to act. And with the US midterms around the corner, Meta simply has to get this right — and right now.”

On October 2, Brazil will hold its national elections amid high levels of unrest and misinformation that threaten to undermine the electoral process. Facebook is the most widely used social media network in the country. In a statement, Meta said it had "exhaustively prepared for the Brazilian election in 2022."

“We’ve launched tools that promote reliable information and label election-related posts, established a direct channel for the Superior Electoral Court (Brazil’s electoral authority) to send us potentially-harmful content for review, and continue closely collaborating with Brazilian authorities and researchers,” the company said.

In 2020, Facebook began requiring advertisers who wish to run ads about elections or politics to complete an authorization process and include "paid for by" disclaimers on them, similar to what it does in the US. The increased safeguards follow the 2016 US presidential election, when Russia paid in rubles for political ads designed to stoke division and unrest among Americans.

According to Global Witness, the test ads it submitted violated these guidelines: each was approved for publication, though the group never actually ran them. The fact that the organization submitted the advertisements from Nairobi and London rather than from Brazil should also have set off alarm bells.

Additionally, the advertisements were not required to include a "paid for by" disclaimer and were not paid for using a Brazilian payment method, both measures Facebook claims to have in place to stop malicious actors from abusing its platform to meddle in elections around the world.

“What’s quite clear from the results of this investigation and others is that their content moderation capabilities and the integrity systems that they deploy in order to mitigate some of the risk during election periods, it’s just not working,” Lloyd said.

The group tested ads rather than regular posts because Meta claims to hold advertisements to an "even stricter" standard than regular, unpaid posts, according to its help center page for paid advertising.

But judging from the four investigations, Lloyd said that’s not actually clear.

"We are constantly having to take Facebook at their word. And without a verified independent third party audit, we just can't hold Meta or any other tech company accountable for what they say they're doing," he said.

Global Witness submitted ten ads to Meta that obviously violated its policies on election-related advertising. They included false information about when and where to vote, for instance, and called into question the integrity of Brazil's voting machines, echoing disinformation used by malicious actors to destabilize democracies around the world.

In another study, carried out by the Federal University of Rio de Janeiro, researchers identified more than two dozen ads on Facebook and Instagram during the month of July that promoted misleading information about, or attacked, the country's electronic voting machines.

The university's internet and social media research lab, NetLab, which also participated in the Global Witness study, found that many of those ads had been financed by candidates running for seats in federal or state legislatures.

This will be Brazil’s first election since far-right President Jair Bolsonaro, who is seeking reelection, came to power. Bolsonaro has repeatedly attacked the integrity of the country’s electronic voting system.

“Disinformation featured heavily in its 2018 election, and this year’s election is already marred by reports of widespread disinformation, spread from the very top: Bolsonaro is already seeding doubt about the legitimacy of the election result, leading to fears of a United States-inspired January 6 ‘stop the steal’ style coup attempt,” Global Witness said.

In its earlier investigations, the group discovered that Facebook failed to stop hate speech in Myanmar, where ads used a slur to refer to people of East Indian or Muslim descent and called for their deaths; in Ethiopia, where the ads used dehumanizing hate speech to call for the murder of people belonging to each of Ethiopia’s three main ethnic groups; and in Kenya, where the ads discussed beheadings, rape, and bloodshed.

Source: https://english.aawsat.com



Fatal error: Uncaught TypeError: fclose(): Argument #1 ($stream) must be of type resource, bool given in /home/fikrikadim/public_html/wp-content/plugins/wp-super-cache/wp-cache-phase2.php:2386 Stack trace: #0 /home/fikrikadim/public_html/wp-content/plugins/wp-super-cache/wp-cache-phase2.php(2386): fclose(false) #1 /home/fikrikadim/public_html/wp-content/plugins/wp-super-cache/wp-cache-phase2.php(2146): wp_cache_get_ob('<!DOCTYPE html>...') #2 [internal function]: wp_cache_ob_callback('<!DOCTYPE html>...', 9) #3 /home/fikrikadim/public_html/wp-includes/functions.php(5420): ob_end_flush() #4 /home/fikrikadim/public_html/wp-includes/class-wp-hook.php(324): wp_ob_end_flush_all('') #5 /home/fikrikadim/public_html/wp-includes/class-wp-hook.php(348): WP_Hook->apply_filters('', Array) #6 /home/fikrikadim/public_html/wp-includes/plugin.php(517): WP_Hook->do_action(Array) #7 /home/fikrikadim/public_html/wp-includes/load.php(1270): do_action('shutdown') #8 [internal function]: shutdown_action_hook() #9 {main} thrown in /home/fikrikadim/public_html/wp-content/plugins/wp-super-cache/wp-cache-phase2.php on line 2386