
TikTok failed to stop most misleading political ads in test run by researchers


TikTok failed to catch 90 percent of ads featuring false and misleading messages about elections, while YouTube and Facebook identified and blocked most of them, according to an experiment run by misinformation researchers, the results of which were released on Friday.

The test, run by the watchdog group Global Witness and the Cybersecurity for Democracy team at the New York University Tandon School of Engineering, used dummy accounts to submit 10 ads in English and 10 in Spanish to the social media services.

The researchers did not declare the ads as political and did not go through an identity verification process. They deleted any ads that were accepted before they could be published.

Each ad, which included details like an incorrect election date or information designed to delegitimize the voting process, violated policies established by Facebook’s parent company, Meta; YouTube’s owner, Google; and TikTok, the researchers said.

In one ad, researchers wrote: “Already voted in the primary? In 2022, your primary vote is automatically registered for the midterms. You can stay home.”

TikTok, which banned political advertising in 2019, rejected only one ad in English and one in Spanish, in what the researchers called “a major failure.”

The company said in a statement that it is “a place for authentic and entertaining content.” “We value feedback from NGOs, academics and other experts which helps us continually strengthen our processes and policies,” TikTok added.

Researchers found that they were “easily able to bypass” some safeguards that Facebook has in place to prevent people outside the United States from posting political ads. In one test involving a dummy account in Britain, Facebook approved three of the false or misleading ads in English and two of those in Spanish. An account in the United States got two of its English-language ads past Facebook, along with five of its Spanish-language ones.

“We invest significant resources to protect elections” and will continue to do so, Meta said in a statement.

Within a day, YouTube caught half of the ads that the researchers tried to post from a British dummy account; over the next few days, it rejected the rest and banned the account.

Google said in a statement that it would “continue to invest in and improve our enforcement system” to protect users from abuse, particularly before major elections.

The researchers said the experiment showed that the social media companies needed to improve their moderation.

“Our findings are a stark reminder that ‘where there’s a will, there’s a way,’” the researchers wrote.

 

SOURCE: NEWS AGENCIES
