On Wednesday, TikTok published a transparency report in which it stated that it had removed nearly 62 million videos from its platform for violating its guidelines, and more than seven million accounts of users suspected of being under age 13 during the first three months of the year.
The videos accounted for less than 1 percent of the total posted on the platform and fell under categories such as "adult nudity and sexual activities," "harassment and bullying" and "hateful behavior," the company said in the report, which was released on its website.
About 8.5 million of the removals were from the United States, the company added.
The company has published transparency reports since 2019, after its platform, which is massively popular among teenagers, came under scrutiny for content- and privacy-related issues that have led some countries to ban the app.
TikTok, which has been beefing up its security and privacy features to retain users, opened a content moderation center at its Los Angeles office last year to boost transparency.
In its first disclosure on underage users, TikTok said it uses a variety of methods, including a safety moderation team that monitors accounts suspected of misrepresenting their age.
Those age 12 or younger are directed to “TikTok for Younger Users” in the United States.
TikTok, owned by China-based ByteDance, is believed to have about one billion users worldwide, including more than 100 million in the United States.
Last month, the Biden administration reversed orders from former President Donald Trump that would have banned TikTok or forced its sale to American investors.
The report comes with social media operators facing increased pressure to remove abusive and hateful content while remaining open to a variety of viewpoints.
TikTok’s transparency report said that, in addition to the suspected underage accounts, nearly four million other accounts were deleted for violating the app’s guidelines.
“Our TikTok team of policy, operations, safety, and security experts work together to develop equitable policies that can be consistently enforced,” the report said.
“Our policies do take into account a diverse range of feedback we gather from external experts in digital safety and human rights, and we are mindful of the local cultures in the markets we serve.”
TikTok said its automated systems detect and remove the vast majority of offending content: “We identified and removed 91.3 percent before a user reported them, 81.8 percent before they received any views, and 93.1 percent within 24 hours of being posted.”
Overall, fewer than 1 percent of the videos uploaded to TikTok were taken down for violations.