
TikTok’s Secret Sauce Poses Challenge for U.S. Oversight, Researchers Say

WASHINGTON—Potential Chinese influence over what Americans view on TikTok is an increasing concern among U.S. policy makers—and one that could defy a simple solution.

TikTok says it has devised a plan that would allow U.S. officials to check whether its proprietary algorithm—the secret sauce of computer code that recommends videos to users—was being influenced by Beijing.  

That arrangement faces practical difficulties, according to some U.S. officials and independent researchers, who say that determining why the algorithm promotes or demotes content is a complex task that often doesn’t yield clear answers.

“It is, technically, really hard,” said Sen. Mark Warner (D., Va.), chair of the Senate Intelligence Committee. He said that U.S. overseers might be able to review the core TikTok algorithm, but that much of the app’s code would still be written in Beijing.

Chinese ownership of the hugely popular video app worries U.S. government officials, who say the Chinese government could potentially manipulate what viewers see—or access data on U.S. users.  

TikTok’s promise of transparency into its algorithm is a central part of the company’s proposal to fend off rising calls to ban the app. 

TikTok, owned by Beijing-based ByteDance Ltd., says it wouldn’t allow Chinese government meddling but has offered to give a U.S. company, Oracle Corp., the ability to access the algorithm’s code and flag issues for government inspectors—part of a $1.5 billion plan to wall off aspects of its U.S. operations.

A TikTok spokeswoman said the company’s proposal “includes layers of government and independent oversight to ensure that there are no backdoors into TikTok that could be used to access data or manipulate content on the platform.”

Social-media researchers said that even if TikTok’s overseers could keep up with constant changes to the code, they would also need detailed information about how the algorithm filtered and rated the app’s fire hose of videos to ensure no interference. 

“TikTok could open-source every line of its code and the impact that the algorithm has on what is getting shown won’t be plainly visible,” said Cameron Hickey, CEO of the National Conference on Citizenship, a nonprofit that studies how social media affects civic discourse.

Under TikTok’s proposal, Oracle would need to sign off on all of TikTok’s code before it could operate, the TikTok spokeswoman said. Oracle would also monitor whether the code operates as expected and would be empowered to access any other information needed to assess the platform’s integrity, she said.

TikTok has offered to give Oracle Corp. the ability to access its algorithm’s code and flag issues for government inspectors. Photo: Justin Sullivan/Getty Images

Political content on TikTok is one of U.S. officials’ primary national-security concerns. Mr. Warner and others say Beijing could attempt to shape what Americans see on TikTok with the goal of sowing discord, influencing public opinion or tilting an election. 

Part of the challenge in addressing those concerns is that determining why content spreads or doesn’t spread on TikTok isn’t an exact science. Some TikTok users have long suspected that the app’s algorithm disfavors content about political topics. 

Corrbette Pasko, whose eponymous account giving satirical news updates from Generation X has more than 271,000 followers, said she regularly modifies the captions or text accompanying her videos to hide political themes.

“I spell ‘abortion’ with an umlaut,” she said. 

Ahead of the U.S. midterm elections and subsequent Senate runoff in Georgia last fall, Ms. Pasko participated in an experiment with 21 other popular TikTok personalities, known as influencers. 

Each influencer posted two nearly identical videos to the TikTok app. Both videos encouraged viewers to vote, without a partisan tilt. But in one set of posts, the influencers spoke or typed in words related to voting, such as “#elections” or “#govote.” In the second set, the influencers didn’t type in any voting-related text.  

The second set of videos garnered more views in all but three cases, and more than twice as many views in total, according to Accelerate Change, the nonprofit group that organized the experiment. 


Accelerate Change works to increase civic engagement among minority groups, sometimes in line with progressive causes. It disclosed receiving less than 1% of its funding during the past five years from TikTok competitor Meta Platforms Inc. for work on separate projects.

Peter Murray, president of Accelerate Change, said the results show that TikTok is consistently suppressing videos when it can detect they are about voting. The group also tried a parallel experiment on YouTube and Instagram, but didn’t see a big difference in engagement.

“We’ve never seen a social-media property do a blanket suppression of voter content across its platform,” Mr. Murray said. “This is only happening on TikTok.” 

Some other social-media researchers said that the experiment raised questions about TikTok’s treatment of political content, but didn’t prove TikTok intentionally suppressed voting-related videos. 

The results could be explained by subtler factors, they said, such as the number of words appearing on screen or the characteristics of the 22 influencers and their audiences, which collectively represent a tiny slice of TikTok’s massive platform. 

Josephine Lukito, an assistant professor at the University of Texas at Austin who has studied online disinformation, said U.S. officials would face similar challenges trying to analyze the algorithm.

“Even if Oracle did a review and found some biases,” she said, “it may be difficult to attribute it to intentional manipulation.” 


The TikTok spokeswoman said the experiment doesn’t support Accelerate Change’s conclusions, noting that the group didn’t test whether a similar effect would occur with videos unrelated to voting. She said that TikTok doesn’t categorize or recommend videos based on any subject, including politics. Rather, the recommendations are based on users’ actions, she said. 

To communicate more with users about why the app might not share their videos, TikTok said last week that it plans to test a section of the app where users can see if TikTok deemed their video “ineligible for recommendation.” 

TikTok’s own executives have sometimes struggled to understand the algorithm’s performance. In the summer of 2020, some U.S.-based TikTok executives noticed that views of certain creators were dropping mysteriously—only to find later that a team in China had made changes to the algorithm, The Wall Street Journal has reported, citing people familiar with the matter. 

Some of the posts involved calls for people to vote, the people said. 

John D. McKinnon contributed to this article.

Write to Ryan Tracy at [email protected] and Georgia Wells at [email protected]

Copyright ©2022 Dow Jones & Company, Inc. All Rights Reserved.
