Solutions to TikTok's American Credibility Problem
This post was originally published on Interconnected.blog in both English and Chinese on June 26, 2022.
Last week, I wrote a detailed post breaking down TikTok’s immense credibility challenges in America, as revealed by BuzzFeed’s reporting based on more than 80 leaked internal meeting recordings.
In today’s post, I propose some viable solutions – some technical, some organizational – to each of TikTok’s four main credibility problems.
I conclude today’s post with some additional thoughts on TikTok’s own leverage in the US, which complicates how the app could be regulated under the Biden Administration. If you have not read last week's post yet, I highly suggest you do that first to build some context around each of these problems.
“Whistleblower” Problem: RBAC Bifurcation
With a full-blown “whistleblower” problem on ByteDance’s hands, there are only two ways for this problem to end – either ByteDance leadership earns the trust of its US employees, or the leaking continues until there’s nothing left worth whistleblowing.
Obviously, from ByteDance’s perspective, it would prefer the first outcome. But how can it rebuild that elusive internal trust?
I propose a technical solution: RBAC bifurcation.
RBAC stands for Role-Based Access Control. It is a set of standard operational policies that describes and enforces what type of employee can access what data or information. Are you a junior-level software engineer? You can access this codebase but not others. Are you a manager of a data analytics team? You can access these tables in your data warehouse to do analytics but not other transactional databases. Are you a senior-level site reliability engineer? You can access these logs of user traffic but not the data warehouse that the manager of that data analytics team has access to.
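The role-to-resource mapping described above can be sketched in a few lines of code. This is a minimal illustration, assuming a simple policy table; the role names and resources are hypothetical, not ByteDance’s actual policies:

```python
# Minimal sketch of role-based access control (RBAC): each role maps to the
# set of resources its policy explicitly grants. Anything not granted is denied.
POLICIES = {
    "junior_engineer": {"app_codebase"},
    "analytics_manager": {"warehouse_tables"},
    "senior_sre": {"traffic_logs"},
}

def can_access(role, resource):
    """Return True only if the role's policy explicitly grants the resource."""
    return resource in POLICIES.get(role, set())

# The junior engineer can access the codebase, but not the data warehouse.
assert can_access("junior_engineer", "app_codebase")
assert not can_access("junior_engineer", "warehouse_tables")
# The SRE sees traffic logs, but not the warehouse the analytics manager uses.
assert can_access("senior_sre", "traffic_logs")
assert not can_access("senior_sre", "warehouse_tables")
```

The key property is deny-by-default: access exists only where a policy explicitly grants it, which is what makes the policy set auditable in the first place.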
Having a set of RBAC policies that constantly evolves as a company grows is industry best practice. I’m sure ByteDance’s vast global operation already has some level of RBAC in place. In order to rebuild internal trust, ByteDance needs to take its current RBAC a step further – completely bifurcate TikTok’s US operation’s RBAC implementation from ByteDance China.
Replicating then bifurcating RBAC between a China operation and a US operation is no easy feat, but it is very doable and a step that ByteDance can no longer avoid or delay. The bifurcated RBAC policies should also be made available for US regulators to audit. I advocated for this approach two years ago in a post called “Can ByteDance Build Trust”. Back then, I went as far as suggesting that ByteDance’s internal RBAC policies should be open sourced too! Open sourcing would do wonders for TikTok in its quest to build public trust, but that is much easier said than done (just look at Twitter’s journey to open source its algorithms – a GitHub repo that is now deleted). At the very least, ByteDance’s RBAC policies should be made auditable.
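What “bifurcation” means in practice can be sketched as two fully separate policy stores, one per operation, with cross-boundary access denied by default. This is a hypothetical illustration, not ByteDance’s actual architecture:

```python
# Sketch of RBAC bifurcation: each region keeps its own policy store, and a
# principal in one region gets no implicit access to the other region's
# resources. All role and resource names here are hypothetical.
REGION_POLICIES = {
    "US": {"us_engineer": {"us_codebase", "us_traffic_logs"}},
    "CN": {"cn_engineer": {"cn_codebase", "cn_traffic_logs"}},
}

def can_access(principal_region, role, resource_region, resource):
    # Deny by default across the bifurcation boundary: a role defined in one
    # region's policy store never grants resources homed in the other region.
    if principal_region != resource_region:
        return False
    policies = REGION_POLICIES.get(resource_region, {})
    return resource in policies.get(role, set())

assert can_access("US", "us_engineer", "US", "us_codebase")
# A China-based role cannot reach a US resource through normal RBAC at all --
# any such access would have to go through an explicit, auditable exception.
assert not can_access("CN", "cn_engineer", "US", "us_codebase")
```

Because the two policy stores share nothing, an auditor can verify the US side in isolation – which is exactly what makes the bifurcated setup credible to regulators.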
Without audited RBAC policies to back up TikTok’s public promises of protecting US user data, no amount of well-meaning, well-crafted internal emails or all-hands meetings from ByteDance leadership can rebuild trust. Without that trust, no matter how quickly TikTok can hire people in the US, many of these employees will continue to churn, whistleblow, or do both.
“Master Admin” Problem: Empower Trust and Safety Team
Related to the RBAC bifurcation approach, the way to solve TikTok’s “Master Admin” problem lies in empowering the already-created but relatively-powerless US Trust and Safety team. This solution begins with giving this team the same “master admin” power and access privilege, then separating it from the China team’s “master admin”. In this organizational setup, whenever a “master admin” based in China wants to access something – anything – in TikTok US, this person must gain permission from the US Trust and Safety team’s “master admin”.
As I explained in last week’s post, there is nothing wrong with having “master admin” as a function or role. The problem with ByteDance’s current organizational setup is that this role sits in China and only in China. If this power is split between the US and China, where each side’s “master admin” is treated equally and serves as the “gatekeeper” of its own respective operation’s data, codebase, and all user information, then suspicious requests can be reviewed and rejected, while legitimate requests can still be granted.
There may very well be good reasons why a ByteDance China engineering team needs access to certain parts of TikTok US. The legitimacy of these requests can and should be determined by the US Trust and Safety team, in concert with its US leadership team, not China’s. Plain and simple. A reciprocal process should also be applied to any TikTok US engineering team that wants to access some parts of ByteDance China as well.
It is only fair.
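The gatekeeper flow described above can be sketched as a simple approval workflow, where only the resource region’s own “master admin” can approve a cross-region request, and every decision is logged for audit. All names and fields here are hypothetical illustrations:

```python
# Sketch of the cross-region "gatekeeper" flow: a request from one side to
# access the other side's resources must be reviewed by the resource side's
# own master admin, and every decision lands in an audit log.
from dataclasses import dataclass, field

@dataclass
class AccessRequest:
    requester_region: str   # e.g. "CN"
    resource_region: str    # e.g. "US"
    resource: str
    justification: str
    approved: bool = False

@dataclass
class MasterAdmin:
    region: str
    audit_log: list = field(default_factory=list)

    def review(self, req, legitimate):
        # Only the gatekeeper of the resource's own region may decide.
        if req.resource_region != self.region:
            raise PermissionError("wrong gatekeeper for this resource")
        req.approved = legitimate
        self.audit_log.append((req.requester_region, req.resource, legitimate))
        return req.approved

# A China-based team requests access to a US resource; the US master admin
# reviews it and, in this example, rejects it. The rejection is logged.
req = AccessRequest("CN", "US", "us_user_table", "debug a ranking bug")
us_admin = MasterAdmin("US")
us_admin.review(req, legitimate=False)
```

The audit log is the important part: even approved requests leave a trail that the Trust and Safety team, and potentially regulators, can review after the fact.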
“Protected Data” Problem: Regulators Publish Definitions and Reasoning
While both the “whistleblower” and “master admin” problems are issues that ByteDance, and ByteDance alone, can tackle, resolving the “protected data” problem is more in the hands of the US regulators.
It is incumbent upon these regulators to not water down how TikTok (and in effect, all Chinese tech companies) get regulated, especially the definition of “protected data”. When the “protected data” definitions are determined, these definitions should go through a public comment period (typical of any new Federal regulation), then published with the reasoning behind what type of data is and is not considered “protected” clearly explained.
As BuzzFeed’s reporting revealed, this definition is still “being negotiated”. Meanwhile, data types like UID (unique identifier) are still accessible. (Please see my post last week, where I provide a technical primer on UID and why it could lead to user data access.) Whether UID should or should not be considered “protected data” in TikTok’s case requires knowledge of how TikTok’s internal database system is designed, architected, and operationalized, which I, of course, do not have. But regardless of how UID (or any other data type) is treated, I, as an American citizen (and occasional TikTok user), deserve to know why.
This type of disclosure is not new or unprecedented. One prior art to reference is how personal healthcare data is regulated, protected, defined, and publicly explained under HIPAA (Health Insurance Portability and Accountability Act). In this regulatory scheme, there is a concept called PII (personally identifiable information) that encompasses a sub-concept called PHI (protected health information), which together explain what kind of personal data and information is considered “protected” in the American healthcare system.
What counts as PII or PHI is easily Google-able. The same clarity and transparency should exist when regulating TikTok.
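In the spirit of HIPAA’s published definitions, a “protected data” determination could even be shipped as a machine-readable classification, with the reasoning attached to each field. The field names and classifications below are hypothetical illustrations, not an actual regulatory determination:

```python
# Sketch of a published, machine-readable "protected data" definition: each
# protected field carries the regulator's reasoning, so the public can see
# not just what is protected but why. All entries are hypothetical.
PROTECTED_FIELDS = {
    "uid": "unique identifier that can be joined across tables to re-identify a user",
    "phone_number": "direct contact information",
    "birthday": "quasi-identifier under most PII definitions",
}

def is_protected(field_name):
    return field_name in PROTECTED_FIELDS

def explain(field_name):
    # The published reasoning this post argues regulators owe the public.
    return PROTECTED_FIELDS.get(field_name, "not classified as protected")

assert is_protected("uid")
assert not is_protected("video_id")
```

The exact list is for regulators to decide; the point is that whatever they decide should be inspectable in this kind of explicit, explained form.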
“Oracle” Problem: Flow Log Audit
Interestingly, the “Oracle” problem can be solved by one of Oracle’s own products – VCN (Virtual Cloud Network) flow log. This product just became generally available in Oracle Cloud in early 2021.
A flow log is a common tool, not specific to Oracle Cloud. Every cloud platform, from AWS to AliCloud, has a similar tool. A flow log lets you view every connection that goes in and out of your cloud infrastructure, i.e. a log of the “flow” of your entire web traffic, and Oracle’s own product announcement of its VCN flow log clearly states the purposes this feature serves.
Currently, Oracle Cloud is giving TikTok too much flexibility in how it uses its cloud infrastructure, offering bare metal machines while TikTok designs its own software layer on top of these machines – hence its “Oracle” problem. In order to verify whether TikTok is complying with US regulations on Oracle Cloud, Oracle should make its TikTok-related flow logs available for auditing.
With flow log audit, there will be no need to guess whether TikTok allows access from China or not – every traffic request from anywhere will be plainly visible in TikTok’s flow log. In another post from two years ago, I specifically called out using flow log as part of a framework to “(dis)trust and verify” TikTok, back when TikTok was using a combination of AWS, GCP, and its own Virginia-based data centers as its cloud. The only difference now is that we know which cloud will have TikTok’s flow log going forward – Oracle Cloud.
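A flow log audit can be sketched as a simple scan: flag every connection whose source falls outside an allowlist of expected networks. The record format and CIDR ranges below are simplified placeholders, not Oracle’s actual VCN flow log schema:

```python
# Sketch of a flow log audit: scan connection records and flag any source IP
# outside an allowlist of expected networks. Hypothetical format and ranges.
import ipaddress

# Hypothetical allowlist: only these networks are expected to reach this VCN.
ALLOWED_NETWORKS = [
    ipaddress.ip_network("10.0.0.0/8"),       # internal US infrastructure
    ipaddress.ip_network("203.0.113.0/24"),   # approved corporate egress
]

def audit(flow_records):
    """Return every record whose source IP falls outside the allowlist."""
    flagged = []
    for rec in flow_records:
        src = ipaddress.ip_address(rec["src_ip"])
        if not any(src in net for net in ALLOWED_NETWORKS):
            flagged.append(rec)
    return flagged

records = [
    {"src_ip": "10.1.2.3", "dst_port": 443, "action": "ACCEPT"},
    {"src_ip": "198.51.100.7", "dst_port": 22, "action": "ACCEPT"},
]
# The second record comes from an unexpected network and would be flagged
# for an auditor to investigate.
```

This is the sense in which no guessing is needed: every connection either matches an expected network or shows up on the flagged list.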
TikTok’s Leverage
TikTok’s American credibility problems may be difficult, but they are by no means insurmountable, if ByteDance, Oracle, and the US government all act responsibly and in good faith. As an entrepreneur and operator, I don’t just like to talk about problems, but also possible solutions, and I hope I presented a few viable options here for all relevant parties to consider.
Whether it is RBAC or flow log, longtime readers of Interconnected will notice that many of these solutions are similar to the ones I wrote about in mid-2020. That’s because TikTok, despite its many promises to improve, has not changed. What has changed is that President Biden, not Trump, is now in charge.
Is that a good thing for regulating TikTok? I’m honestly not sure.
A recent issue of POLITICO's West Wing Playbook revealed that the Democratic Party has started its own official TikTok account, in order to appeal to young voters between the ages of 18 and 34, whose approval rating of Biden is low. Two rising stars of the Democratic Party, Stacey Abrams and Jon Ossoff, are both active on TikTok with large followings. During the early days of the Russia-Ukraine war, the Biden White House specifically gathered a group of 30 TikTok influencers – not YouTube, not Instagram – to brief them on the administration’s core message, so these influencers could help spread the word.
Like it or not, TikTok’s traction among American youth is strong, thus it has real leverage. As the political party that historically relies more on youth votes to win, the Democrats need TikTok, which puts the Biden administration in a tricky spot of having to both regulate TikTok and use it too!
Does this dynamic automatically mean the Biden administration will do a poor job of regulating TikTok?
No. But it does make an already complicated problem even more complicated.