TikTok, the AEC, and the problem no one’s talking about
The AEC's move to join TikTok acknowledges the platform’s influence, but it overlooks the deeper risk of algorithmic manipulation and covert political interference.
Last week, I was quoted in The Australian, discussing the Australian Electoral Commission’s (AEC) decision to join TikTok. On balance, I’m not against the move—but it does raise serious questions. While the AEC’s presence on the platform may help counter misinformation, it also legitimizes a social media ecosystem that is ultimately subject to the influence of an authoritarian government.
The AEC’s decision is, whether explicitly stated or not, an acknowledgment of TikTok’s immense political influence. Millions of Australians use the platform not just for entertainment but as a primary source of news and political discourse. Given this reality, the AEC’s presence on TikTok makes a certain kind of sense—they want to be where the voters are.
The AEC has sought to mitigate security concerns by, among other measures, restricting TikTok access to a dedicated device. These steps are a sensible way to manage the risk of data exfiltration. But focusing solely on cybersecurity misses the bigger issue: TikTok isn't just a potential conduit for data collection; it is a tool for content manipulation.
As I've argued ad nauseam for years now, including to the Senate Select Committee on Foreign Interference Through Social Media in 2023, the greater concern isn't what TikTok takes but what it gives. The platform's algorithm could subtly skew video recommendations in ways that align with the Chinese Communist Party's (CCP) strategic interests, a threat that grows as more people rely on TikTok for news and information. The CCP's ability to curate, suppress, or amplify content behind the scenes remains a vastly under-discussed risk.
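To make the mechanism concrete, here is a deliberately simplified sketch of how a single re-ranking step could demote a topic without leaving any visible trace. To be clear, this is purely illustrative: every name, score, and weight below is invented, and none of it reflects TikTok's actual systems.

```python
# Hypothetical sketch of a biased ranking layer. All data and weights are
# invented for illustration; this is not TikTok's code or architecture.

def rank_feed(videos, topic_weights):
    """Order candidate videos by engagement score, nudged by topic weight.

    A multiplier of 1.0 leaves a topic untouched; values below 1.0 quietly
    demote it and values above 1.0 quietly promote it. Each individual
    recommendation still looks plausible, which is what makes the skew so
    hard for any single user to perceive.
    """
    return sorted(
        videos,
        key=lambda v: v["engagement_score"] * topic_weights.get(v["topic"], 1.0),
        reverse=True,
    )

candidates = [
    {"id": 1, "topic": "dance",            "engagement_score": 0.90},
    {"id": 2, "topic": "sensitive_policy", "engagement_score": 0.95},
    {"id": 3, "topic": "cooking",          "engagement_score": 0.80},
]

# Neutral weights: the highest-engagement video leads the feed.
print([v["id"] for v in rank_feed(candidates, {})])                         # [2, 1, 3]

# A quiet 30% demotion of one topic reorders the feed with no visible trace.
print([v["id"] for v in rank_feed(candidates, {"sensitive_policy": 0.7})])  # [1, 3, 2]
```

The point of the toy example is that the second feed contains exactly the same videos as the first; nothing is deleted, nothing is flagged, and no individual user would have grounds to notice anything. Only the ordering changes, and at the scale of millions of feeds, ordering is influence.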
The AEC’s decision to join TikTok follows a broader trend: many Australian politicians have also embraced the platform. And each time a politician is asked about their decision to use TikTok, they rely on the same justification the AEC has now adopted—they stress that they use the app on a separate device, with one MP even admitting to using a special “red phone” for TikTok.
But these precautions, along with the rule that TikTok cannot be installed on government phones, are little more than a fig leaf. Such measures may be prudent, but they primarily serve to signal that something is being done without addressing the far more consequential risk of algorithmic content manipulation. These politicians acknowledge the security concerns by isolating TikTok on separate devices, yet they conveniently ignore the broader, more abstract problem: the platform's ability to shape political discourse in ways users may never perceive.
This is why I’m wary of framing the TikTok debate as a matter of personal responsibility. At the individual level, it’s difficult to grasp the risk of subtle algorithmic influence. But when you look at TikTok’s reach at scale—millions of Australians consuming news and engaging in political discussion on the platform—the deeper issue becomes clear. The real concern isn’t just who collects the data; it’s who controls the narrative.
The AEC’s move to join TikTok may be understandable from a voter engagement perspective, but it’s crucial to recognize that the security concerns extend far beyond protecting government devices. A more serious conversation is needed about how platforms like TikTok can quietly shape public opinion in ways that we may never fully see.