Canada has followed the US and EU in allowing TikTok to operate, subject to data security conditions. The trouble is, these measures don’t appear to protect society from the platform’s more subtle propaganda and influence risks. 

canada.ca/en/innovation…

Background: After an initial review in 2024, the Liberal government said TikTok’s offices in Canada were “injurious to national security” and ordered them to close, although Canadians could still use the app. I’m told that was the most the government was able to do, and it was meant as a signal to the public that experts thought the platform was dangerous. TikTok brought a legal challenge to the order and kept its Toronto office open, and in January, a federal court overturned the shutdown. globalnews.ca/news/1172…

As UOttawa’s Michael Geist has explained, the government had no choice but to change tack once the Trump administration cut a deal with ByteDance that left Canada stuck with “a corporate ban that created real harms with no discernible benefit and a Canadian TikTok app that would ultimately offer fewer safeguards than the U.S. equivalent.” Then, after PM Carney visited China, the government agreed to set aside its shutdown order and negotiated new terms with TikTok.

michaelgeist.ca/2026/01…

But the new compromise still seems to leave a hole in our defences against foreign influence. One reason is that current propaganda-monitoring tools focus on disinformation and ideological messaging. New research by Michael Morgan suggests that while TikTok may not tell you what to think or believe, it may condition how you FEEL before you think: users’ emotional orientation shifts measurably on the platform even when their declared political beliefs stay stable.

That could serve adversaries’ propaganda objectives by weakening and dividing society. Seed enough outrage, grief, or moral indignation through algorithmically amplified content and the audience is pre-conditioned for subsequent persuasion. No false claims, no coordinated messaging, just intensified sentiments.

To function effectively in a liberal democracy, political deliberation needs informed citizens who encounter and evaluate events, policies and statements on relatively neutral ground. Morgan’s research suggests that even before that happens, people are emotionally sorted by their feeds into communities of shared outrage or sympathy they didn’t consciously choose. Seems to me most social media do that, but TikTok’s architecture makes it particularly potent.

At scale, that could make populations angrier, more morally certain, and less capable of tolerating the ambiguity and compromise democratic governance requires. Shifting that affective terrain of emotions, attitudes and values could deepen polarization, making open societies harder to govern and free nations less able to resist authoritarian rivals.

Bottom line: Canada and other governments have focused on regulating data security while leaving vectors for influence unimpeded. Ensuring that user data isn’t being read by Chinese employees does nothing to stop TikTok’s recommendation engine from cultivating emotional orientations that shape how users interpret political events. That’s a potential threat that merits rigorous monitoring and analysis. What should sensible citizens do in the meantime? Don't use TikTok, for starters. More in the comments.

Mar 10 at 4:53 PM