Google has published its latest TAG Bulletin report, which provides an overview of all of the coordinated influence operations that its team detected and shut down across its apps in Q1 2022.
And for the most part, it looks pretty straightforward – 3 YouTube channels shut down in relation to efforts to criticize former Sudanese president Omar al-Bashir, an AdSense account terminated in relation to coordinated influence operations in Turkey, and 42 YouTube channels and 2 Ads accounts terminated as part of an investigation into coordinated influence operations linked to Iraq.
But then there’s this:
“We terminated 4361 YouTube channels as part of our ongoing investigation into coordinated influence operations linked to China. These channels mostly uploaded spammy content in Chinese about music, entertainment, and lifestyle. A very small subset uploaded content in Chinese and English about China and US foreign affairs. These findings are consistent with our previous reports.”
That seems like a lot, right? 4,361 YouTube channels – not just individual videos – in a single month, is a lot of content.
But as YouTube notes, that’s actually in line with previous TAG reports.
Going back over its most recent TAG updates, Google removed:
- 5,460 YouTube channels in December, also as part of coordinated influence operations linked to China
- 15,368 Chinese YouTube channels in November
- 3,311 YouTube channels in October
- 1,217 in September
- 1,196 in August
- 850 in July
All of these are connected to the same investigation into coordinated influence operations linked to China, and all of them have the same description as the one above, relating to “spammy content” around entertainment, with some notes on US/China affairs. That’s over 31,000 YouTube channels removed over the last seven months.
So what’s going on? What, exactly, are these Chinese influence operations looking to achieve, and are they gaining any traction through this broad-reaching YouTube push?
It seems, based on Google’s description, that the main purpose of this effort is to first build an audience in the app with engaging, light content, before then using that reach to sprinkle in some pro-China sentiment, in order to seed it among broader audiences.
That then enables the CCP, and/or related groups, to potentially sway public opinion through subtle means, by gently nudging these viewers towards a more positive view of China’s activities.
That’s generally been China’s MO with its information operations – on Q&A platform Quora, for example, there are many instances of people asking questions about China, only to see glowingly positive replies from users.
It seems that’s the modus operandi here too, with Chinese-originated groups seeking to build audiences on YouTube to then establish distribution and dissemination chains for pro-China propaganda.
But it’s certainly a significant push, and it’s interesting to note just how much China is ramping up its activity over time. That likely suggests that it sees YouTube as a powerful vector for influence, which further underlines the importance of social platforms taking proactive, definitive steps to stop such programs before they can gain traction.
We’ve asked Google for more info on the specifics of these banned channels, and we’ll update this post if and when we hear back.
You can read Google’s latest TAG report here.