Today we are disclosing 32,242 accounts to our archive of state-linked information operations, the only one of its kind in the industry. The account sets we're publishing to the archive today cover three distinct operations that we have attributed to the People's Republic of China (PRC), Russia, and Turkey, respectively. Every account and piece of content associated with these operations has been permanently removed from the service. In addition, we have shared relevant data from this disclosure with two leading research partners: the Australian Strategic Policy Institute (ASPI) and the Stanford Internet Observatory (SIO).
In all instances, accounts were suspended for various violations of our platform manipulation policies.
People’s Republic of China (PRC)
While this network is new, the technical links we used to identify the activity and attribute it to the PRC are consistent with the activity we first identified and disclosed in August 2019. Our proactive removal of this network from Twitter is a direct result of the technical measures we put in place after thoroughly studying and investigating past coordinated information operations from the PRC.
Today’s PRC disclosure relates to two interconnected sets of accounts:
- 23,750 accounts that comprise the core of the network, i.e. the highly engaged core accounts.
- Approximately 150,000 accounts designed to boost this content, i.e. the amplifiers. Based on feedback from researchers on our prior disclosures that we should refine the disclosure process to enable efficient investigation of the core activity, we have not included the approximately 150,000 amplifier accounts in the public archive.
Despite the volume, the core 23,750 accounts we are publishing to the archive were largely caught early and failed to gain significant traction on the service, typically holding low follower counts and low engagement. The majority of the approximately 150,000 amplifier accounts also had few or no followers and were strategically designed to artificially inflate impression metrics and engage with the core accounts.
In general, this entire network was involved in a range of manipulative and coordinated activities. The accounts Tweeted predominantly in Chinese languages and spread geopolitical narratives favorable to the Chinese Communist Party (CCP), while continuing to push deceptive narratives about the political dynamics in Hong Kong.
Russia
Aided in part by useful information sharing from external researchers and our peer companies, we investigated accounts associated with Current Policy, a media website engaging in state-backed political propaganda within Russia. A network of accounts related to this media operation was suspended for violations of our platform manipulation policy, specifically cross-posting and amplifying content in an inauthentic, coordinated manner for political ends. Activities included promoting the United Russia party and attacking political dissidents. We’re disclosing all 1,152 accounts and associated media to our public archive today.
Turkey
Detected in early 2020, this network of accounts was engaging in coordinated inauthentic activity, primarily targeting domestic audiences within Turkey. Based on our analysis of the network's technical indicators and account behaviors, this collection of fake and compromised accounts was being used to amplify political narratives favorable to the AK Parti and to demonstrate strong support for President Erdogan. We're disclosing 7,340 accounts to the archive today.
Technical signals point to the network being associated with the youth wing of the party, as well as with a centralized network that maintained a significant number of compromised accounts. As a result, the network we're disclosing today includes several compromised accounts associated with organizations critical of President Erdogan and the Turkish Government. These compromised accounts were repeated targets of account hacking and takeover efforts by the state actors identified above. The broader network was also used for commercial activities, such as cryptocurrency-related spam.
Next steps for this work
Ultimately, our goal is to serve the public conversation, remove bad-faith actors, and advance public understanding of these critical topics. Going forward, we will:
- Offer more clarity in the public archive around impression counts and attempt to further measure the tangible impact of information operations on the public conversation.
- Continue to formalize our academic partnerships to ensure they’re globally diverse and advancing public understanding of these issues.
- Host an online conference later in the summer to bring experts, industry, and government together to discuss opportunities for further collaboration.
Our Site Integrity efforts represent some of the most critical work we do at Twitter to protect the public conversation. For a detailed breakdown of how we conduct these investigations, the operational principles that inform them, and our approach to transparency, see here.
Stay on top of regular updates from us over at @TwitterSafety.