Techniques and Tutorials

How Analysts Spot Coordinated Influence Operations

As open-source intelligence (OSINT) practitioners know all too well, the modern information landscape is rife with coordinated attempts to manipulate public discourse through covert influence operations.

From state-sponsored disinformation campaigns to subtle social media narratives, these sophisticated efforts to shape perceptions and sow confusion threaten the integrity of democratic processes.



But OSINT analysts armed with the right tools and techniques can detect the digital fingerprints of coordinated inauthenticity. By understanding the common tactics, signals, and behavioural patterns of influence operators, specialists can expose coordinated deception campaigns and guard against their corrosive effects.


Recognising Network Coordination

Coordinated influence operations rarely manifest through isolated, independent personas. Instead, they deploy interconnected networks of accounts designed to create an illusion of organic, widespread support.

OSINT analysts can uncover these networks by mapping the associations between accounts – tracing shared profile details, overlapping followers/followees, identical posted content, and other relational signals. Link analysis tools visualise these connections, revealing coordination hubs and associated clusters.

Analysts also watch for synchronised activity patterns, such as accounts posting identical messages in rapid succession, accounts rapidly accumulating followership, or large numbers of accounts amplifying the same narratives simultaneously. Spikes in these metrics can indicate coordinated “bot” or “cyborg” (human-bot hybrid) behaviours rather than natural user activity.
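The burst-detection idea above can be sketched in a few lines. This is a minimal illustration, not a production detector: it assumes posts have already been collected as `(account_id, text, timestamp)` tuples (the data shape, window size, and account threshold are all illustrative assumptions).

```python
from collections import defaultdict
from datetime import timedelta

def find_synchronised_posts(posts, window_seconds=60, min_accounts=3):
    """Flag messages posted verbatim by many accounts within a short window.

    `posts` is an iterable of (account_id, text, datetime) tuples — an
    assumed shape; adapt it to whatever your collection pipeline produces.
    """
    by_text = defaultdict(list)
    for account, text, ts in posts:
        # Normalise lightly so trivial whitespace/case edits still match.
        by_text[text.strip().lower()].append((account, ts))

    flagged = []
    window = timedelta(seconds=window_seconds)
    for text, events in by_text.items():
        events.sort(key=lambda e: e[1])
        # Slide over the sorted timestamps looking for dense bursts.
        for i in range(len(events)):
            start = events[i][1]
            burst = {a for a, ts in events if start <= ts <= start + window}
            if len(burst) >= min_accounts:
                flagged.append((text, sorted(burst)))
                break
    return flagged
```

Tuning the window and threshold matters: legitimate users do retweet breaking news in clusters, so flags like these are leads for human review, not verdicts.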


Identifying Inauthentic Account Behaviour

Beyond network linkages, individual account behaviours often betray inauthenticity. Influence operators commonly deploy fake personas with tell-tale signs:

  • Accounts with stock profile photos, generic bios, and minimal content history.
  • Profiles impersonating real people using stolen identities.
  • Accounts exhibiting sudden spikes in activity after long periods of dormancy.
  • Accounts demonstrating repetitive, automated posting patterns.
  • Accounts with discrepancies between claimed persona and actual activity.

Analysts can leverage automated monitoring tools to flag these anomalies in real time, enabling swift detection of newly created fake accounts seeded into online conversations.
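Several of the behavioural tells listed above can be encoded as simple heuristics. The sketch below assumes account data has been flattened into a plain dict; the field names (`bio`, `post_count`, `post_timestamps`) and all thresholds are illustrative assumptions, not any platform's real API schema.

```python
def flag_account(profile):
    """Return heuristic warning flags for an account profile.

    `profile` is a dict with assumed keys: `bio` (str), `post_count` (int),
    and `post_timestamps` (list of datetime). Thresholds are starting
    points for analyst review, not calibrated values.
    """
    flags = []
    if not profile.get("bio"):
        flags.append("empty_or_generic_bio")
    if profile.get("post_count", 0) < 5:
        flags.append("minimal_content_history")

    stamps = sorted(profile.get("post_timestamps", []))
    if len(stamps) >= 2:
        gaps = [(b - a).total_seconds() for a, b in zip(stamps, stamps[1:])]
        # Long dormancy followed by a sudden burst of activity.
        if max(gaps) > 180 * 24 * 3600 and gaps[-1] < 3600:
            flags.append("dormancy_then_spike")
        # Suspiciously regular cadence suggests automated posting.
        if len(gaps) >= 3 and max(gaps) - min(gaps) < 2:
            flags.append("metronomic_posting")
    return flags
```

Each flag on its own is weak evidence; it is the accumulation of flags across an account, and across a cluster of accounts, that builds a case.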


Analysing Language and Content Patterns

Even when account details appear credible, the language and content produced by coordinated influence networks often exhibit distinct patterns. Analysts look for:

  • Identical or highly similar phrasing across multiple accounts, suggesting scripted messaging.
  • Consistent grammatical/spelling errors hinting at non-native language usage.
  • Sudden shifts in sentiment, ideology or topic focus uncharacteristic of authentic users.
  • Content that appears algorithmically generated rather than organically composed.

Stylometric analysis techniques can statistically model linguistic ‘tells’ to attribute content to common origin points. Comparing message fingerprints across accounts allows mapping of larger propaganda networks.
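A crude but useful first pass at the "identical or highly similar phrasing" signal is overlap of word n-grams between accounts' messages. This is a minimal sketch, not a full stylometric model; the similarity threshold is an illustrative assumption and would need calibration against known-benign baselines.

```python
def shingles(text, n=3):
    """Word n-grams used as a simple message fingerprint."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Overlap of two shingle sets, 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def similar_pairs(messages, threshold=0.5):
    """Return account pairs whose messages share many word trigrams.

    `messages` maps account id -> message text; the 0.5 threshold is an
    illustrative starting point, not a calibrated value.
    """
    prints = {acct: shingles(text) for acct, text in messages.items()}
    accounts = sorted(prints)
    pairs = []
    for i, a in enumerate(accounts):
        for b in accounts[i + 1:]:
            score = jaccard(prints[a], prints[b])
            if score >= threshold:
                pairs.append((a, b, round(score, 2)))
    return pairs
```

Proper stylometric attribution goes much further (function-word frequencies, character n-grams, punctuation habits), but even this toy comparison surfaces copy-paste networks that vary only a word or two between accounts.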


Tracing Funding and Organisational Links

Deeper investigation of the funding sources, organisational structures, and operational tradecraft behind influence campaigns can further expose their coordinated nature.

OSINT researchers might uncover shared web infrastructure, overlapping financial ties, or common personnel across ostensibly independent online personas and websites. Tracking domain registrations, incorporation records, financial flows, and organisational affiliations can illuminate hidden coordination.
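Once infrastructure data has been gathered (from WHOIS records, DNS lookups, or page source), the clustering step itself is straightforward. The sketch below assumes each site has been reduced to a dict of markers; the field names (`registrant_email`, `hosting_ip`, `analytics_id`) are hypothetical examples of the kind of markers an analyst might extract, not a real registry API.

```python
from collections import defaultdict

def cluster_by_shared_fields(sites, fields=("registrant_email", "hosting_ip", "analytics_id")):
    """Group ostensibly independent sites that share infrastructure markers.

    `sites` is a list of dicts with assumed keys such as `domain`,
    `registrant_email`, `hosting_ip`, and `analytics_id` — values you
    would populate yourself from WHOIS, DNS, or scraped page source.
    """
    clusters = defaultdict(set)
    for site in sites:
        for field in fields:
            value = site.get(field)
            if value:
                clusters[(field, value)].add(site["domain"])
    # Only values shared by multiple domains are evidence of coordination.
    return {k: sorted(v) for k, v in clusters.items() if len(v) > 1}
```

Shared hosting or a shared analytics tag is not proof of common control on its own (many unrelated sites share commodity infrastructure), but several overlapping markers across the same set of domains is a strong lead worth pursuing through incorporation and financial records.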

Additionally, identifying the ultimate sponsors or handlers controlling these networks – whether state intelligence agencies, political operatives, or commercial PR firms – provides critical context around their strategic objectives and execution.


Corroborating with Third-Party Reporting

While OSINT holds enormous power for exposing coordinated manipulation, analysts should not operate in isolation. Triangulating findings against authoritative third-party reporting and expert analysis strengthens the credibility of investigations.

Established media outlets, academic institutions, think tanks, and charities and other NGOs often conduct their own rigorous investigations into influence operations. Aligning OSINT discoveries with these external sources may protect analysts against potential biases or blind spots.

Collaboration with these trusted partners also allows disparate data points to be aggregated into a comprehensive narrative, untangling the full scope and mechanisms of coordinated deception campaigns that fall outside the analyst’s own sphere of knowledge or interest.


Legal and Ethical Considerations

As OSINT capabilities for countering influence operations advance, analysts must remain vigilant about operating within legal and ethical boundaries. Overzealous techniques that veer into unauthorised surveillance, data theft, or other illicit means threaten to undermine the legitimacy of findings.

Clear documentation of methodologies, transparency around data sources, and adherence to consent-based access are essential. Understand and follow the laws under which you and/or your organisation operate, and let the consequences of breaching them guide your activity.

Equally important is maintaining an unwavering commitment to objectivity. Allowing personal biases or political agendas to colour analysis opens the door to accusations of partisan skewing, discrediting your hard work. Rigorous structured analytic techniques help safeguard impartiality.


Further Reading:

  1. Computational Propaganda: Political Parties, Politicians, and Political Manipulation on Social Media by Samuel Woolley and Philip N. Howard: This book examines how political parties, individual politicians, and extremist groups use social media to shape public opinion and influence political outcomes. The term “computational propaganda” refers to the use of algorithms, automation, and human curation to deliberately spread misleading information across social networks. Woolley and Howard explore various global case studies to illustrate the strategies and impacts of these campaigns, shedding light on the challenges they pose to democratic processes and how they can be addressed.
  2. The Geopolitics of Information: How Western Culture Dominates the World by Armand Mattelart: Armand Mattelart’s work is a critical examination of how Western nations, primarily the United States, have exerted cultural dominance through the strategic dissemination of information and media content. The book discusses the historical and contemporary mechanisms of cultural imperialism that shape global perceptions, values, and power dynamics. Mattelart argues that the control over information flows has significant geopolitical implications, influencing everything from economic policies to political regimes around the world.
  3. Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics by Yochai Benkler, Robert Faris, and Hal Roberts: This book focuses on the American media landscape and how it facilitates the spread of misinformation and disinformation, contributing to political polarization and radicalization. The authors use data analytics and network analysis to trace the sources and flow of news stories, debunking the notion of a symmetric partisan divide and instead highlighting the asymmetric nature of how misinformation spreads through conservative media networks. Benkler, Faris, and Roberts also discuss the role of major tech platforms in amplifying these effects and offer insights into potential reforms to mitigate the impacts of network propaganda on democratic discourse.



Open-source intelligence plays a vital role in unmasking the coordinated deception campaigns that threaten the integrity of our information ecosystem. By understanding the common tactics, signals, and behavioural patterns of influence operators, OSINT analysts can detect the digital fingerprints of inauthenticity and expose the true orchestrators behind manipulative narratives.