
Media Manipulation 101: What Is It and How Can You Spot It?


In early 2021, state media outlets in Russia and China circulated online news detailing vaccine safety concerns. These stories cited a correlation between Western-developed vaccines and patient deaths. 

The claims were false, but they had a clear objective: endorse Russian and Chinese vaccines and incite mistrust in Western vaccines and governments. 


This is just one example of a top threat to modern democracies and public safety—media manipulation. According to the Oxford Internet Institute, social media manipulation campaigns increased by 150% between 2017 and 2019. Over 40% of people now believe that social media has enabled polarization and foreign political meddling. We also know that some nation-states, like Russia, budget billions of dollars annually for disinformation efforts.

Governments, tech companies, and counter-disinformation teams rely on intelligence analysts to monitor media manipulation and its impacts across the web. This process requires a unique set of assessment skills and OSINT tools that make data collection and analysis simple and efficient. 

What is media manipulation and how can you spot it more effectively as an analyst?

What Is Media Manipulation?


Media manipulation is the act of shaping a story in favor of partisan interests—often by a foreign state. Manipulation tactics take a variety of forms, like distracting audiences from important stories or leveraging emotional reactions.

Disinformation (intentionally misleading) and misinformation (misleading, regardless of intent) are widespread manipulation tactics with significant political, economic, and social impacts. Social media allows disinformation actors (also called cyber troops) to easily spread manipulated media and target audiences through computational propaganda—the use of algorithms and automation.
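One hallmark of computational propaganda is coordination: many accounts pushing near-identical messages within a tight time window. As a minimal sketch of how an analyst might flag that signal (the post fields and thresholds here are illustrative, not any platform's real schema):

```python
from collections import defaultdict
from datetime import datetime, timedelta

def normalize(text):
    """Collapse case and whitespace so near-identical copies group together."""
    return " ".join(text.lower().split())

def flag_coordinated_posts(posts, min_accounts=3, window=timedelta(minutes=10)):
    """Flag messages pushed by several distinct accounts in a short window --
    a common signature of automated amplification."""
    by_text = defaultdict(list)
    for post in posts:
        by_text[normalize(post["text"])].append(post)

    flagged = []
    for text, group in by_text.items():
        accounts = {p["account"] for p in group}
        times = sorted(p["time"] for p in group)
        if len(accounts) >= min_accounts and times[-1] - times[0] <= window:
            flagged.append(text)
    return flagged
```

Real bot networks vary wording and timing to evade exactly this kind of check, which is why production detection layers many more signals on top.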

According to Robert Evans, an investigative journalist at Bellingcat, we’ve seen disinformation fall into four categories since the onset of COVID-19:

  1. Generic disinformation, which manipulates the public perception of high-profile stories. This might look like an old photo or video that’s been reframed in a misleading new context.
  2. Disinformation from credible sources. This originates from legitimate media outlets, which may intentionally or accidentally spread false information.
  3. Disinformation from unreliable sources. These sources spread false information but are designed to appear credible.
  4. Profiteers. This includes manipulated media that intends to generate profit—like advertising an unfounded COVID-19 treatment.

Media Manipulation: What's at Stake?

The European Parliament describes disinformation as verifiably false or misleading information that is “created, presented and disseminated for economic gain or to intentionally deceive the public, and may cause public harm.” These harms undermine:

  • Public health and safety. Preliminary research suggests that media manipulation influences citizens’ intent to vaccinate or pursue risky treatments. At a large scale, this has a significant impact on transmission rates and healthcare systems.

    Media manipulation also impacts public safety by co-opting social movements, shifting public opinion around global issues like climate change, and recruiting vulnerable individuals into terrorism. The European Parliament considers disinformation a human rights issue that violates privacy, democratic rights, and freedom of thought.
  • Political processes. Media manipulation has the power to build mistrust between populations and their governments, disrupt democratic processes, and exacerbate geopolitical tensions. Following the 2016 US presidential election, the US Department of Justice reported that the Russian Internet Research Agency purchased over 3,500 Facebook ads supporting Trump and operated a network of fake accounts posing as American activists. 
  • Financial security. Media manipulation takes a financial toll, costing the global economy an estimated $78 billion each year. This includes the cost of reputation management, stock market hits, and countering disinformation.

The Struggle of Countering Media Manipulation

Big tech organizations like Facebook have publicized their commitment to combatting media manipulation on their networks. But documents leaked earlier in 2021 show that social media manipulation research and takedown efforts are lagging behind the spread. 


According to the research organization RAND, most counter-disinformation techniques like those used by Facebook rely on a combination of human and machine analysis that leave detection gaps at scale. 
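The hybrid human-and-machine approach RAND describes can be sketched in a few lines: a classifier score routes content into auto-action, human-review, or pass-through bands, and the pass-through band is where detection gaps accumulate at scale. The thresholds and scoring function below are hypothetical, not any platform's real pipeline:

```python
def triage(posts, score_fn, auto_threshold=0.9, review_threshold=0.6):
    """Hybrid triage: high-confidence scores are auto-actioned, mid-range
    scores go to a human review queue, and everything else passes through.
    The pass-through band is where coordinated campaigns slip past at scale."""
    auto, review, passed = [], [], []
    for post in posts:
        score = score_fn(post)
        if score >= auto_threshold:
            auto.append(post)
        elif score >= review_threshold:
            review.append(post)
        else:
            passed.append(post)
    return auto, review, passed
```

Raising the thresholds shrinks the (expensive) human queue but widens the pass-through band—the trade-off at the heart of the detection gap.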

As if spotting a troll weren’t hard enough, disinformation campaigns are highly organized, employing bot networks, deepfakes, and sophisticated AI to boost their spread and evade detection. 

How to Spot Media Manipulation: 6 Guidelines

Identifying misleading content—a skill easier said than done—is crucial for countering the negative consequences of media manipulation. Here are some tips to hone your analysis:

[Infographic: How to Spot Media Manipulation - 6 Guidelines]

Developing critical analysis skills is only part of the story. Because media manipulation is so widespread, analysts also need advanced tools to facilitate scaled monitoring. 

Open-source intelligence (OSINT) tools like Echosec help by making a variety of public sources easily searchable—not just mainstream outlets and social media sites. This includes alt-tech, fringe social media, and global sources relevant for assessing media in different regions, like Russia and China. 

Facebook’s disinformation battle is proof that using AI for counter-disinformation isn’t a perfect or complete solution. But alongside human analysis, investing in OSINT tools that use natural language processing can ease the load on overwhelmed intelligence teams.
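As a rough illustration of the kind of scaled text analysis such tools automate (a toy sketch, not Echosec's actual pipeline), a few lines of Python can group near-duplicate narratives across sources using token-set similarity:

```python
def jaccard(a, b):
    """Token-set Jaccard similarity between two documents (0.0 to 1.0)."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    if not ta or not tb:
        return 0.0
    return len(ta & tb) / len(ta | tb)

def cluster_narratives(docs, threshold=0.5):
    """Greedy single-pass clustering: each document joins the first cluster
    whose seed it resembles, otherwise it starts a new cluster."""
    clusters = []  # list of (seed_doc, members)
    for doc in docs:
        for seed, members in clusters:
            if jaccard(seed, doc) >= threshold:
                members.append(doc)
                break
        else:
            clusters.append((doc, [doc]))
    return [members for _, members in clusters]
```

A cluster of the same claim surfacing across many outlets and regions is a starting point for deeper human analysis, not a verdict on its own.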


Countering the negative impacts of media manipulation starts with identifying and understanding its proliferation online. As technology struggles to keep up with manipulation tactics, analysts must develop critical assessment skills and invest in OSINT tools that support more comprehensive detection.

 

Need more support with media manipulation assessment?
Book a consultation with us and download a media verification handbook.
