The Study About Dis-Information You Should Read

“Information Disorder” is one of the best studies to read if you want to get involved in the fight for trustworthy content. Written by Claire Wardle and Hossein Derakhshan, two experts in the field, the study provides a structured overview that helps readers first understand the problem of false and fabricated information. The study goes beyond simply describing the problem: it provides a “framework for policy-makers, legislators, researchers, technologists and practitioners” who need to work together for results.


A need for collaboration


The report acknowledges that the phenomena are not new. False rumours and campaigns built on false information existed before the internet. What has changed recently is the scale of “information pollution”: false information spreads faster, making it harder to counter with correct facts. “Information Disorder” argues that there is an urgent need to work collaboratively on solutions, and to get there the report provides a detailed framework of possible actions.


As a helpful overview, the report visualises the three key forms of information disorder, side by side. 


Information disorder

From just false to outright harmful: Forms of information disorder. Source: Information Disorder, 2017.

  • Mis-information is when false information is shared, but no harm is meant.
  • Dis-information is when false information is knowingly shared to cause harm.
  • Mal-information is when genuine information is shared to cause harm, often by moving information designed to stay private into the public sphere.
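The taxonomy above turns on two properties: whether the information is false, and whether harm is intended. A minimal sketch of that decision follows; the function name and its boolean parameters are illustrative, not taken from the study.

```python
# Sketch of the study's taxonomy as a decision over two properties.
# (Illustrative only - the study itself presents this as a visual overview.)
def classify(is_false: bool, intends_harm: bool) -> str:
    """Map the two dimensions to the three forms of information disorder.
    True information shared without harmful intent is simply information."""
    if is_false and not intends_harm:
        return "mis-information"
    if is_false and intends_harm:
        return "dis-information"
    if not is_false and intends_harm:
        return "mal-information"
    return "information"

print(classify(is_false=True, intends_harm=False))  # mis-information
```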

The study explicitly advises against using the label “fake news”, with good reason:

“In this report, we refrain from using the term ‘fake news’, for two reasons. First, it is woefully inadequate to describe the complex phenomena of information pollution. The term has also begun to be appropriated by politicians around the world to describe news organisations whose coverage they find disagreeable. In this way, it’s becoming a mechanism by which the powerful can clamp down upon, restrict, undermine and circumvent the free press.”

Information Disorder, 2017


Why it matters: If we want to curb the spread of false information, we must first understand that an unintended communication mistake has to be distinguished from a campaign using entirely fabricated content. This distinction applies both to technical tools against such information and to other forms of activism in this space.

The study asks the key questions: Who is creating disinformation? Why are they doing it, and what are their goals? This is extremely important for any initiative in this field: only when we understand why some groups use this instrument do we have a real chance to block, filter or expose intentional dis- and mal-information.

As the study comes from two authors with a writing and journalism background, they take a close look at how and why people consume such content and why – at times – they start to believe what is presented to them and even redistribute it. Why is this? “A key argument within this report, which draws from the work of the scholar James Carey, is that we need to understand the ritualistic function of communication. Rather than simply thinking about communication as the transmission of information from one person to another, we must recognize that communication plays a fundamental role in representing shared beliefs. It is not just information, but drama.”

The role of social platforms: “Information pollution” can affect all complex topics – health and medicine, economy and ecology. The occurrence of false information and its use as a tool is not new. What is different now is that social media platforms amplify its reach and encourage users to post material that might earn them approval from others – through likes and other subtle forms of positive feedback.

“It is not just information, but drama”


Step by step, the authors develop a framework of recommendations and actions for multiple interest groups: administrations, public bodies, governments, civil society and – of course – media organisations. What makes the study helpful is that these lists of tips are straightforward and simple. In combination, one can hope, these numerous counter-activities will help to fight information disorder over time.


Download the 2018 version of the report here: Information Disorder (free, no registration)

Disinformation that kills (Study)


Disinformation That Kills: The Expanding Battlefield Of Digital Warfare – Analysis

How do disinformation campaigns work, and what are the different types of attacks? Social media has turned into a digital battlefield. Because so many people are online, building a false narrative to influence a society has become one facet of information warfare. “False information about major events from the Covid-19 outbreak to the 2020 US election is jeopardizing public health and safety.” The report (free download after registration) provides helpful information about the current situation. Further, the authors provide an outlook on how misinformation will most likely change in the future.

CB Insights



Truth and Trust Online Conference 2020


When: October 16th & 17th
Where: The Internet (virtual event)

Why it matters:
The Conference for Truth and Trust Online (TTO 2020) assembles stakeholders that care about “improving the truthfulness and trustworthiness of online communications”. This includes media scholars, journalists, activists, and industry representatives. Talks and sessions deal with all kinds of social media and information problems – and potential countermeasures.

Who is there?
Trust in online media and news is one of our core topics in many research projects, especially when it comes to the narrower field of verification, fact-checking, and countering false information.

At TTO 2020, Ruben Bouwmeester and Patrick Aichroth (Fraunhofer IDMT) will discuss deepfake detection, or more specifically: the ongoing Google DNI project Digger, which is about detecting manipulated and synthetic media with the help of audio forensics technology. The talk will focus on “the challenges of transferring scientific expertise into the media domain, making it intelligible and usable to non-forensics experts, and the importance of adopting a ‘falsification culture’ for media”.

To learn more about the event and register as a participant, go to

Tim Berners-Lee’s startup Inrupt releases Solid privacy platform for enterprise


“Changing the way people connect with their data changes everything” – this is the opening statement on the homepage of Inrupt, a startup founded by web inventor Tim Berners-Lee. Three years after its founding, Inrupt has introduced a new platform for enterprises, enabling the development of offerings with a high level of privacy and control.

The new offering is an extension of Solid, “a technology for organizing data, applications, and identities on the web. Solid enables richer choices for people, organizations and app developers by building on existing web standards.” The software is open source and aims to set standards for how user data is handled, with more transparency and – ultimately – trust.

Users in control of their data

A key concept of Solid is that a user’s data is stored in a secure Personal Online Data Store (POD). The concept is still in its early stages: the current web is dominated by financial interests, and many users simply accept data tracking. The hope is that initiatives and new offerings like this might reach a threshold where user privacy becomes the standard.

One step in that direction is the newly released enterprise software from Inrupt, which enables developers to build applications based on Solid. To provide an example: Inrupt currently works with the National Health Service (NHS) in the United Kingdom. Once implemented, NHS patients will be able to decide who has access to their health data and records, from family members to doctors to an insurance company. The key point is that the exchange is transparent and controllable for both sides, above all the user.
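The access-control idea behind PODs – the owner grants and withdraws access, and every access attempt is visible – can be illustrated with a short, hypothetical sketch. This is not the Solid API; the `Pod` class, resource paths, and agent names below are invented for illustration only.

```python
# Hypothetical model of owner-controlled pod access - NOT the Solid API.
from dataclasses import dataclass, field

@dataclass
class Pod:
    """A user's personal data store with owner-controlled, auditable access."""
    owner: str
    _resources: dict = field(default_factory=dict)
    _grants: dict = field(default_factory=dict)    # resource path -> set of agents
    audit_log: list = field(default_factory=list)  # transparent record of access attempts

    def put(self, path, data):
        self._resources[path] = data
        self._grants.setdefault(path, set())

    def grant(self, path, agent):
        self._grants[path].add(agent)

    def revoke(self, path, agent):
        self._grants[path].discard(agent)

    def read(self, path, agent):
        allowed = agent == self.owner or agent in self._grants.get(path, set())
        self.audit_log.append((agent, path, "granted" if allowed else "denied"))
        if not allowed:
            raise PermissionError(f"{agent} may not read {path}")
        return self._resources[path]

# A patient stores a record and shares it with her doctor, but not her insurer.
pod = Pod(owner="alice")
pod.put("/health/records", {"blood_type": "O+"})
pod.grant("/health/records", "dr_smith")
print(pod.read("/health/records", "dr_smith"))  # the doctor may read
pod.revoke("/health/records", "dr_smith")       # alice withdraws consent at any time
```

The audit log is the sketch’s stand-in for the transparency the article describes: both sides can see exactly who asked for what, and whether access was granted.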

Ideally, this better-controlled and more transparent exchange of data will open the path to fully trustworthy platforms.


NHS data: Can web creator Sir Tim Berners-Lee fix it?, BBC (November 9, 2020)

Photo by Jason Richard on Unsplash