Inspired by our current ‘post-fact democracy’, I had the thought that maybe the internet isn’t the democratic safeguard that people believed it to be in the 90s and 00s.

Before the internet, information travelled from person to person. This meant that each person was a firewall, in a way: I get to decide what information I pass on and what information I ignore. Depending on how stubborn I am, in the worst case I might stick to my misinformed opinion even though everyone around me is correctly informed.

But with the advent of the internet, there's now the ability to actively look for information. I can google anything I want, and odds are I will be presented with the information I'm looking for. This way, misinformed individuals can find enough support to sustain their beliefs, and in turn misinform others.

The simulation below tries to show both these scenarios. The square represents a population of individuals, 255 individuals for each pixel. Each individual might be uninformed (black), misinformed (red), or correctly informed (green), and the color of each pixel represents to what extent its individuals are informed.
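The state described above could be represented roughly like this. This is a minimal sketch in Python, not the simulation's actual code; the `Cell` name and the color mapping are my own assumptions based on the description.

```python
from dataclasses import dataclass

@dataclass
class Cell:
    # One pixel = 255 individuals, split into three counts that sum to 255.
    uninformed: int = 255   # shown as black (no red, no green)
    misinformed: int = 0    # contributes to the red channel
    informed: int = 0       # contributes to the green channel

    def color(self):
        """Map the counts to an (R, G, B) pixel: the more misinformed
        individuals, the redder; the more informed, the greener."""
        return (self.misinformed, self.informed, 0)

# A fully uninformed grid renders as a black square.
grid = [[Cell() for _ in range(64)] for _ in range(64)]
```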

When the simulation starts, the (correct) information spreads to each neighbor, from the middle outward. To spread incorrect information (rumours), click anywhere on the canvas.
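The outward spread can be sketched as a simple cellular-automaton step. This is an illustrative guess at the mechanic, not the original code; for simplicity it treats each cell as a single 0/1 value rather than 255 individuals.

```python
def spread_step(grid):
    """One simulation tick (sketched): every informed cell passes correct
    information to its 8 neighboring cells. `grid` is a 2-D list of ints,
    0 = uninformed, 1 = informed."""
    h, w = len(grid), len(grid[0])
    new = [row[:] for row in grid]
    for y in range(h):
        for x in range(w):
            if grid[y][x] == 1:
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w:
                            new[ny][nx] = 1
    return new
```

Seeding the middle cell and applying `spread_step` repeatedly grows the informed region outward one ring at a time, exactly the "from the middle outward" behavior described.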

The "My bias" value controls how much each individual values their own opinion over their neighbors'. A value of '1' means an individual values their own opinion as much as any other's.
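One way to model this weighting is to sample the individual's next opinion from a pool where their own opinion appears `bias` times alongside the 8 neighbor opinions. Again, this is a hedged sketch of the mechanic as I read it, not the simulation's source.

```python
import random

def next_opinion(own, neighbors, bias, rng=random):
    """Sample the next opinion from the 8 neighbor opinions plus `bias`
    copies of the individual's own opinion. bias=0: pure open-mindedness
    (neighbors decide); bias=1: own opinion weighs like any neighbor's;
    bias=10: own opinion dominates the pool. Assumes integer bias."""
    pool = list(neighbors) + [own] * bias
    return rng.choice(pool)
```

With `bias=0` and unanimous neighbors, the individual is always overridden, which matches the "extreme open mindedness" case described below.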

The "ratio" value controls the odds of being presented with correct information. Obviously, if the odds of finding correct information are lower than 50%, there's no chance the truth can prevail. However, it's interesting to see that even at a ratio of 0.51, misinformation completely disappears if we change the bias value to 0: extreme open-mindedness, where my neighbors can override my wrong opinion.

Use a large bias to simulate "social media", where my own opinion is being amplified. For example, a bias of 10 can be understood to mean that in addition to the 8 neighbors (neighboring pixels) and my own opinion, an individual can find another 9 opinions similar to their own, likely overriding the opinion of the neighborhood. With a large bias, the ratio between informed and misinformed approaches 50/50, even though the incorrect information is scarcer!
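Some back-of-envelope arithmetic (my own, not from the simulation) shows why a large bias freezes opinions in place: if the next opinion is a uniform draw from the 8 neighbors plus `bias` copies of one's own opinion, the chance of keeping one's own opinion is `bias / (8 + bias)`.

```python
def p_keep_own(bias, n_neighbors=8):
    """Probability that a uniform draw from the weighted pool returns
    the individual's own opinion (assumed mechanic, see above)."""
    return bias / (n_neighbors + bias)

# bias = 0  -> 0.00: neighbors always decide (open-mindedness)
# bias = 1  -> 0.11: own opinion is one voice among nine
# bias = 10 -> 0.56: the amplified own opinion wins most draws,
#                    so whatever an individual already believes tends to stick
```

Since opinions mostly just reinforce themselves at high bias, the population roughly preserves whatever mix the searches seeded, which is close to 50/50 when the ratio is near 0.5.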

[Live counters below the canvas show the number and percentage of informed, misinformed, and uninformed individuals.]