Truth Decay in the Age of Algorithms | Peter Hughes
We are not short on information. We are short on signal. When platforms reward engagement, accuracy becomes optional. Emotion spreads faster than nuance. Repetition starts to feel like truth. Over time, the shared baseline of facts gets thinner and the temperature gets higher.

Why this matters beyond politics

I have seen the same dynamic inside organisations. A confident assumption becomes a plan. A plan becomes a dependency. A dependency becomes an expensive surprise. Bad inputs do not stay small for long.

Truth decay is not a single event. It is a slow drift where opinion looks like fact, and where we stop checking because it feels like somebody else already did.

The algorithm effect

Most feeds are tuned for time on platform. That means they favour content that triggers a reaction: outrage, fear, certainty, tribal language. Those patterns keep attention, even when the underlying claim is weak. The result is a feedback loop. You see more of what you already agree with. It feels like clarity, but it can be a narrowing lens.

My working rules

- Find the source. If I cannot find the original, I treat it as untrusted.
- Look for two independent confirmations. One source can be wrong. Two can still be wrong, but it is a better start.
- Slow down when something makes me angry. Emotion is a useful signal. It is also a common attack surface.
- Be comfortable saying I do not know yet. Certainty is not a requirement. Integrity is.

Small habit, big effect

If you want better outcomes, protect the quality of the inputs. That is true in systems. It is true in teams. It is true in public discourse. One careful share does not fix the internet. But a culture of careful people fixes more than you think.
peter.hughes.team