Invisible Rulers – The People Who Turn Lies into Reality



A recent Vox article discusses a new book by Renée DiResta, “Invisible Rulers,” which examines and contextualizes how bad information became so powerful and popular online. “She charts how the ‘collision of the rumor mill and the propaganda machine’ on social media helped to form a trinity of influencer, algorithm, and crowd that work symbiotically to catapult pseudo-events, Twitter Main Characters, and conspiracy theories that have captured attention and shattered consensus and trust.”

Much online misinformation is believed to spread virally through influencers on platforms like TikTok, and within movements such as QAnon. “The influencers, along with TikTok, made money off the sale of this misleading book. I brought all this to the attention of TikTok. The videos I flagged to a company spokesperson were removed after a review for violating TikTok’s policies banning health misinformation,” writes the Vox reporter.

By revealing the machinery and dynamics of the interplay between influencers, algorithms, and online crowds, DiResta vividly illustrates the way propagandists deliberately undermine belief in the fundamental legitimacy of institutions that make society work. This alternate system for shaping public opinion, unexamined until now, is rewriting the relationship between the people and their government in profound ways. It has become a force so shockingly effective that its destructive power seems limitless. Scientific proof is powerless in front of it. Democratic validity is bulldozed by it. Leaders are humiliated by it. But they need not be.
With its deep insight into the power of propagandists to drive online crowds into battle—while bearing no responsibility for the consequences—Invisible Rulers not only predicts those consequences but offers ways for leaders to rapidly adapt and fight back.

– From the book cover

Renée DiResta’s book, which I have not read, is described as exploring the complexities of online misinformation, combining historical context, analysis, and personal anecdotes. In it, the author offers insights gained from years of studying disinformation and manipulation. She argues that simple blame games, such as attributing misinformation to “Russian bots,” oversimplify the issue: its underlying causes are multifaceted, involving people’s individual choices and collective behavior.

DiResta argues that targeting algorithms or content moderation alone won’t solve the problem, because virality is the product of collective user actions shaped by user data.