Brains, Lies, & Electric Shocks [RR]
![Brains, Lies, & Electric Shocks [RR]](/content/images/size/w1200/2025/10/Synched.png)
Interbrain Synchrony & Political Extremism
Ever feel like people at the political extremes are singing from the same hymn sheet, even when the lyrics are completely different? It turns out, you might be correct... neurologically.
Neuroimaging unsurprisingly shows that when people with extreme ideologies view political content, activity spikes in their brains' emotional centers, regardless of which ideology they hold. Their political responses are, in part, a gut reaction, not a purely reasoned one.
The surprising part is that the brains of people who share that extremity, even if they held opposing political views, "exhibit increased neural synchronization" when viewing political content. A far-left and a far-right brain sync up around shared arousal. (Could we treat them with anti-addiction medication?)
Ideological extremism may not be a coherent set of beliefs as much as a shared brain state. The primary connection is affective, not semantic. It's not about what you think but how you feel. It also helps explain why trying to fact-check an extremist is like trying to critique the lyrics of a song to someone who is just there for the bass drop.
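For the curious, "neural synchronization" in studies like these is typically quantified as inter-subject correlation: how strongly one person's activity time series tracks another's while both watch the same content. The sketch below is a toy illustration with simulated data, not the study's actual pipeline, regions, or numbers; it just makes the measure concrete.

```python
import numpy as np

def intersubject_synchrony(timeseries):
    """Mean pairwise Pearson correlation across subjects.

    timeseries: array of shape (n_subjects, n_timepoints) holding one
    brain region's activity for each subject (simulated here).
    """
    n_subjects = timeseries.shape[0]
    corrs = []
    for i in range(n_subjects):
        for j in range(i + 1, n_subjects):
            corrs.append(np.corrcoef(timeseries[i], timeseries[j])[0, 1])
    return float(np.mean(corrs))

rng = np.random.default_rng(0)
shared_signal = rng.standard_normal(200)  # a common, arousal-driven response

# Hypothetical "extreme" group: activity dominated by the shared signal.
extreme = shared_signal + 0.5 * rng.standard_normal((10, 200))
# Hypothetical "moderate" group: mostly idiosyncratic responses.
moderate = 0.3 * shared_signal + rng.standard_normal((10, 200))

print("extreme-group synchrony: ", round(intersubject_synchrony(extreme), 2))
print("moderate-group synchrony:", round(intersubject_synchrony(moderate), 2))
```

The more a group's responses are driven by the same signal, the higher the pairwise correlations climb; that, roughly, is what "syncing up around shared arousal" means in numbers.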
The Timbre of Lies
Ever feel like people who believe the same story are on the same wavelength? Their brains might just be.
When people believe a story together, their brains start to synchronize, firing in similar patterns "within the default mode network". And when people disbelieve a story together, their brains also synchronize, but in completely different, "distinguishable parcels".
This suggests that belief isn't just accepting a fact, and disbelief isn't just rejecting it. They are two separate, active, and synchronized brain states: different operating systems for reality. One is for processing stories we accept as true, and the other is for processing stories we reject as false.
The study also revealed a "belief bias": our brains' default setting seems to be belief. It takes active, effortful work to switch over to the "disbelief" operating system.
When we tell a compelling story, we aren't just conveying information; we are inviting others to tune their brains to the same frequency, to run the same cognitive program.
Political Theory 101
What's the fastest way to get a group of people to cooperate? Forget trust falls and team-building exercises. A new study suggests a much darker, and more effective, tool: threaten them.
When small groups of research participants played an economic game together, the threat of collective punishment (in this case, "electric shocks") didn't just boost cooperation briefly; it sustained it. The threat of an outside danger was a powerful catalyst for in-group solidarity, "reducing free-riding" and increasing in-group prosociality.
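To make the incentive shift concrete, here is a back-of-the-envelope public-goods calculation. The parameters (endowment, multiplier, punishment threshold, penalty) are made up for illustration and are not the study's actual design; the point is simply that a collective penalty can flip free-riding from the best move to the worst one.

```python
# Toy public-goods payoff showing why a collective threat curbs free-riding.
# All numbers are illustrative, not taken from the study.

ENDOWMENT = 10    # tokens each player starts with
MULTIPLIER = 1.6  # pooled contributions are multiplied, then split evenly
GROUP_SIZE = 4
THRESHOLD = 20    # group total needed to avoid collective punishment
PENALTY = 8       # cost of the "shock" if the group falls short

def payoff(my_contribution, others_total, threat=False):
    pool = (my_contribution + others_total) * MULTIPLIER
    base = ENDOWMENT - my_contribution + pool / GROUP_SIZE
    if threat and (my_contribution + others_total) < THRESHOLD:
        base -= PENALTY  # everyone is punished, cooperator and free-rider alike
    return base

others = 15  # the other three players contribute 5 tokens each
for threat in (False, True):
    free_ride = payoff(0, others, threat)
    cooperate = payoff(5, others, threat)
    print(f"threat={threat}: free-ride={free_ride:.1f}, contribute 5={cooperate:.1f}")
```

Without the threat, keeping your tokens beats contributing; with it, the free-rider drags the group below the threshold and ends up worse off than the cooperator.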
The finding itself is perhaps not surprising; it aligns with a long history of "rally 'round the flag" effects. What's deeply unsettling is the clarity of the perverse incentive this creates for anyone in a position of leadership.
This research provides a formal, mechanistic justification for a very old and very toxic political playbook: if you want to consolidate power and ensure the loyalty of your in-group, the most reliable strategy is to manufacture a convincing external threat.
It's a stark reminder that the tools of cooperation and the tools of control are often one and the same.