Why We Follow the Crowd: Herd Behavior

Ever wondered why people sometimes act against what they clearly see or know? Why a crowd can suddenly turn aggressive, or why large groups support ideas that don’t really hold up under scrutiny?

It’s easy to assume that people think independently. But in reality, human behavior is deeply social. We watch others. We copy others. And more often than we realize, we let others decide what feels right.

As behavioral economist Richard Thaler points out in Misbehaving, humans are constantly being “nudged” by other humans. These nudges don’t have to be loud or forceful. Sometimes, they are subtle – just a comment, an action, or even silence. Yet, they can trigger massive social shifts.

This is where mob mentality, or herd behavior, comes into play.

What Is Herd Behavior?

Herd behavior refers to the tendency of individuals to mimic the actions of a larger group, often ignoring their own beliefs, logic, or even direct evidence. It’s the same instinct that helps animals survive in groups, but in humans, it can lead to both cooperation and chaos.

In markets, herd behavior can inflate bubbles. In politics, it can fuel movements. On the streets, it can turn peaceful protests into violent outbreaks. But why does this happen?

To understand that, we need to look at two powerful psychological forces: collective conservatism and pluralistic ignorance.

Collective Conservatism: When the Past Controls the Present

Collective conservatism is the tendency of groups to stick to established patterns, even when those patterns no longer serve a purpose. Once a behavior becomes “normal,” it gains a kind of momentum. People stop questioning it. It becomes automatic.

A clear example can be seen in certain political protests. In many places, including Nepal, vandalism has become an almost expected part of large demonstrations. Shops are damaged. Public property is destroyed. Roads are blocked.

But here’s the important question: Is vandalism necessary for protest?

Not really.

In fact, peaceful protests are often more effective in gaining public support and legitimacy. Yet, once aggressive behavior becomes associated with protests, it tends to repeat itself. People entering the crowd assume this is how protests work. They follow along. It’s not always about intent. Many individuals in such situations may not personally support violence. But they adapt to the group’s behavior.

Why? Because deviating feels risky. Going against the crowd requires confidence and courage. Following the crowd requires neither. Over time, these repeated patterns solidify into tradition, even if they started without strong justification.
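This dynamic of individual thresholds for joining in can be illustrated with a toy model. The sketch below is not from the article; it is a minimal Python rendering of Mark Granovetter's classic threshold model of collective behavior, in which each hypothetical agent joins the crowd's behavior once enough others already have:

```python
def cascade_size(thresholds):
    """Granovetter-style threshold model: agent i joins the crowd's
    behavior once at least thresholds[i] others are already acting.
    Returns how many people end up acting."""
    acting = 0
    while True:
        # Count everyone whose personal threshold is now met.
        now_acting = sum(1 for t in thresholds if t <= acting)
        if now_acting == acting:  # no one new joined; the state is stable
            return acting
        acting = now_acting

# A perfectly graded crowd: one instigator (threshold 0) triggers everyone.
full_chain = list(range(100))
print(cascade_size(full_chain))    # 100 – the whole crowd acts

# Remove the single person with threshold 1 and the cascade dies out.
broken_chain = list(range(100))
broken_chain[1] = 2
print(cascade_size(broken_chain))  # 1 – only the instigator acts
```

The point the model makes, mirroring the paragraph above: two crowds with nearly identical private preferences can produce wildly different outcomes, and whether a protest stays peaceful can hinge on a handful of individual thresholds.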

Pluralistic Ignorance: The Silent Misunderstanding

If collective conservatism explains why behaviors persist, pluralistic ignorance explains why no one stops them. Pluralistic ignorance occurs when most people in a group privately reject a norm but assume others accept it. As a result, everyone goes along with it.

This creates a powerful illusion: it looks like there is widespread agreement, when in reality, there isn’t.

Let’s take a politically sensitive example. In times of frustration with democratic systems – corruption, inefficiency, instability – some people begin to romanticize alternatives, such as monarchy. The idea gains visibility. Social media amplifies it. Public conversations pick it up.

Now ask: Do most people genuinely believe monarchy is the solution? Often, no.

Many understand the historical limitations and risks. But here’s where pluralistic ignorance steps in. Individuals see others expressing support and assume, “Maybe I’m the only one who disagrees.” So they stay quiet.

Meanwhile, others are thinking the same thing. But because no one speaks up, the illusion of support grows stronger. Eventually, more people jump on the bandwagon, not out of conviction, but out of perceived consensus. This is how weak ideas can appear strong.
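This feedback loop of perceived consensus can be sketched in a few lines of code. The following is an illustrative toy model, not anything from the research cited here; all the numbers (100 agents, 20 true supporters, a 15% comfort threshold) are made-up assumptions:

```python
def pluralistic_ignorance(n=100, true_supporters=20, threshold=0.15, rounds=5):
    """Toy model: most agents privately reject an idea, but each one
    publicly goes along once *visible* support passes their comfort
    threshold. Returns (private support, final public support) as fractions."""
    private = [i < true_supporters for i in range(n)]  # only a minority truly agrees
    public = list(private)                             # at first, only believers speak up
    for _ in range(rounds):
        visible = sum(public) / n                      # what the crowd appears to think
        # Dissenters conform once apparent support exceeds their threshold.
        public = [private[i] or visible > threshold for i in range(n)]
    return sum(private) / n, sum(public) / n

print(pluralistic_ignorance())  # (0.2, 1.0) – 20% private support, 100% apparent support
```

With only a fifth of agents genuinely on board, the visible support quickly snowballs to unanimity, which is exactly the illusion described above: the idea looks universally endorsed even though most people privately disagree.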

Why Do People Ignore Their Own Senses?

At the core of herd behavior is a simple but uncomfortable truth: People don’t always trust their own judgment, especially in groups.

There are a few key reasons for this:

1. Social Pressure: Humans are wired to belong. Being part of a group feels safe. Standing out feels dangerous. Even when evidence is clear, going against the group can feel like a social risk. So people conform.

2. Fear of Isolation: Disagreement can lead to rejection, criticism, or conflict. In uncertain environments, like political protests or public debates, people often choose silence over confrontation.

3. Assumption of Collective Wisdom: There’s a common belief: “If everyone is doing it, there must be a reason.” Sometimes that’s true. But not always. Crowds can be wrong. History has shown this repeatedly, from financial crashes to political movements that later collapse under scrutiny.

4. Emotional Contagion: Emotions spread quickly in groups. Anger, excitement, fear – they amplify each other. In a charged environment, rational thinking often takes a back seat. This is why peaceful gatherings can suddenly escalate. People get swept up in the moment.

Real-World Consequences of Herd Mentality

Herd behavior isn’t just a theory; it has real consequences.

  • In politics, it can lead to the rise of movements that lack strong foundations but gain momentum through perception.
  • In protests, it can turn legitimate causes into destructive events, undermining the very message people want to send.
  • In society, it can normalize behaviors that most individuals don’t actually support.

Perhaps the most concerning part is how invisible it can be. People often believe they are acting independently, when in fact they are responding to social cues.

Breaking the Cycle

If herd behavior is so powerful, is there any way to resist it? Yes, but it starts with awareness.

The first step is simple: pause and question.

  • Do I actually believe this?
  • Or am I assuming others do?
  • What evidence am I ignoring?

Even small acts of questioning can disrupt the cycle. Speaking up also matters. When one person voices doubt, it often gives others permission to do the same. What seemed like a united front can quickly reveal cracks. This is how change begins, not always with large movements, but with small shifts in perception.

Humans are not mindless followers. But we are deeply influenced by each other. Our decisions are shaped not just by facts, but by what we think others believe. That’s what makes herd behavior so powerful and so dangerous.

Collective conservatism keeps outdated patterns alive. Pluralistic ignorance hides the truth about what people actually think. Together, they create a world where people may act against their own understanding, simply because “everyone else” seems to be doing it.

But here’s the good news: These patterns can be broken. It starts with noticing. It grows with questioning. And it spreads when even one person chooses to think and act independently.

Because sometimes, the bravest thing you can do in a crowd…
is not follow it.

Featured Image Credits: Niranjan Shrestha / Associated Press

“I know, thanks” – The Earned Dogmatism Effect

Dogma, in the broad sense, is any belief held unquestioningly and with undefended certainty. It’s a point of view that people are expected to accept because it is put forth as authoritative, without adequate grounds. This helps us understand the ‘Earned Dogmatism Effect’ – which tells us that being labeled an “expert” may make us more closed-minded.

In a study spanning six experiments, Victor Ottati, Erika D. Price, and Chase Wilson of Loyola University Chicago, together with Nathanael Sumaktoyo of the University of Notre Dame, tested the Earned Dogmatism Hypothesis and concluded that social norms entitle experts to adopt a relatively dogmatic, closed-minded orientation. As a consequence, situations that engender self-perceptions of high expertise elicit a more closed-minded cognitive style.

Inflated Scores

In one of the experiments, participants were randomly assigned to either an easy (success) or a difficult (failure) political test. Each test contained fifteen multiple-choice questions. A typical question in the easy condition was “Who is the current President of the United States?”, while its counterpart in the difficult condition was “Who was Nixon’s initial Vice-President?”

Upon completing the test, participants were provided with false and inflated scores. Participants in the easy (successful) condition were told that they performed better than 86% of the other test takers; whereas participants in the difficult (failure) condition were told they performed worse than 86% of the test takers.

The participants in the difficult (failure) condition expressed greater political open-mindedness than those in the easy (successful) condition. This supported the idea that heightened self-perceived expertise creates a kind of cognitive blockade: people who came away with the impression that they were relatively expert on a topic (even though their scores were fabricated) were less willing to consider others’ viewpoints, exactly as the Earned Dogmatism Effect predicts.

President Obama’s Policies

In another test conducted by Ottati and his team, participants were asked to list either two (easy case) or ten (difficult case) policies implemented by the then US President, Barack Obama. Participants were randomly assigned to the easy or difficult case. In the easy case, participants were allowed to advance to the next screen as long as they described one policy. In the difficult case, participants were asked to write down ten policies signed by Obama, or, if they couldn’t name ten, to write “I don’t know” in the remaining text boxes.

The result? All participants in the easy condition named at least one policy, and more than half named two. In the difficult condition, participants named an average of four policies. As predicted by the Earned Dogmatism Effect, participants in the difficult condition reported greater political open-mindedness, while participants in the easy condition were less open to other political opinions.

The Conundrum of Confidence & Competence

The top-rated professor at Wharton for seven straight years, Adam Grant, says, “We need to stop mistaking confidence for competence.”

The problem is that we equate confidence with competence. But they are very different things. Unjustified confidence is a form of incompetence, and likewise, competence on its own does not justify confidence.


In Grant’s recently published book, “Think Again”, he describes two major syndromes – armchair quarterback syndrome and imposter syndrome – defined by the gap between these two things: confidence and competence.

When confidence exceeds competence, we fall victim to armchair quarterback syndrome and become blind to our own weaknesses. Its opposite, imposter syndrome, is when competence exceeds confidence.

So where do we begin then?

Between the two syndromes lies the sweet spot: the zone of confident humility. The right balance of confidence and competence brings out the best in us, allowing us to dodge the tricky earned dogmatism effect.

The Earned Dogmatism Effect - Confidence vs Competence
Confidence vs Competence, Adapted from Adam Grant’s book “Think Again”

No one likes an arrogant expert. Being definite, confident, and certain are all good things for conveying competence, but being dogmatic, narrow, and inflexible can limit the credibility and usefulness of the expert. 

To start with, we need to think deliberately about how we might be wrong. Of course, it is hard for our biased brains to scan for their own errors. To sidestep this bias, we can reframe the question as, “How can others be right?” Asking this question does not erase our wrongness, but it opens up a perspective in which two rights can coexist.

Countless studies have shown that most of us overestimate our understanding of various topics – everything from how a vacuum cleaner works to the details of political policies – a phenomenon known as ‘the illusion of explanatory depth’. It is essential to establish a realistic sense of our own knowledge. A simple way to address this intellectual overconfidence is to explain a relevant issue or topic to yourself or someone else in detail, either out loud or in writing. This exercise makes the gaps in our knowledge apparent, breaking the illusion of expertise.

To understand another way to combat our illusion of expertise, we can turn to one of the mental models offered by Josh Waitzkin, a chess prodigy and the author of “The Art of Learning”. Waitzkin says:

It’s so easy to think that we were in the dark yesterday but we’re in the light today… but we’re in the dark today too.

Josh Waitzkin, author, “The Art of Learning”

Just as we look back at our five-years-younger selves and laugh at how naive we were, we will look back at today five years from now and laugh again. We commonly say, “I didn’t know before, but I know today.” That very statement implies there is something we don’t know today that we will learn tomorrow. This realization helps us break the illusion and rise above the earned dogmatism effect.

In conclusion, the next time the back of your head tells you mid-conversation, “I know, thanks”, tell your biased brain, “You don’t know everything, so let me listen.”

Also read: Where even experts can go wrong?