Can too many cooks not spoil the broth?

“Too many cooks spoil the broth.” This popular idiom implies that when too many people work together or are involved in an activity, the final outcome or result becomes inferior.

While we may all agree with the idiom based on our own experiences, the popular management term “synergy” comes to mind. Flipping the perspective: shouldn’t multiple cooks be reinforcing the broth with their camaraderie and united strength, rather than spoiling it?

The proverb metaphorically suggests that employing excess resources causes inefficiency. In more literal terms, though, too many cooks might not always spoil the broth; it is too many inputs from too many people that derails progress. Contrary to popular belief, a large number of people can contribute a small number of inputs. It sounds counterintuitive, but it is not impossible to achieve.

The Earned Dogmatism Effect: Culprit for the Spoiled Broth

“We don’t know everything, and we probably never will.”

This sentence sums up the antidote to the earned dogmatism effect, which explains that as we become more experienced and knowledgeable, moving from amateur to expert, we tend to grow more closed-minded and adopt a relatively dogmatic orientation, inclined to lay down certain beliefs as incontrovertibly true. Little do we realize that multiple truths can exist.

Read in detail about ‘The Earned Dogmatism Effect’ here.

When the cooks start believing that their way is the “ultimate right” way, the broth gets spoiled. If the cooks (or anyone in general) become more self-aware of their own dogmas and their impact on the bigger picture (i.e., the spoiled broth), they can help prevent this accident. Of course, it goes against common human nature to hold back recommendations in such a quandary. But holding back should not be seen as a “sacrifice”, because one does not have to put forward ideas and recommendations all the time.

How to Save the Broth?

Here are a few steps (out of many) that can help us save the broth and move towards achieving ‘synergistic’ results.

1. Self-Awareness

Checking in with yourself always helps. As the popular saying goes, “the only way out is in“. It is important to understand how our thoughts, emotions, and actions keep evolving as we learn and grow through education and experience. The more we know, the more likely we are to fall into the diagnosis pitfall: “experts” are at times blinded by their past experiences and become fixated on treating a new event as identical to past ones. When we selectively focus only on the part of an event that triggers our inner advice monster, we succumb to this trap.

Read in detail about ‘The Diagnosis Pitfall’ here.

As credible and knowledgeable experts, we find it easy to advise people irrespective of whether they need the guidance. Taming our inner advice monster is essential, and so is understanding that advice giving in itself is not the problem.

Advice giving becomes problematic when i) we fail to understand the real depth of the challenge or problem, ii) we think our advice is amazing when it might not be (knock, knock: the Dunning-Kruger Effect), and iii) most frustratingly, we undercut the other person’s sense of confidence and autonomy by trying to be a messiah or savior with our advice.

Self-awareness helps us check the biases within us and realize that these cognitive biases can be problematic not just for us, but for the whole team and the outcome of the project or activity.

2. The golden pyramid of conscious and empathic listening

The golden pyramid of effective listening comprises three components: i) understanding, ii) humility, and iii) curiosity.

Listening to someone should be more about understanding than about responding or reacting. And to be clear, understanding does not mean agreement. We can develop the ability to listen to someone voice a completely opposing view without agreeing with them, while still trying to understand where they are coming from.

The second component is intellectual humility, since we cannot know everything in the world. Even if we have mastered cooking, we may not fully comprehend every dish in the world. This humility allows us to listen rather than add more inputs that spoil the broth. Finally, the third component is curiosity, which can be summed up in two words: asking questions.

Imagine you were about to clean the dishes voluntarily. Then someone comes in and tells you to clean the dishes. You might still do it, but not as wholeheartedly as you would have before. Now imagine that instead of ordering you, they come up to you and simply ask, “What are you about to do?” Your probable response would be, “I’m about to do the dishes.” Notice the difference in your thoughts, emotions, and actions while doing the dishes now.

For human beings, being asked questions opens up discussion and creates a platform to express ourselves. Asking questions signals that the other person is curious to listen to, know, and understand your views. Asking questions and staying curious in a conversation also pushes you into a listening zone. And as the principle of reciprocity goes, when you listen to someone, you get listened to as well.

3. Imposed gets opposed.

Finally, when we try to impose our ideas and recommendations on others, we can expect them to be challenged, criticized, and even opposed.

Imagine your vegan friend pressing you to give up your juicy steak and adopt a plant-based diet. Imagine a priest urging you to follow a certain religion, or trying hard to turn your atheist views toward believing in god. Imagine someone with a strong inclination towards alternative medicine trying to persuade you to follow their methods. When you feel these things being imposed on you, you won’t budge, no matter how logical their arguments are or how much evidence they present. They simply come across as “logic bullies“.

Any idea or change that is imposed will largely be opposed. To save the broth, we need to remain mindful that we are not imposing our inputs and recommendations on others. Understanding this simple rule helps us get things done through our teams and groups.

Even the recommendations in this article are not imposed; you are free to exercise your own will. Just don’t impose it on others.

“I know, thanks” – The Earned Dogmatism Effect

Dogma, in the broad sense, is any belief held unquestioningly and with undefended certainty. It is a point of view that people are expected to accept because it is put forth as authoritative without adequate grounds. This helps us understand the ‘Earned Dogmatism Effect‘, which tells us that being labeled an “expert” may make us more closed-minded.

In a study spanning six experiments, Victor Ottati, Erika D. Price, and Chase Wilson from Loyola University Chicago, together with Nathanael Sumaktoyo from the University of Notre Dame, tested the Earned Dogmatism Hypothesis and concluded that experts feel entitled to adopt a relatively dogmatic, closed-minded orientation. As a consequence, situations that engender self-perceptions of high expertise elicit a more closed-minded cognitive style.

Inflated Scores

In one of the experiments, participants were randomly assigned to either an easy (success) or a difficult (failure) political knowledge test. Each test had fifteen multiple-choice questions; a question in the easy condition was “Who is the current President of the United States?“, while its counterpart in the difficult condition was “Who was Nixon’s initial Vice-President?“.

Upon completing the test, participants were given false feedback: those in the easy (success) condition were told they had performed better than 86% of other test takers, whereas those in the difficult (failure) condition were told they had performed worse than 86% of test takers.

Participants in the difficult (failure) condition expressed greater political open-mindedness than those in the easy (success) condition. This showed that heightened self-perceived expertise itself creates a kind of cognitive blockade: people who had the impression that they were relative experts on a topic (even when that impression rested on inflated scores) were less willing to consider others’ viewpoints, just as the earned dogmatism effect predicts.

President Obama’s Policies

In another experiment conducted by Ottati and team, participants were asked to list either two (easy condition) or ten (difficult condition) policies implemented by the then US President, Barack Obama. Participants were randomly assigned to one of the two conditions. In the easy condition, participants could advance to the next screen once they had described at least one policy. In the difficult condition, participants were asked to list ten policies signed by Obama, or, if they couldn’t name ten, to write “I don’t know” in the remaining text boxes.

The result? All participants in the easy condition named at least one policy, and more than half named two. In the difficult condition, participants named an average of four policies. As predicted by the Earned Dogmatism Effect, participants in the difficult condition reported greater political open-mindedness, while participants in the easy condition were less open to other political opinions.

The Conundrum of Confidence & Competence

The top-rated professor at Wharton for seven straight years, Adam Grant, says, “We need to stop mistaking confidence for competence.”

The problem is that we equate confidence with competence. But they are very different things. Unjustified confidence is a form of incompetence, and, likewise, competence by itself does not justify confidence.

In Grant’s recently published book, “Think Again“, he describes two major syndromes, armchair quarterback syndrome and imposter syndrome, defined by the gap between these two things: confidence and competence.

When confidence exceeds competence, we fall victim to armchair quarterback syndrome and become blind to our own weaknesses. Its opposite, imposter syndrome, is where competence exceeds confidence.

So where do we begin then?

In between the two syndromes lies the sweet spot: the zone of confident humility. The right balance of confidence and competence brings out the best in us, allowing us to dodge the tricky earned dogmatism effect.

Confidence vs Competence, adapted from Adam Grant’s book “Think Again”

No one likes an arrogant expert. Being definite, confident, and certain helps convey competence, but being dogmatic, narrow, and inflexible can limit an expert’s credibility and usefulness.

To start with, we need to deliberately think about how we might be wrong. Of course, it is hard for our biased brain to scan itself for wrongness. To sidestep this, we can reframe the question as, “How can others be right?“. Asking that question does not eliminate our wrongness, but it helps us see a different perspective in which two rights can exist.

Countless studies have shown that most of us overestimate our understanding of various topics, everything from how a vacuum cleaner works to the details of political policies, a phenomenon known as ‘the illusion of explanatory depth’. It is essential to establish a realistic sense of our own knowledge. A simple way to address this intellectual overconfidence is to explain a relevant issue or topic to yourself or someone else in detail, either out loud or in writing. The exercise makes the gaps in our knowledge apparent, thereby breaking the illusion of expertise.

Another way to combat the illusion of expertise comes from a mental model offered by Josh Waitzkin, a chess prodigy and the author of “The Art of Learning“. Waitzkin says:

It’s so easy to think that we were in the dark yesterday but we’re in the light today… but we’re in the dark today too.

Josh Waitzkin, author, “The Art of Learning”

Just as we look back at our five-years-younger selves and laugh at how naive we were, five years from now we will look back at today and laugh again. We commonly say, “I didn’t know before, but I know today.” That only tells us there is something we don’t know today that we will know tomorrow. This realization helps us break the illusion and rise above the earned dogmatism effect.

In conclusion, the next time the back of your head tells you during a conversation, “I know, thanks”, tell your biased brain, “You don’t know everything, so let me listen.”

Also read: Where even experts can go wrong?