Author: David Robson
A psychological view of one's own leadership is very important in decision-making: specifically, a view that prevents you from falling into the "intelligence trap". What exactly is that?
In fact, the intelligence trap is a direct result of some people's inability to think outside their expectations: to imagine an alternative view of the world in which their decision is wrong rather than right. People with very high intelligence are more susceptible to this than people with moderate intelligence. Nobel laureates suffer from it so often that there is a term for it: "Nobel disease".
An active intervention against this pitfall can make use of so-called "evidence-based wisdom". This can be imparted at any age and to anyone, although people of moderate intelligence are the most likely to benefit from it.
A first step is to understand what wisdom is. One definition of "wisdom" goes back to Socrates: "he is wise who recognizes the limits of his own knowledge". Apart from that, good factual knowledge and training remain important.
Very dangerous is the fragility of the expert. By relying heavily on schemes and protocols, which one must have, an expert may have trouble adapting to changes in the environment; flexibility is therefore important. Personal bias is also a problem. One step towards a solution is to adapt one's own thinking. This can already be done by reading about it, with inspiring examples. Pairing an expert with a beginner, who brings a different, more detailed view, is also an advantage: the beginner does not yet know the patterns and can therefore spot and point out where the details of a case deviate from the general rule. Another technique is taking a distanced position, for example by listing and noting the important aspects of a situation or object, spread over several days. In addition, being able to listen to your own emotional compass is an advantage: it is about relating the events in your environment to your gut feeling in the right way. Finally, being able to spot deliberately crafted bullshit is a necessary skill; to this end, the author provides a list of methods by which false truths are sometimes told.
What can save us from the pitfall are: cognitive reflection, intellectual humility, active "open-minded" thinking, curiosity, refined emotional awareness and a "growth mindset". We find these qualities in the nine virtues of the Intellectual Virtues Academy. These are divided into three categories as follows:
Curiosity: the capacity to be amazed, to investigate, and to ask the "why" question. A thirst for understanding and a desire to explore.
Intellectual humility: the willingness to recognize one’s own limits and mistakes, regardless of intellectual status or prestige.
Intellectual autonomy: the capacity for active, autonomous, self-directed thinking. The ability to reason and think for oneself.
Attention: being 100% present with your thoughts on the matter at hand in the learning process, and keeping distractions at bay.
Intellectual carefulness: the ability to notice and avoid intellectual pitfalls. A commitment to accuracy.
Intellectual thoroughness: the ability to seek and find explanations. Dissatisfaction with rather apparent or superficial and (too) simple explanations. Reaching for a deeper meaning and understanding.
Open-mindedness: a capacity to think outside the box and to respond honestly to competing perspectives.
Intellectual courage: the readiness to persevere in thinking or communicating despite the risk of embarrassment or failure.
Intellectual tenacity: the will to face an intellectual challenge and struggle with it. Keep your eyes on the prize and don't give up.
These aspects of the eternal learning mindset apply to an individual, but how do you put together a ‘dream team’? It depends.
If you have a team where everyone does clearly separate work, without overlapping job content, then you can use a team of top players: there is no competition.
If you have to put together a team of competitors, where job content overlaps, then it is important that they are not all top players; about 60% top players is enough, because teamwork, being in tune with each other, weighs more heavily there.
With a crisis team it is the best of both worlds, depending on the type. In a CRT (Crisis Response Team), for example the company fire service team, it is clearly the second model. In a CET (Crisis Expert Team), where a team member may handle a file from A to Z, it may be the first. But in the CMT (Crisis Management Team), where there is little or no overlap between the participants, yet people still need to work together because they must be able to rely on each other's results, it is a pure cross: you need 100% top players, but they also have to be able to work with each other. In the latter case, intellectual humility is an issue, because this team often consists of high-profile people who "know very well what they are worth". Antidotes to this are exercises in which people learn to share information and are assessed on how well they integrate each other's point of view into their own thinking.
A useful change of mentality in this respect is to initiate discussions across the different layers of the hierarchy, and to recognize and hear people as experts in, for example, their own ideas about occupational safety.
Also not to be ignored is the use of statistics on near misses. After all, it has been shown statistically that a serious event is preceded by a number of near misses. That was the case with the Challenger and Columbia disasters (NASA), but Covid-19 too was preceded by, among others, SARS and MERS. In each case there was a failure to learn the lessons, to implement them for the future, or to persist in the lessons learned. For that you need the mentality of a "high reliability" organization. Research by Karl Weick & Kathleen Sutcliffe has shown that this type of organization exhibits the following characteristics:
Expect to fail: employees go to work knowing that every day can be a bad day, and the organization rewards them for reporting their mistakes.
Reluctance to simplify interpretations: employees are rewarded for questioning assumptions and for being skeptical of the wisdom of others.
Sensitivity to operations: team members keep communicating and interacting to increase their understanding of the situation and look for the actual origin of each anomaly.
Commitment to resilience: acquiring the knowledge and resources needed to bounce back after a negative event. This includes 'pre-mortems' and the discussion of near misses.
Respect for expertise: here the open communication between different layers of the hierarchy is important, and the intellectual humility of those at the top.