Stretching the Expertise Bubble

Be wary of all-knowing experts.

[Image: soap bubble. Source: Photo by netean netean from FreeImages]

The COVID-19 pandemic continues to provide no shortage of conflicting information, misinformation, and conspiracy theories that can quickly make your head hurt. There is also no shortage of experts willing to pontificate on any number of pandemic-related issues, from predicting likely mortality to explaining people’s behavioral choices.

Two recent articles demonstrate the willingness of experts to attempt to explain human behavior during the pandemic. The first focuses on differences in people’s willingness to self-sacrifice, and the second seeks to explain why people make bad decisions about COVID-19. In both cases, the authors claim expertise in their respective areas, applying their knowledge to explain people’s decisions and resulting behavior.

I’m not here to argue whether the authors are actual experts, as I have no evidence to suggest they aren’t. However, in both cases, the articles point toward an issue worthy of consideration when it comes to determining the experts’ credibility – the expertise bubble.

The Expertise Bubble

All experts’ knowledge and experience apply directly only to a finite set of subject matters and specific situations or settings – an expertise bubble. When experts offer insights about an issue that their expertise bubble overlaps, they tend to produce high-quality insights – the expertise sweet spot. When their expertise bubble doesn’t truly overlap, the quality of those insights may be much more variable due to their expertise gap [1]. The bigger the gap, the more likely they are to produce lower-quality insights, especially if the expert is unaware that a gap exists.

Consider, for example, someone very close to you, perhaps a spouse. You may have an extraordinary amount of experience with that individual in his/her role as spouse, to the point where you might very well be considered an expert on that person’s behavior (i.e., your spouse) within that context (i.e., the spousal role). If you have limited experience with your spouse in the context of other roles, such as his/her role as employee, then your expertise bubble may not extend as well (if at all) to your spouse’s behavior in those roles. Thus, if you were asked to offer insights about your spouse, those insights would be much higher quality when focusing on the spousal role (where your expertise bubble overlaps), but they would be much less likely to extend to that person’s work role (where you have an expertise gap) [2].

[Image: the expertise bubble as a Venn diagram. Source: Yours Truly]

We can visualize the expertise bubble as part of a Venn diagram. It forms the constraints around any one person’s legitimate expertise. Of course, there is quite a large world of problems and issues operating all around us, and the set of problems and contexts in which someone’s expertise fully applies is quite small [3].

Anyone with expertise possesses an expertise sweet spot. When the subject matter domain and the context of application align, an individual’s expertise can be quite insightful. However, the less overlap there is, the less likely it is that the expertise applies.

This does not mean that an academic researcher who studies a particular topic within a laboratory has no expertise to offer (as being an expert has more to do with knowledge of the broader available evidence base than it does with what that person specifically studies). What it does mean is that if the evidence base comes strictly from a laboratory, it may have limited applicability and, therefore, may offer limited insight [4].

[Image: expertise bubble diagram, expanded. Source: Yours Truly Again]

This issue, though, is not limited only to laboratory research. Just because someone is an expert on stress in the nursing profession does not mean those same insights would apply to another field, such as accounting. The further someone tries to stretch the limits of his/her expertise, the less likely that expertise is to offer valid insights.

That brings us back to the issue of COVID-19. Obviously, it is an issue that has presented a host of new challenges. Many individuals with legitimate expertise in some area have seemingly scrambled to identify COVID-related topics to which to apply their expertise. Some experts (e.g., virologists) may have subject matter knowledge, but it may not apply as well to this virus (given this virus’s idiosyncratic effects). Other experts (e.g., psychologists) may have expertise about human decision making, but limited expertise about the ways in which it applies during a global pandemic. Because of its novelty, the current pandemic has produced a situation in which no single expertise bubble truly overlaps it, leading many experts to overstretch their expertise bubbles.

The gap between various people’s expertise bubbles and the current situation differs widely. When that gap is larger, more stretching of the expertise bubble is required, resulting in potentially lower-quality insights. While some experts recognize this, tempering their assertions and claims when more stretching is required, many others fail to do so, making proclamations with far more certainty and confidence than they should.

Experts Are Not Know-It-Alls

There is a common fallacy to which people fall prey, especially in uncertain situations: appeal to authority. It is only a fallacy, though, when the claims being made by the expert are dubious or questionable (in other words, lack evidence to support them). Yet, all too often, people accept experts’ claims without questioning or considering, even in passing, the basis for those claims [5]. The more authority we perceive the expert to possess, the less likely we are to question the expert’s claims.

It is, of course, impossible to thoroughly research the claims of every expert out there. However, there are some heuristics we can use to help determine whether a bit of skepticism is warranted when we come across the claims made by experts.

This is by no means an exhaustive list, and some of these may be more valuable in some situations than others (as it is impossible to come up with a list that is universally applicable). You can, of course, feel free to suggest other useful heuristics in the comments.

[1] This may occur because their subject matter expertise doesn’t exactly relate and/or the contexts in which their expertise applies are different from the current context.

[2] Of course, if you also work with your spouse, your insights may be much more accurate.

[3] It was impossible to construct the diagram to scale, as either the size of the expertise bubble would have to be infinitesimally small or the size of the problem or issue bubble would have to take up an entire neighborhood worth of screens.

[4] This is a recurring issue in laboratory studies that seek to generalize to more practical contexts.

[5] Except when we tend to have strong opinions that contradict the expert, in which case we may be motivated to look for holes in those claims.