Saturday, February 17, 2024

We Keep Getting Signs of Expert Blunders

Recently I have been publishing a series of short videos on the topic of the errors of experts, which you can view by using the links here and here. My best post on this topic is my post "Disastrous Blunders of the Experts," which you can read here. The post discusses the following examples in which experts produced the most disastrous blunders:

Expert Fiasco #1: The Bay of Pigs Invasion

Expert Fiasco #2: The Vietnam War

Expert Fiasco #3: Eugenics

Expert Fiasco #4: The Housing Bubble of 2005, and Financial Meltdown of 2008

Expert Fiasco #5: Blunders of the Psychiatrists

Expert Fiasco #6: The Iraq War

Expert Fiasco #7: Vioxx

Expert Fiasco #8: The Opioid Overdose Epidemic

Expert Fiasco #9: Nuclear Weapons

The post also discusses quite a few other cases of the most disastrous blunders by experts, including the atomic testing fiasco (in which we were assured by experts that atomic testing was safe, with as many as 500,000 people dying from cancer caused by radiation from such testing), and also the COVID-19 blunders that probably resulted in more than 300,000 unnecessary deaths because of incompetent responses.  It is an open question whether the entire COVID-19 pandemic that killed millions was the result of overconfidence by gene-fiddling biology experts recklessly monkeying with viruses. 

It is not hard to find recent examples of blunders by experts.  One example is all the US military and US foreign policy experts who have unwisely supported providing super-destructive bombs to the State of Israel during its appalling bombing campaign in Gaza. That campaign has resulted in more than 27,000 civilian deaths, mostly of women and children, with innumerable others maimed or crippled, and as many as 500,000 people put at risk of starvation, homelessness, severe malnutrition or severe lung damage from breathing dust from all the destroyed buildings.  With the help of such a blunder, the appalling horrors of the October 2023 Hamas attack have been dwarfed by a savage slaughter twenty times bloodier. Another example can be found in the recent World Economic Forum meeting. 

The World Economic Forum provides an annual report on global risks. After a meeting in Switzerland in January, this expert group recently released its 2024 report on global risks.  Early 2024 is a time when the situation in the Middle East seems like a time bomb that may explode, leading to a new world war, with the situation in Ukraine posing a similar danger. So what has the World Economic Forum listed as the biggest current global risk?  The group of experts has decided that the biggest global risk over the next two years is: misinformation and disinformation. 

You are probably thinking: you must be joking. No, I'm not. This is literally what the World Economic Forum lists as the top global risk over the next two years.  Below is a visual from the report.  We see "misinformation and disinformation" at the top of the list of 2-year global risks.

[Visual from the 2024 Global Risks Report, showing "misinformation and disinformation" at the top of the list of 2-year global risks]

Here is the report's description of this "misinformation and disinformation" risk, which fails to make it sound like anything to lose much sleep over:

"Misinformation and disinformation (#1) is a new leader of the top 10 rankings this year. No longer requiring a niche skill set, easy-to-use interfaces to large-scale artificial intelligence (AI) models have already enabled an explosion in falsified information and so-called ‘synthetic’ content, from sophisticated voice cloning to counterfeit websites. To combat growing risks, governments are beginning to roll out new and evolving regulations to target both hosts and creators of online disinformation and illegal content. Nascent regulation of generative AI will likely complement these efforts. For example, requirements in China to watermark AI-generated content may help identify false information, including unintentional misinformation through AI hallucinated content. Generally however, the speed and effectiveness of regulation is unlikely to match the pace of development." 

This sounds like nothing much to worry about, compared to threats such as nuclear war, pandemics arising from labs engaging in reckless gene-splicing, and global warming. So what on Earth were these experts thinking when they decided to proclaim "misinformation and disinformation" as the #1 global risk? Eve Ottenberg speculates about a possibility:

"The assorted billionaire geniuses and official intellectual luminaries who gathered in Davos Switzerland January 15-19 proved, for those who doubted, that neither singly nor as a group could these...find their way out of a paper bag. Weighing the world’s fate in their well-manicured fingers, did they seem concerned about the Ukraine War morphing into nuclear catastrophe, or ditto for a wider Middle East war? They did not. Did they tear their beautifully coiffed hair and rend their designer ensembles over the prospect of the earth heating up like a pancake on a griddle due to uncontrolled climate change? A disaster caused by rich countries gobbling up and belching out burnt fossil fuels? Or did they mouth vague platitudes about extreme weather? Yes, bromides were their plat du jour.

The most immediate threat to humanity, according to this assemblage of well-groomed ... (who paid $52,000 apiece to join the World Economic Forum and then $19,000 each for a ticket to the Davos shindig), is misinformation or disinformation – you pick. After all, these bigwigs can take to their pate de foie gras-stocked bunkers if the planet succumbs either to nuclear winter or high temperatures inhospitable to human life. So of course, they regard speech, that is, free speech, as the main threat to their luxurious creature comforts. After all, someone might say something bad about these oligarchs! "

What we seem to have here is a great example of why experts so often go very badly wrong. Experts tend to exist in "echo chambers" where groupthink and herd effects may predominate. Such echo chambers can be found in the ivory towers of academia or in ideological enclaves such as the Pentagon and the White House. Within such an echo chamber, people will tend to hear only people who belong to the same belief community, people who share the same ideology. In such an ideological enclave, absurd or immoral opinions may be voiced, and may be regarded as great wisdom by anyone who looks around and sees other members of the belief community voicing the same opinion. 

Conferences have always been affairs that tend to promote dubious examples of groupthink. You can put a few hundred academics or a few hundred clergy members or a few hundred CEOs at some conference, and let them hobnob with each other. An attendee will soon get signals about which opinions are acceptable to the group and which opinions are taboo.  Such signals can come in a variety of ways, such as the amount of applause that a particular speech gets, or the snickers and groans that come from an audience when an unpopular opinion is stated. The conference has the effect of turning its attendees into rubber stamps of whatever silly idea is perceived to be the majority opinion of its attendees. Then some report may be issued announcing the opinions of the attendees. Such a report should be distrusted because of these sociological effects.  A better approach would be to poll the opinions of the attendees at the very beginning of the conference, before any sociological effects come into play. 

In the article here, we have an example of how sociological effects such as herding behavior can lead small groups of experts to produce blundering results. A conference of neuroscientists was called on the very narrow topic of "representational drift." So-called "representational drift" is a cover-story phrase that neuroscientists have invented to excuse the failure of neuroscientists to produce consistent reports in favor of supposed non-genetic representations they claim to see in the brain (things that are almost certainly the result of mere pareidolia, as I discuss here).  Early in the conference attendees were polled about their thoughts on this concept of "representational drift," and a significant fraction issued dismissive opinions, as if they thought that no such thing really existed.  But by the end of the conference, according to the article, the minority group had vanished, and the attendees reported agreement. This seems to be sociological effects at work.  The experts holding the minority opinion got the message -- fall in line, and go with the herd.  

It seems that by groupthink effects a consensus emerged. The consensus was the groundless opinion that there are non-genetic representations in the brain that are drifting about. A correct analysis would have been that there is no evidence for any non-genetic representations in the brain, and that the reported "drifting" occurs because of the unreliability of reports of such representations.  But we got a dumb opinion as the consensus. That often happens from little enclaves of experts where herd effects predominate. 

A key factor driving the opinions of experts is "social proof." Social proof is when the likelihood of someone adopting a belief or doing something becomes proportional to how many other people adopted that belief or did that thing. If we were to write a kind of equation for social proof, it would be something like this:

Social proof of a belief or action (s) = the number of people believing or doing it (x), multiplied by the average prestige of such people (y), multiplied by how much such people are like yourself (z). In short: s = x × y × z.

If lots of people adopt a belief or do something, there will be a larger amount of social proof. If some of those people are famous, popular, prestigious or influential, there will be a larger amount of social proof. If some or lots of those people are like yourself, there will be a larger amount of social proof. So, for example, we might not be influenced if told that most Mongolians water their lawns every week; but if we live on Long Island, and we hear that most Long Island residents water their lawns every week, we may well start doing the same.
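The multiplicative relationship above can be sketched as a toy calculation. This is purely illustrative: the function name, the specific numbers, and the assumption that prestige and similarity are scored on a 0-to-1 scale are my own choices for the sketch, not part of any validated model.

```python
def social_proof(num_adopters: int, avg_prestige: float, similarity: float) -> float:
    """Toy multiplicative model of social proof: s = x * y * z.

    avg_prestige and similarity are assumed scores in [0, 1].
    The absolute scale of the result is meaningless; only
    comparisons between scenarios are informative.
    """
    return num_adopters * avg_prestige * similarity

# Many distant strangers: Mongolian lawn-waterers, from a Long Islander's view
distant = social_proof(num_adopters=1000, avg_prestige=0.2, similarity=0.05)

# Fewer, but similar, people: Long Island neighbors who water their lawns
neighbors = social_proof(num_adopters=200, avg_prestige=0.3, similarity=0.9)

# Despite involving fewer people, the similar group exerts more social proof
print(distant < neighbors)
```

The point of the sketch is that the similarity factor can dominate: a smaller group of people "like yourself" can outweigh a much larger group of distant strangers.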

Given these factors, it is rather easy to see how erring overconfidence communities can get started in the academic world, even when the communities are rather tiny. A physics professor may advance some far-fetched theory, and get a few supporters among other physics professors. Each of these professors has high prestige, since our society adulates physics professors. If you are another physics professor, you may be drawn into the overconfidence community, which will already have two of the three “social proof” factors in its favor – because the few adherents are just like you, and are high-prestige people. So even with only a few believers, it may be possible for the overconfidence community to get started. The more people who start believing in the idea, the more of a “social proof” snowball effect is created.

When you belong to an overconfidence community, it can cast a spell on you, and make you accept bad reasoning you would never accept if you were outside of the community. Once you leave the community, there can be a kind of “the scales fall from your eyes” effect, and you can ask yourself: what was I thinking when I believed that?  In the future, as it becomes ever more clear that the members of overconfidence communities in academia are making unsound claims, and pretending to know things they don't actually know, there will be many people who drift out of such overconfidence communities, and experience “the scales fall from your eyes” moments. And in such moments the questions they will ask will be something like “what the hell was I thinking?” or “how could I have believed in something so unbelievable?”

A recent survey of experts about the origins of COVID-19 gives us some reasons for doubting the opinions of experts. The survey (mainly of virologists and epidemic experts, with about 15% being biosecurity experts) found that 21.5% thought that the cause of COVID-19 was a "research-related accident," with 77% saying a "natural zoonotic" event was the origin.  Anyone considering such a survey should remember that the community of virologists and epidemic experts is a vested interest, a group of stakeholders with career stakes affecting whether they would proclaim that COVID-19 had natural causes. The survey (Annex Table F3) asked the respondents whether they were familiar with some of the key pieces of literature used by advocates of the different positions. The survey found that the vast majority of the experts (78%) were not familiar with one of the chief items of evidence used by advocates of the lab leak theory (the DEFUSE grant proposal that proposed risky gene-splicing research that might have produced something like the COVID-19 virus).  

We are left with an impression of experts who form an opinion when there are two sides, but who don't bother to study the main evidence presented by those who oppose the opinion they hold. Nothing could be less surprising. A failure to study evidence that conflicts with your opinions is one of the chief characteristics of experts. For example, in general neuroscientists who believe that you are just your brain and that you don't have a soul tend to be people who have never bothered to seriously study the very abundant evidence suggesting that you do have a soul (such as the evidence for apparitions, out-of-body experiences and anomalous knowledge acquisition by mediums).  In the same survey, 33% of the respondents stated that they were familiar with a nonexistent study that the respondents had been asked about to test their honesty. 
