
Our future, our universe, and other weighty topics


Tuesday, January 20, 2026

Problems With the Vickers Paper Polling Scientists About Extraterrestrial Life

In 2025 there appeared the paper "Surveys of the scientific community on the existence of extraterrestrial life" by Vickers et al. The paper reported a survey on the topic of whether scientists believe in extraterrestrial life. Below are the results from questions in which scientists were asked whether they think that on other planets there is life, complex life, or intelligent life.

[Figure: poll results on scientist belief in extraterrestrial life]

There are several problems with the paper and its survey, which I will now discuss.

Problem #1: A Voluntary Response Email Survey 

The survey worked this way: scientists were sent emails asking them to take the survey, and 44% of the scientists replied by answering it. But it is known that surveys of this type tend to be unreliable measurements of opinion, particularly when they are about some controversial topic. The problem is that there may be a much higher tendency for people who believe in some controversial theory to respond to a survey that asks only about their belief in that theory, rather than ignore such a survey.

For example, imagine you send a survey to scientists that is only on the topic of controversial Theory X, a theory which most scientists scorn or have no knowledge of. It could be that 80% of the scientists getting this email decide to ignore it, thinking to themselves something like this: "Theory X? To hell with that." But it might be very different for some small minority of scientists who believe in Theory X. After getting the email, they may recognize the survey as an opportunity to improve the status of Theory X in the scientific community; and they may therefore be much more likely to respond. 
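To make the arithmetic concrete, here is a minimal sketch of this kind of nonresponse bias, written in Python with made-up numbers (they are purely hypothetical and are not figures from the Vickers et al. survey):

```python
# Hypothetical illustration of voluntary-response (nonresponse) bias.
# All numbers below are invented for the sake of the example; they are
# not taken from the Vickers et al. survey.

population = 10_000          # scientists who receive the survey email
true_believer_share = 0.20   # fraction who actually believe in "Theory X"

# Suppose believers are far more motivated to answer a survey that is
# only about their pet theory than non-believers are.
response_rate_believers = 0.80
response_rate_skeptics = 0.20

believers = population * true_believer_share
skeptics = population - believers

responding_believers = believers * response_rate_believers
responding_skeptics = skeptics * response_rate_skeptics

observed_belief = responding_believers / (responding_believers + responding_skeptics)

print(f"True share of believers:              {true_believer_share:.0%}")
print(f"Share of believers among respondents: {observed_belief:.0%}")
```

With those assumed response rates, only 20% of the population believes in Theory X, yet half of the people who bother to respond are believers, so the survey makes the theory look far more popular than it really is.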

I remember a time more than 40 years ago when I worked two full-time jobs for a period of months. My second full-time job was a temporary job working for the US Census Bureau in Boston. The US government wanted to find out what percentage of the population used the fishing and wildlife services it supported. Workers like me were given stacks of survey forms, each of which had the name of a randomly selected US citizen. My job was to call up such people and insist that they answer the survey's questions over the phone, questions asking how often they used the government-supported fishing and wildlife facilities. I would have to keep calling back later if someone claimed to be too busy to answer when I called. I would very frequently get responses like this from annoyed people:

"Why are you bothering to ask me about such things? I have never gone fishing in my life, nor have I ever gone hunting. So the government shouldn't be asking me about such things!"

I had to explain to such people the concept of a survey of random people: that the only scientifically valid way to find out what percentage of people used the government's fishing and wildlife facilities was to ask randomly selected people about this topic, to keep asking until all of them answered the questions, and to be just as interested in getting "no" answers as "yes" answers. The US Census Bureau knew how to do a scientifically valid survey. The people at that bureau knew that it never would have been valid to just advertise some survey about fishing and wildlife, and to record what percentage of the people choosing to do the survey said that they used fishing and wildlife facilities. If the survey had been done that way, most of the participants might have been people who loved fishing and hunting. The survey might then have given very misleading results, perhaps suggesting that most people in the US used the government's fishing and wildlife facilities, when in fact only a small minority of the population used such facilities.

Too bad Vickers et al. did not seem to have the same knowledge of the proper way to determine what percentage of scientists thinks a particular thing. You cannot find out what percentage of a scientific community believes in a theory by sending its members a voluntary survey on only that theory, one that will tend to get more replies from those who believe in the theory. For example, if you send physicists a survey about their belief in the controversial theory called string theory, there will tend to be a much higher response rate from people who believe in string theory than from those who do not.

One way to reduce the problem mentioned above is to do a voluntary email survey asking about many different things, such as a survey asking 50 diverse questions. With such a survey there will not tend to be an effect in which believers in some controversial theory are much more likely to respond. 

Problem #2: Slanting in the Questions

Professional pollsters know that the way that a survey question is asked can have a very great influence on what kind of response people give. For example, here are two questions about gun control you could ask people in the USA:

Question 1: Do you think people should give up their right to bear arms given them by the Second Amendment of the US Constitution?

Question 2: Do you support gun control to reduce all these terrible mass shootings that keep happening?

Question 1 is a question about gun control, one slanted to produce a "no" answer. Question 2 is also a question about gun control, but it is slanted to produce a "yes" answer.

Now, what are the questions in the Vickers survey? The paper lists these survey questions:

(Statement S1 – ‘Life’): It is likely that extraterrestrial life (of at least a basic kind) exists somewhere in the universe.

(Statement S2 – ‘Complex Life’): It is likely that extraterrestrial organisms significantly larger and more complex than bacteria exist somewhere in the universe.

(Statement S3 – ‘Intelligent Life’): It is likely that extraterrestrial organisms with advanced cognitive abilities comparable to or superior to those of humans exist somewhere in the universe.

This set of questions is a terrible way to survey people about extraterrestrial life. The questions have a very strong bias, because they doubly suggest the idea that the least complex extraterrestrial life would be simple. Humans know of no type of life that is simple. Even one-celled bacteria are very complex organisms that require hundreds of types of protein molecules, each of them a complex invention requiring a very special sequence of hundreds of specially arranged amino acids.

The first way in which the survey questions above slant things and introduce bias is by using the phrase "of at least a basic kind" in Statement S1. The second way in which they slant things and introduce bias is by using the phrase "complex life" in the label for Statement S2. This indirectly suggests that what is being talked about in Statement S1 is life that is not complex. But all known life is very complex. So we have a severely slanted set of questions in which subjects are asked whether they agree that life "of at least a basic kind" exists, with the very misleading insinuation that such life could exist without being complex. Using such phraseology introduces severe bias, as if the scientists running the survey were trying to gin up as high a response as possible to the first question.

Then there is the fact that the survey suggests a particular opinion rather than asking a neutral question. Instead of neutral questions asking things like "is it likely or not likely that...", we have questions stating a particular belief and asking whether respondents agree. It is well known among pollsters that questions asked that way tend to produce higher agreement with the stated belief. For example, a survey asking "do you agree that Donald Trump is a good president?" will get higher rates of agreement than one asking "what type of president is Donald Trump -- good, bad, or in the middle?"

What would an objective, well-designed survey have looked like? It might have included neutral questions like these:

Question 1: What do you think about the chances that there is microscopic life on some other planet?

Answer 1: Likely

Answer 2: Unlikely

Answer 3: I don't know/no opinion

Question 2: What do you think about the chances that there is visible multicellular life on some other planet, such as organisms with organs?

Answer 1: Likely

Answer 2: Unlikely

Answer 3: I don't know/no opinion

Question 3: What do you think about the chances that there is intelligent life on some other planet, life as intelligent as humans or more intelligent?

Answer 1: Likely

Answer 2: Unlikely

Answer 3: I don't know/no opinion

Problem #3: No Secret Ballot

As a general rule, we should pay little attention to any survey in which scientists are asked whether they believe in something that allegedly most scientists believe in, whenever the survey is not a secret ballot poll. The reason is that when a poll is not a secret ballot survey, scientists may answer in the way that they think they are supposed to answer, fearing that they may get in trouble if they do not go along with the majority.

We have no evidence that the survey in the paper "Surveys of the scientific community on the existence of extraterrestrial life" was a secret ballot survey. In this case the lack of a secret ballot is a relatively minor problem, because there is no widespread expectation that scientists are supposed to think one thing or another about whether extraterrestrial life exists.

Problem #4: Describing the Results With the Word "Consensus"

In my post "So Much Misleading Talk Occurs in Claims of a Scientific Consensus," I discuss the word games that scientists play when they use the notoriously slippery, problematic and ambiguous word "consensus."  The problem is that "consensus" is a word with a double meaning. The Merriam-Webster dictionary gives us two definitions of "consensus" that disagree with each other. The first definition is "general agreement; unanimity." The second definition is "the judgment arrived at by most of those concerned." The first definition specifies 100%, and the second definition merely means 51% or more. 

Scientists very frequently exploit this ambiguity in a misleading way, just as they very frequently exploit ambiguity in the word "evolution" (another term with different meanings). Scientists may brag about a "consensus" on some topic when there is no evidence that even 90% of scientists agree on the topic.  They thereby are trying to create the impression in many people's minds that scientists agree on some topic, when no such agreement exists. 

We have an example of this in the paper I am discussing. It ends up bragging of "a significant consensus that extraterrestrial life likely does exist," even though its poll shows a fair fraction of the responding astrobiologists failing to declare that they "agree" or "strongly agree" with the idea that extraterrestrial life exists, and about a third of the responding astrobiologists failing to declare that they "agree" or "strongly agree" with the idea that "complex" extraterrestrial life exists. Given the ambiguity in the word "consensus," and the fact that half of the public interprets that word to mean "unanimity of opinion," such a belief level should never be described as a consensus; it should merely be described as a majority opinion. In fact, because of the problems discussed above, we do not even have strong evidence here that most astrobiologists or biologists agree that extraterrestrial life probably exists. A well-designed set of neutrally worded poll questions, without the question bias documented above, might well have shown fewer than half of the respondents agreeing that visible extraterrestrial life exists.

A poll like this would not mean much.
