This year scientist Fred C. Adams posted to the physics preprint server a massive paper on the topic of cosmic fine-tuning (a topic I have often discussed on this blog). The paper of more than 200 pages (entitled "The Degree of Fine Tuning in our Universe -- and Others") describes many cases in which our existence depends on some number in nature having (against all odds) a value allowing the universe to be compatible with the existence of life. There are several important ways in which the paper goes wrong or creates inappropriate impressions. I list them below.
Problem #1: Using habitability as the criterion for judging fine-tuning, rather than “something as good as what we have.”
Part of the case for cosmic fine-tuning involves the existence of stars. It turns out that some improbable set of coincidences has to occur for a universe to have any stars at all, and a far more improbable set of coincidences has to occur for a universe to have very stable, bright, long-lived stars like our sun.
Adams repeatedly attempts to convince readers that the universe could have had fundamental constants different from what we have, and that the universe would still have been habitable because some type of star might have existed. When reasoning like this, he is using an inappropriate rule-of-thumb for judging fine-tuning. Below are two possible rules-of-thumb when judging fine-tuning in a universe:
Rule #1: Consider how unlikely it would be that a random universe would have conditions as suitable for the appearance and long-term survival of life as we have in our universe.
Rule #2: Consider only how unlikely it would be that a random universe would have conditions allowing some type of life.
Rule #1 is the appropriate rule to use when considering the issue of cosmic fine-tuning. But Adams seems to be operating under a rule-of-thumb such as Rule #2. For example, he tries to show that a universe significantly different from ours might have allowed red dwarf stars to exist. But such a possibility is irrelevant. The relevant consideration is: how unlikely is it that we would have gotten a situation as fortunate as the physical situation that exists in our universe, in which stars like the sun (more suitable for supporting life than red dwarf stars) exist? Since "red dwarfs are far more variable and violent than their more stable, larger cousins" such as sun-like stars (according to this source), we should be considering the fine-tuning needed to get stars like our sun, not just any type of star such as a red dwarf.
I can give an analogy. Imagine I saw a log cabin in the woods. If I am judging whether this structure is the result of chance or design, I should be considering how unlikely it would be that something as good as this might arise by chance. You could make all kinds of arguments trying to show that a log structure much worse than the one observed would not be too unlikely to arise by chance (such as arguments showing that it wouldn't be too hard for a few falling trees to make a primitive rain shelter). But such arguments are irrelevant. The relevant thing to consider is: how unlikely would it be that a structure as good as the one observed would appear by chance? Similarly, since we live in a universe that gives humans an opportunity to continue living on this planet for billions of years under stable solar radiation, the relevant consideration is: how unlikely is it that a universe as physically fortunate as ours would exist by chance? Discussions about how microbes might exist in a very different universe (or intelligent creatures living precariously near unstable suns) are irrelevant, because such universes do not involve physical conditions as good as the ones we have.
Problem #2: Charts that create the wrong impression because of a “camera near the needle hole” and a logarithmic scale.
On page 29 of the paper Adams gives us a chart showing some fine-tuning needed for the ratio between what is called the fine-structure constant and the strong nuclear force. We see the following diagram, using a logarithmic scale that exaggerates the relative size of the shaded region. If the ratio had been outside the shaded region, stars could not have existed in our universe.
This doesn't seem like that lucky a coincidence, until you consider that creating a chart like this is like trying to make a needle hole look big by putting your camera right next to the needle hole. We know of no theoretical reason why the ratio described in this chart could not have been anywhere between .000000000000000000000000000000000000000000001 and 1,000,000,000,000,000,000,000,000,000,000,000. So by using such a narrow scale, the chart gives us the wrong idea. In a less misleading chart that used a vastly bigger overall scale, we would see this shaded region as merely a tiny point on the chart, occupying less than a millionth of the total area of the chart. Then we would realize that a fantastically improbable coincidence is required for nature to have threaded this needle hole. Adams also uses a logarithmic scale for figure 5, to make another such tiny "needle hole" (one that must be threaded for life to exist) look relatively big.
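To see how much work the scale choice is doing, consider a small sketch like the one below. The habitable window's bounds here are hypothetical placeholders (the full range is the one just quoted), so this only illustrates the general point about scales:

```python
# A sketch of the "camera near the needle hole" point. The habitable window's
# bounds below are hypothetical placeholders; the full conceivable range is
# the one quoted in the text above.
import math

full_lo, full_hi = 1e-45, 1e33   # conceivable range for the ratio, per the text
win_lo, win_hi = 0.1, 10.0       # hypothetical habitable window (placeholder)

# Fraction of the axis the window occupies on a logarithmic scale:
log_fraction = (math.log10(win_hi) - math.log10(win_lo)) / (
    math.log10(full_hi) - math.log10(full_lo))

# Fraction of the axis the window occupies on a linear scale:
linear_fraction = (win_hi - win_lo) / (full_hi - full_lo)

print(f"log-axis fraction:    {log_fraction:.3f}")     # ~0.026 -- looks sizable
print(f"linear-axis fraction: {linear_fraction:.1e}")  # ~9.9e-33 -- a needle hole
```

On the logarithmic axis the window looks like a few percent of the chart; on a linear axis spanning the full conceivable range, it all but vanishes.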
Needle holes can look big when your eye is right next to them
Problem #3: An under-estimation of the strong nuclear force's sensitivity.
On page 30 of the paper, Adams argues that the strong nuclear force (the strong coupling constant) isn't terribly fine-tuned, and might vary by as much as a factor of 1000 without preventing life. He refuses to accept what quite a few scientists have pointed out: that it would be very damaging for the habitability of the universe if the strong nuclear force were only a few percent stronger. Quite a few scientists have pointed out that in such a case the diproton (a nucleus consisting of two protons and no neutrons) would be stable, and that would drastically affect the nature of stars. Adams' attempt to dismiss such reasoning falls flat. He claims on page 31 that if the diproton existed, it would cause only a “modest decrease in the operating temperatures of stellar cores,” but then tells us that such temperatures would fall from about 15 million degrees to about one million, a fifteen-fold change, which is hardly modest.
Adams ignores the fact that small changes in the strong nuclear force would probably rule out the lucky carbon resonances that are necessary for large amounts of carbon to exist. Also, Adams ignores the consideration that if the strong nuclear force had been much stronger, the early universe's hydrogen would have been converted into helium, leaving no hydrogen to eventually allow the existence of water. In this paper, two physicists state, “we show that provided the increase in strong force coupling constant is less than about 50% substantial amounts of hydrogen remain.” What that suggests is that if the strong force had been more than 50% greater, the universe's habitability would have been greatly damaged, and life probably would have been impossible. That means the strong nuclear force is roughly a thousand times more sensitive and fine-tuned than Adams has estimated.
Problem #4: An under-estimation of the sensitivity of the fine structure constant and of the quark mass.
On page 140 of the paper, Adams suggests that the fine structure constant (related to the strength of the electromagnetic force) isn't terribly fine-tuned, and might vary by as much as a factor of 10,000 without preventing life. His previous discussion of the sensitivity of the fine structure constant involved merely a discussion of how a change in the constant would affect the origin of elements in the Big Bang. But there are other reasons for thinking that the fine structure constant is very fine-tuned, reasons Adams hasn't paid attention to. A stellar process called the triple-alpha process is necessary for large amounts of both carbon and oxygen to be formed in the universe. In their paper “Viability of Carbon-Based Life as a Function of the Light Quark Mass,” Epelbaum and others state that the “formation of carbon and oxygen in our Universe would survive a change” of about 2% in the quark mass or about 2% in the fine-structure constant, but that “beyond such relatively small changes, the anthropic principle appears necessary at this time to explain the observed reaction rate of the triple-alpha process.” This is a sensitivity more than 10,000 times greater than Adams estimates. It's a case that we can call very precise fine-tuning. On page 140, Adams gives estimates of the biological sensitivity of the quark masses, but they ignore the consideration just mentioned, and under-estimate the sensitivity of these parameters.
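To make the gap concrete, here is a back-of-the-envelope sketch using only the tolerance figures quoted in Problems #3 and #4. The "width ratio" is a crude measure of my own (Adams' multiplicative tolerance divided by the cited fractional tolerance), not a calculation from either paper:

```python
# A crude comparison of Adams' claimed habitable windows with the far tighter
# limits from the papers cited in Problems #3 and #4. The ratio computed is
# a rough measure: Adams' tolerance divided by the cited fractional tolerance.
cases = {
    # name: (Adams' tolerance as a factor, cited limit as a factor)
    "strong coupling constant": (1000.0, 1.50),   # factor ~1000 vs. ~50% increase
    "fine-structure constant":  (10000.0, 1.02),  # factor ~10,000 vs. ~2% change
}

for name, (adams_factor, cited_factor) in cases.items():
    ratio = adams_factor / (cited_factor - 1.0)
    print(f"{name}: Adams' window is ~{ratio:,.0f}x wider than the cited limit")
```

Running this gives roughly a 2,000-fold gap for the strong coupling constant and a 500,000-fold gap for the fine structure constant, which is why I say Adams has under-estimated these sensitivities by three or more orders of magnitude.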
Problem #5: A misleading table that tries to make radio fine-tuning seem more precise than examples of cosmic fine-tuning.
On page 140 Adams gives us the table below:
The range listed in the third column represents what Adams thinks is the maximum multiplier that could be applied to these parameters without ruling out life in our universe. One problem is that some of the ranges listed are way too large, first because Adams is frequently being over-generous in estimating how much such things could vary without worsening our universe's physical habitability (for reasons I previously discussed), and second because Adams is using the wrong rule for judging fine-tuning, considering “universes that allow life” when he should be considering “universes as habitable and physically favorable as ours.”
Another problem is that the arrangement of the table suggests that the parameters discussed are much less fine-tuned than a radio that is set to just the right radio station, but most of the fundamental constants in the table are actually far more fine-tuned than such a radio. To clarify this matter, we must consider the possibility spaces in these cases. A possibility space is the range of possible values that a parameter might have. One example of a possibility space is the possible ages of humans, which is between 0 and about 120. For an AM radio the possibility space is between 535 and 1605 kilohertz.
What are the possibility spaces for the fundamental constants? For the constants involving one of the four fundamental forces (the gravitational constant, the fine-structure constant, the weak coupling constant and the strong coupling constant), we know that the four fundamental forces differ by about 40 orders of magnitude in their strength. The strong nuclear force is about 10,000,000,000,000,000,000,000,000,000,000,000,000,000 times stronger than the gravitational force. So a reasonable estimate of the possibility space for each of these constants is to assume that any one of them might have had a value up to 10^40 times smaller or larger than the actual value of the constant.
So the possibility space involving the four fundamental coupling constants is something like 1,000,000,000,000,000,000,000,000,000,000,000,000 times larger than the possibility space involving an AM radio. So, for example, even if the strong coupling constant could have varied by a factor of 1000, as Adams claims, and still have allowed for life, for it to have such a value would be a case of fine-tuning more than 1,000,000,000,000 times greater than an AM radio that is randomly set on just the right frequency. The range of values between .001 and 1000 times the actual value of the strong nuclear force is just the tiniest fraction of a possibility space in which the strong nuclear force can vary by a factor of 10^40. It's the same situation for the gravitational constant and the fine-structure constant (involving the electromagnetic force). Even if we go by Adams' severe under-estimations of the biological sensitivity of these constants, and use the estimates he has made, we still have fine-tuning trillions of times more unlikely to occur by chance than a radio being set on just the right station by chance, because of the gigantic possibility space in which fundamental forces might vary by 40 orders of magnitude.
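The comparison can be made explicit with a small sketch, assuming (as this post does) a uniform chance over a linear possibility space. The 10 kHz station width below is my own illustrative assumption, not a figure from Adams' paper:

```python
# A sketch of the probability comparison above, assuming a uniform chance
# over a linear possibility space. The 10 kHz station width is an assumed
# illustrative value, not a figure from Adams' paper.
am_band = 1605.0 - 535.0             # kHz: the AM possibility space quoted above
station_width = 10.0                 # kHz: assumed width of one AM station
p_radio = station_width / am_band    # chance a random dial lands on the station

space = 1e40      # assumed possibility space for a coupling constant (see above)
window = 1000.0   # Adams' claimed habitable window, in multiples of actual value
p_constant = window / space          # chance a random value lands in the window

print(f"radio:    1 in {1 / p_radio:,.0f}")    # ~1 in 107
print(f"constant: 1 in {1 / p_constant:.0e}")  # ~1 in 1e+37
print(f"the constant is {p_radio / p_constant:.0e} times less likely")  # ~9e+34
```

So even on Adams' own generous window, the coupling constant's fine-tuning beats the radio's by some 35 orders of magnitude, which is why "trillions of times" above is an understatement.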
A similar situation exists in regard to what Adams calls on page 140 the vacuum energy scale. This refers to the density of energy in ordinary outer space such as interstellar space. This is believed to be extremely small but nonzero. Adams estimates that it could have been 10 orders of magnitude larger without preventing our universe's habitability. But physicists know of very strong reasons for thinking that this density should actually be 10^60 times or 10^120 times greater than it is (it has to do with all of the virtual particles that quantum field theory says a vacuum should be packed with). So for the value of the vacuum energy density to be as low as it is would seem to require a coincidence with a likelihood of less than 1 in 10^50. Similarly, if a random number generator is programmed to pick a random number between 1 and 10^60, with an equal probability of any number in that range being chosen, there is only a 1 in 10^50 chance of the number being smaller than 10^10.
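The random-number analogy can be computed directly (a Monte Carlo simulation would essentially never produce so low a draw, so we just take the ratio of the ranges):

```python
# The random-number analogy above, computed directly; a Monte Carlo draw
# would essentially never land this low, so we just take the ratio of ranges.
space = 1e60    # QFT-suggested possibility space: up to 10^60 x observed value
window = 1e10   # Adams' habitable window: ~10 orders of magnitude above observed
p = window / space   # chance a uniform random draw falls within the window
print(f"1 in {1 / p:.0e}")   # 1 in 1e+50
```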
Adams' table has been cleverly arranged to give us the impression that the fundamental constants are less fine-tuned than a radio set on the right station, but the opposite is true. The main fundamental constants are trillions of times more fine-tuned than a radio set on the right station. The reason is partially because the possibility space involving such constants is more than a billion quadrillion times larger than the small possibility space involving what station a radio might be tuned to.
Problem #6: Omitting the best cases from his summary table.
Another huge shortcoming of Adams' paper is that he has omitted some of the biggest cases of cosmic fine-tuning from his summary table on page 140. One of the biggest cases of fine-tuning involves the universe's initial expansion rate. Scientists say that at the very beginning of the Big Bang, the universe's expansion rate was fine-tuned to better than 1 part in 10^50, so that the universe's density was very precisely equal to what is called the critical density. If the expansion rate had not been so precisely fine-tuned, galaxies would never have formed. Adams admits this on pages 40-41 of his paper. There is an unproven theory designed to explain away this fine-tuning, in the sense of imagining some other circumstances that might account for it. But regardless of that, such a case of fine-tuning should be included in any summary table listing the universe's fine-tuning (particularly since the theory designed to explain it away, the cosmic inflation theory, has many fine-tuning requirements of its own, and would not result in an actual reduction of the universe's overall fine-tuning even if it were true). So why do we not see this case in Adams' summary table entitled "Range of Parameter Values for Habitable Universe"?
Adams' summary table also makes no mention of the fine-tuning involving the mass of the Higgs boson, what is called "the hierarchy problem." This is a case of fine-tuning that so bothered particle physicists that many of them spent decades creating speculative theories such as supersymmetry designed to explain away this fine-tuning, which they sometimes said was so precise it was like a pencil balanced on its point. Referring to this matter, this paper says, "in order to get the required low Higgs mass, the bare mass must be fine-tuned to dozens of significant places." This is clearly one of the biggest cases of cosmic fine-tuning, but Adams has conveniently omitted it from his summary table.
Then there is the case of the universe's initial entropy, another case of very precise fine-tuning that Adams has ignored in his summary table. Cosmologists such as Roger Penrose have stated that for the universe to have the relatively low entropy it now has, the entropy at the time of the Big Bang must have been fantastically small, completely at odds with what we would expect by chance. Only universes starting out in an incredibly low entropy state can end up forming galaxies and yielding life. As I discuss here, in a recent book Penrose suggested that the initial entropy conditions were so improbable that it would be more likely that the Earth and all of its organisms would have suddenly formed from a chance collision of particles from outer space. This gigantic case of very precise cosmic fine-tuning is not mentioned in Adams' summary table.
Then there is the case of the most precise fine-tuning known locally in nature, the precise equality of the absolute value of the proton charge and the absolute value of the electron charge. Each proton in the universe has a mass 1836 times greater than the mass of each electron. From this fact, you might guess that the electric charge on each proton is much greater than the electric charge on each electron. But instead the absolute value of the electric charge on each proton is very precisely the same as the absolute value of the electric charge on each electron (absolute value means the value not considering the sign, which is positive for protons and negative for electrons). A scientific experimental study determined that the absolute value of the proton charge differs by less than one part in 1,000,000,000,000,000,000 from the absolute value of the electron charge.
This is a coincidence we would expect to find in fewer than 1 in 1,000,000,000,000,000,000 random universes, and it is a case of extremely precise fine-tuning that is absolutely necessary for our existence. Since the electromagnetic force (one of the four fundamental forces) is roughly 10^37 times stronger than the force of gravity that holds planets and stars together, a very slight difference between the absolute value of the proton charge and the absolute value of the electron charge would create an electrical imbalance that would prevent stars and planets from holding together by gravity (as discussed here). Similarly, a slight difference between the absolute value of the proton charge and the absolute value of the electron charge would prevent organic chemistry in a hundred different ways. Why has Adams failed to mention in his summary table (or anywhere in his paper) so precise a case of biologically necessary fine-tuning?
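A rough sketch shows why the balance must be this exact. The simplification below is my own (treating matter as hydrogen-like, so each atom carries a net charge proportional to any charge mismatch), not a calculation from Adams' paper:

```python
# A rough sketch (my own simplification) of why the proton/electron charge
# balance must be so exact. If the two charge magnitudes differed by a
# fraction epsilon, each hydrogen-like atom would carry a net charge of
# epsilon * e, so two lumps of matter would repel with a force roughly
# epsilon^2 times the full proton-proton electric force.
import math

em_to_gravity = 1e37   # electric/gravitational force ratio quoted above

# Gravity can hold stars and planets together only if the residual electric
# repulsion is weaker than gravity: epsilon^2 * em_to_gravity < 1.
epsilon_max = math.sqrt(1.0 / em_to_gravity)
print(f"charges must match to ~1 part in {1 / epsilon_max:.0e}")  # ~3e+18
```

Under these assumptions, a mismatch much bigger than about one part in 10^18 would let electrical repulsion overwhelm gravity, which is consistent with the experimental bound quoted above.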
Clearly, Adams has left out of his summary table most of the best cases of cosmic fine-tuning. His table is like a table entitled "Famous Yankee Hitters" designed to make us think that the New York Yankees haven't had very good hitters, a table that conveniently omits the cases of Babe Ruth, Lou Gehrig, Joe DiMaggio, and Mickey Mantle.
Below is a table that will serve as a corrective for Adams' misleading table. I will list some fundamental constants or parameters in the first column. The second column gives a rough idea of the size of the possibility space regarding the particular item in the first column. The third column tells us whether the constant, parameter or situation is more unlikely than 1 chance in 10,000 to be in the right range, purely by chance. The fourth column tells us whether the constant, parameter or situation is more unlikely than 1 chance in a billion to be in the right range, purely by chance. For the cosmic parameters "in the right range" means "as suitable for long-lasting intelligent life as the item is in our universe." The "Yes" answers follow from various sensitivity estimates in this post, in Adams' paper, and in the existing literature on this topic (which includes these items). For simplicity I'll skip several items of cosmic fine tuning such as those involving quark masses and the electron/proton mass ratio.
| Parameter, Constant or Situation | Size of Possibility Space | More Unlikely Than 1 in 10,000 to Be in Right Range, by Chance? | More Unlikely Than 1 in 1,000,000,000 to Be in Right Range, by Chance? |
|---|---|---|---|
| Strong nuclear coupling constant | 10^40 (difference between weakest fundamental force and strongest one) | Yes | Yes |
| Gravitational coupling constant | 10^40 (difference between weakest fundamental force and strongest one) | Yes | Yes |
| Electromagnetic coupling constant | 10^40 (difference between weakest fundamental force and strongest one) | Yes | Yes |
| Ratio of absolute value of proton charge to absolute value of electron charge | .000000001 to 1,000,000,000 | Yes | Yes |
| Ratio of universe's initial density to critical density (related to initial expansion rate) | 1/10^40 to 10^40 | Yes | Yes |
| Initial cosmic entropy level | Between 0 and some incredibly large number | Yes | Yes |
| Vacuum energy density | Between 0 and 10^60 or 10^120 times its current value, as suggested by quantum field theory | Yes | Yes |
| AM radio tuned to your favorite AM station | 535-1605 kilohertz | No | No |
| FM radio tuned to your favorite FM station | 88-108 megahertz | No | No |
The truth is that each of these cases of cosmic fine-tuning is trillions of times more unlikely to have occurred by chance than a random radio being exactly tuned to your favorite AM station or FM station. To imagine the overall likelihood of all of these cases of fine-tuning happening accidentally, we might imagine a blindfolded archer standing at the center of a 180-meter circle who successfully hits more than 7 archery targets randomly positioned around the circle's circumference.
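For what it's worth, the analogy can be put into numbers, though the target width and the reading of "180-meter circle" as a circle of radius 180 meters are my own assumptions for illustration:

```python
# A rough quantification of the blindfolded-archer analogy above. The target
# width, and reading "180-meter circle" as a circle of radius 180 meters,
# are assumptions made for illustration.
import math

radius = 180.0                 # meters (assumed interpretation)
target_width = 0.5             # meters along the circumference (assumed)
circumference = 2 * math.pi * radius

p_one_hit = target_width / circumference   # chance one random arrow hits a target
p_seven_hits = p_one_hit ** 7              # seven independent lucky hits

print(f"one hit:    1 in {1 / p_one_hit:,.0f}")    # ~1 in 2,262
print(f"seven hits: 1 in {1 / p_seven_hits:.0e}")  # ~1 in 3e+23
```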
Postscript: In June 2019 some scientists published a paper entitled "An update on fine-tunings in the triple alpha process." They end by stating that "a relatively small ~0.5% shift" in the light quark mass "would eliminate carbon-oxygen based life from the universe." This is a case of very precise fine-tuning, contrary to the claims of Adams, who tries to make it look like a parameter that could have differed by three orders of magnitude (a factor of about 1000). The scientists also state that "such life could possibly persist up to ~7.5% shifts" in the electromagnetic coupling constant (equivalent to the fine structure constant). This is also a case of precise fine-tuning, contrary to the claims of Adams, who tries to make the fine structure constant look like a parameter that could have differed by four orders of magnitude (a factor of about 10,000).
In his table on page 140, Adams tells us that the electron/proton mass ratio isn't very sensitive to changes. But in his book The Particle at the End of the Universe, pages 145 to 146, physicist Sean Carroll states the following:
“The size of atoms...is determined by...the mass of the electron. If that mass were less, atoms would be a lot larger. .. If the mass of the electron changed just a little bit, we would have things like 'molecules' and 'chemistry', but the specific rules that we know in the real world would change in important ways...Complicated molecules like DNA or proteins or living cells would be messed up beyond repair. To bring it home: Change the mass of the electron just a little bit, and all life would instantly end.”