Today is the day on which many people are taking part in a series of protests called the March
for Science. No doubt there will be many speeches whipping up
enthusiasm for science and scientists. Such applause is largely
deserved; certainly most of our scientists are fine people who do
excellent work. But there are some things we should remember on a
day when people might be prone to start chanting “Trust the
scientists!” One of these is that quite a few scientists get things
wrong or fail to act in the public interest.
Some
of these cases are detailed in a recent book by Paul A. Offit
entitled Pandora's
Lab: Seven Stories of Science Gone Wrong. We
hear of cases such as chemist Fritz Haber, whose contributions were a
weird checkerboard of good and evil. He pioneered chemical processes
that helped to feed millions, but also worked vigorously to help the Germans develop poison gas. Then there was the
large number of scientists in the US who supported eugenics,
encouraging sterilization of those with “inferior genes.” They
claimed this was just a consequence of good Darwinist thinking. The
climax of eugenics was the perverse work of the Nazis, who sent
millions to gas chambers, claiming that this was “applied science”
designed to achieve “racial sanitation.” Then there was
neurologist Egas Moniz, who was given the Nobel Prize for developing
the surgical brain procedure called lobotomy. Some 40,000 people in
the US underwent this procedure, which is now regarded as a horror
and a blunder.
More recently
there's the case of a Massachusetts chemist who worked at a drug
testing lab. Her misconduct over eight years (including falsifying
evidence) was so bad that prosecutors announced that they would throw
out 21,587 drug convictions tainted by her involvement. Another drug
test chemist in western Massachusetts was involved in equally
egregious misconduct (regularly smoking crack on the job), casting doubt on countless drug convictions in which she was involved.
But the biggest case
of “science gone wrong” may have occurred in the 1940's. That's
when our scientists got involved in designing bombs that have put the
survival of mankind at risk for 70 years.
By the time the
first atomic bomb was ready for testing in July 1945, it was no longer
really needed. By then the Nazis had already been defeated, the
Japanese Navy had been crushed, and the United States had established
bomber bases that were being used to mercilessly pound Japanese
cities with conventional bombs. The cumulative destruction from such
bombing rivaled that of a nuclear bomb, with some 88,000 people dying
in a single raid on Tokyo. Nonetheless, scientists gave the
green light for testing the first atomic bomb, even though there was
some fear that the test might set off a chain reaction in the
atmosphere that would destroy all life. On the day of the test,
the leading nuclear physicist Enrico Fermi offered to take bets on whether the
entire atmosphere of Earth would be set on fire.
Our
scientists should have said in July 1945: we
refuse to test this thing – keep this genie in the bottle.
But the development of atomic weapons went forward. Gradually it
became apparent: there was a way to make a far more destructive bomb,
by using nuclear fusion rather than nuclear fission. This type of
bomb was the hydrogen bomb, developed in the very late 1940's and
early 1950's.
There
was never any military necessity for a hydrogen bomb. Both the United
States and the Soviet Union already had atomic bombs capable of
destroying the cities of the other side. When ordered to invent a
hydrogen bomb, our scientists should have said: no,
sir, we won't do that.
But they meekly obeyed orders, and before long both sides were armed
to the teeth with hydrogen bombs.
For about 70 years,
nuclear weapons have been like a gun pointed at the head of mankind.
And it is Big Science that has pointed that gun at mankind's head. The long entanglement between science and militarism is troubling. The average person cannot recall the name of a single scientist who refused to build a weapon.
Nowadays we have
non-stop news channels and the Internet for news. But back in the
1960's you got breaking news through “Special Announcements” that
interrupted your television program. Such announcements would come on
your TV set only a few times a year, at moments such as
the assassination of Martin Luther King. You would always see an
ominous, vague screen saying something like “Special Announcement,”
and then, about 10 seconds later, an announcer
would tell you what the breaking news was. During those 10 seconds,
you would be filled with dread, and ask yourself: has a nuclear war
started?
Then
there were the Emergency Broadcast System test screens, which would
pop up on TV sets every few weeks. The system was designed to give
alerts in the event of a nuclear war. The alert screen would pop up
on your TV, and the announcement would end by saying, “This is just
a test.” But so often when you saw the alert sign on your TV, you
would hold your breath, and ask yourself: is
this the real thing? Has a nuclear war begun?
During the 1950's,
1960's, 1970's and 1980's the threat of nuclear destruction was a
dark cloud over the heads of everyone living through those decades.
When the Cold War ended with the breakup of the Soviet Union, lots of
people seemed to say, “Thank God that risk is all over.” But the
risk has persisted to this day, because the US and Russia are each
still armed with about 7,000 nuclear weapons.
Institutional
science encourages conformist thinking in which a scientist is
pressured to judge the rightness of something based on whether his
colleagues agree with it. Such conformist thinking is a disaster in
any case in which the majority has moved or is moving in the wrong
direction. So it's hardly surprising that our nuclear scientists
followed the herd as it moved towards the brink of atomic
destruction. What else should we expect when “follow your peers”
is the predominant rule? Nor should we have expected the
psychologists who became involved in “enhanced interrogation”
torture to refuse to cooperate.
It would be better
if institutional science encouraged an individual to reject the
majority's path or judgment whenever it fails to hold up to standards
of morality or plausibility. The judgment of an individual should
always be a check and balance against the path or opinions of a
majority, for scientific and governmental majorities have often gone
in the wrong direction, frequently because of sociological herd effects.
Science by itself is
morally neutral. Given the triple perils of environmental
degradation, global warming and resource depletion, we should by all
means struggle to reduce global warming, to reduce consumption, and
to increase clean, green energy. But making such efforts comes under
the category of personal ethics and sound public policy, not science.
Just as science didn't tell our nuclear scientists not to build
unnecessary hydrogen bombs that put the human race in peril,
science by itself doesn't tell us whether we should live it up today
in disgusting excess or live in a greener and more frugal manner to
protect future generations. We must find the answer to that question
by using some internal moral compass.