The
popular website Slate.com just published a long article about
research into paranormal phenomena. The piece by Daniel Engber is an example of the unfair and misleading treatment this topic gets
in the mainstream press.
The
article is entitled “Daryl Bem Proved ESP Is Real” and is
subtitled “Which Means Science Is Broken.” It's a kind of “Trojan horse” title, because the article was clearly written to
try to debunk Bem's research. The article discusses experimental
research by Cornell University emeritus professor Daryl Bem. The research was
published in a peer-reviewed scientific publication, the Journal
of Personality and Social Psychology. The widely discussed paper
was entitled, “Feeling the Future: Experimental Evidence for Anomalous Retroactive Influences on Cognition and Affect.”
In
Experiment 1 described in the paper, subjects sat in front of a computer monitor that displayed two images of screens. The 100 subjects were told that behind one of the screens was an image, and behind the other screen was nothing. The subjects were asked to guess which
screen had the image behind it, during a series of trials running 20
minutes. When an erotic picture was used as the image behind the screen, subjects were able to guess correctly, somewhat more often than chance, which screen had the image behind it. With erotic pictures, they guessed correctly 53% of the time, significantly more than the 50% expected by chance. With pictures that were not erotic, the subjects got results very close to the rate expected by chance: 49.8%. Other experiments reported in the paper also produced statistically significant results.
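For readers curious how such a hit rate is weighed against chance, below is a minimal sketch of the standard exact binomial calculation. The trial count used is purely illustrative, since the total number of trials per condition is not given above:

```python
from math import comb

def binomial_tail_p(hits, trials, chance=0.5):
    """One-tailed exact binomial p-value: the probability of getting
    at least `hits` correct guesses in `trials` trials if the true
    hit rate were just `chance`."""
    return sum(comb(trials, k) * chance ** k * (1 - chance) ** (trials - k)
               for k in range(hits, trials + 1))

# Illustrative numbers only -- assume 1,000 trials with a 53% hit rate
# when pure guessing would average 50%.
print(binomial_tail_p(hits=530, trials=1000))   # roughly 0.03
```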
[Image: Schematic depiction of ESP]
Skeptics were outraged by these results, claiming they would never be replicated. But they were replicated. The meta-analysis here ("Feeling the future: A meta-analysis of 90 experiments on the anomalous anticipation of random future events") discusses many successful replications of Bem's surprising results, covering 90 experiments from 33 laboratories in 14 different countries. The analysis reported an overall effect of p = 1.2 × 10^-10. Roughly speaking, this means the probability of getting such results by chance is only about 1 in 10 billion. This is a very impressive result, showing statistical significance millions of times stronger than what is shown in typical papers reported on by mainstream media. A typical paper that gets covered by the press will report a significance level of only about p = .01 or p = .05.
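One rough way to compare p-values of such different magnitudes is to convert each into the equivalent number of standard deviations ("sigma") on a normal curve. The sketch below uses only the figures quoted above and is merely meant to give a sense of scale; whether a one-tailed or two-tailed convention applies depends on the particular analysis:

```python
from statistics import NormalDist

def p_to_sigma(p):
    """Convert a one-tailed p-value into the equivalent number of
    standard deviations ("sigma") above chance on a normal curve."""
    return NormalDist().inv_cdf(1 - p)

print(round(p_to_sigma(1.2e-10), 1))   # meta-analysis figure: about 6.3 sigma
print(round(p_to_sigma(0.05), 1))      # typical published threshold: about 1.6 sigma
```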
In his article at
Slate.com, mainstream media writer Daniel Engber clearly attempts to
disparage and belittle this research finding. He uses some common
techniques used by those denying or belittling evidence for the
paranormal.
The
first technique is to deprive the reader of the most relevant information needed to decide on this matter. Engber makes no mention of the extremely compelling significance level of p = 1.2 × 10^-10 reported in Bem's meta-analysis. And so it almost always is when mainstream media reports on ESP experiments – typically a writer will note that a researcher reported “statistically significant” results without telling us that the degree of statistical significance reported is many times greater than what we find in typical scientific papers (which usually have statistical significance of only about p = 0.05 or p = 0.01).
The
second technique Engber uses is a kind of isolation. His
long article about ESP research gives no summary of the long history
of ESP research dating back to the nineteenth century. The reader is
almost left with the impression that Daryl Bem's research was
some weird fluke like a green sky appearing one day. But that's the
opposite of the real situation. Scientists have been producing
experiments showing very convincing evidence for ESP and other
inexplicable human abilities for more than 100 years. The research
started being done systematically about the time the Society for
Psychical Research was founded in the late nineteenth century. One
high point of the experimental research was the work of Joseph Rhine at Duke University during the 1930s, which actually
produced results far more statistically significant than Bem's
research. More recently, very compelling results for ESP have often been produced using a sensory deprivation procedure called the ganzfeld technique. At the 2014 Parapsychological Association
meeting, Diane Hennacy Powell MD presented extremely compelling evidence
for ESP in an autistic child. Engber doesn't mention any of these
things (see the table at the end of this post for the specifics).
Describing
a moment in the past, perhaps about 1980, Engber tells us, very
inaccurately, that “the
laboratory evidence for ESP had begun to shrivel under careful
scrutiny.” That is not at all correct. Classic ESP research such
as Joseph Rhine's has never been successfully debunked. Before
making this claim, Engber cites an example of what he apparently
thinks is something that had discredited ESP research. It is the
fact that James Randi hired associates to deceive some paranormal
researchers, by acting as “fake psychics.” But while that
incident reflected badly on Randi, it did nothing to discredit
paranormal researchers, since they weren't the ones who were doing
the faking.
Engber
also throws in some “poisoning the well” rhetoric. He calls
Bem's research results “crazy-land,” which is just vacuous
disparagement. I may note that some of the most respected research
results in scientific history (such as the double-slit experiments) were
originally regarded as “crazy” results. Engber states:
Daryl
Bem had seemed to prove that time can flow in two directions—that
ESP is real. If you bought into those results, you’d be admitting
that much of what you understood about the universe was wrong.
No,
experimental results such as Bem's do not demand that we believe that
“time can flow in two directions.” And it also is not true that
results such as Bem's require people to believe that much of what
they understood about the universe is wrong (although it is true that
such research may suggest that some people making overly dogmatic
assumptions about the nature of time, consciousness and matter might
need to reassess their assumptions, and admit their ignorance about
such eternal questions).
Engber
then moves on to a long eight-paragraph discussion of a 2011 scientific paper
by Simmons, Nelson, and Simonsohn. It's a paper entitled,
“False-Positive Psychology: Undisclosed Flexibility in Data
Collection and Analysis Allows Presenting Anything as Significant.”
Engber refers to this as the “When I'm Sixty Four” paper. Engber
then delivers this very inaccurate statement:
But
Simmons, Nelson, and Simonsohn revealed that Bem’s ESP paper was
not a matter of poor judgment—or not merely that—but one of
flawed mechanics....They’d shown that anyone could be a Daryl Bem,
and any study could end up as a smoking pile of debris.
This
statement by Engber is hogwash and baloney. In fact, the 2011 paper
by Simmons, Nelson, and Simonsohn made no reference at all to Bem's
research and made no reference to ESP or any research on paranormal
phenomena. Their paper showed how someone might, through dubious methods, produce a borderline statistical significance along the lines of p = .05 or p = .01. But the paper did nothing to suggest that such dubious methods were used by Bem or any other ESP researcher. In fact, the level of significance claimed in Bem's meta-analysis is p = 1.2 × 10^-10. That's a p-value roughly a hundred million times smaller than a mere p = .01.
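That "hundred million times" figure is a matter of simple arithmetic, as the quick check below shows (it uses only the two p-values just quoted):

```python
meta_p = 1.2e-10   # significance figure quoted for Bem's meta-analysis

# How many times smaller is that p-value than the borderline thresholds
# discussed in the Simmons, Nelson, and Simonsohn paper?
for threshold in (0.05, 0.01):
    print(f"p = {threshold}: about {threshold / meta_p:.1e} times smaller")
```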
Engber
has made a totally inaccurate claim that Simmons, Nelson, and
Simonsohn debunked Bem's research, when their paper doesn't even
mention Bem or ESP research. What
he also fails to tell us is that if Simmons, Nelson, and Simonsohn's
paper can be claimed to debunk anything, then it debunks all
experimental research claiming results with a significance of around
p = .01 or p = .05 – which is basically a large fraction of all
research published in neuroscience, medicine, psychology, and
physics.
For
Engber to claim some great significance for the paper of Simmons,
Nelson, and Simonsohn is rather absurd, since the paper basically
tells us that we can get false alarms when the results have a
significance of around p = .01 or p = .05 – which is something
everyone already knew before the paper was written.
Engber's
next trick is to start talking about replication failures and
research fraud. He says this:
A
few months later came the revelation that a classic finding in the
field of social priming had failed to replicate. Soon after that, it
was revealed that the prominent social psychologist Diederik Stapel
had engaged in rampant fraud. Further replication failures and new
examples of research fraud continued to accumulate through the
following year.
Clearly,
Engber is trying to bring into his reader's mind the idea of ESP
research being fraudulent. But, in fact, all of the examples cited
above refer to work that was being done by researchers working on
topics other than ESP or paranormal phenomena. Mentioning such
things in the middle of an article on ESP research is not literally
inaccurate, but it's extremely misleading, something very likely to create a false idea in the reader's mind. In fact, ESP researchers
have an excellent record of honesty, as good as that of any group of
scientific experimenters. Engber does not cite any example of
dishonesty by an ESP researcher – he deviously leaves his readers with the impression of such a thing.
I
can imagine a writer writing in an equally misleading manner, giving
some web links that do not actually refer to candidate John Doe, but
to some other people. It might go like this:
Doubts
have been raised about the candidacy of John Doe. There was a drunk
driving arrest (Link 1). Then there was a bank robbery arrest (Link
2). Then there was a grand jury indictment (Link 3). Clearly we must
question John Doe's fitness for office.
Of
course, if none of these links referred to John Doe, but referred to
other people, such a paragraph would be very misleading.
Towards
the end of his piece, Engber tries to throw doubt on Bem's findings by referring to an attempt to replicate one of the several experiments in his original “Feeling the Future” paper. Engber
claims that at the most recent meeting of the Parapsychological
Association (which at this date would have been the 2016 meeting),
Bem presented a “pre-registered analysis” which “showed no
evidence at all for ESP.” Engber claims that nonetheless Bem's
abstract of this work, after “adding in a new set of statistical
tests,” stated that the replication attempt had produced “highly
significant” evidence for ESP. Clearly Engber is trying to suggest
that maybe some kind of statistical funny business was going on. But
he presents no specific facts to back up such a claim, nor does he
give a link that would allow us to check out his insinuations. When I
go to the web page that lists the abstracts submitted to the 2016
meeting of the Parapsychological Association, I see no abstracts
authored by Bem. The experiment referred to is not even the main
experiment by Bem that received the most attention in the press (the
experiment I described above). All in all, this does nothing to
raise doubts about Bem, but may raise further questions about Engber's
hatchet tactics when dealing with the paranormal.
In
this case Bem is an ace Ivy League psychologist with several decades
of statistical research experience, and Engber is neither a
mathematician nor a scientist. So if Engber is going to raise doubts
on statistical grounds, no one should pay attention to him unless he is very specific about what his objections are. A little vague
rhetorical doubt-sprinkling does not suffice. In his long article,
Engber does not actually provide a single specific well-documented statistical or
methodological reason for doubting Bem's research.
Engber's
misleading article is typical of the dismal coverage that ESP
research gets in mainstream media. A great deal of this coverage is
inaccurate. Astonishingly, it has become “politically correct”
within today's science culture to make completely false statements
about research into the paranormal. It is extremely common for
scientists to claim there is no evidence for extrasensory perception,
which is entirely false, given the very large body of convincing
experimental evidence that has been gathered over more than 100
years, much of it under the auspices of major universities or the US
government (see the table below for examples). It is also very
common for scientists to say that the experimental evidence for ESP
has been debunked or that it was never replicated. Neither of these
statements is true. Shockingly, we have a science culture in which highly inaccurate
statements on parapsychology research are pretty much the norm. It's
rather like the situation we would have if it became popular in
American history departments for professors to say that the Americans
won the Vietnam War, and that they treated the Vietnamese very nicely
while doing it.
Engber's
article is entitled “Daryl Bem Proved ESP Is Real” and is
subtitled “Which Means Science Is Broken.” A more accurate title
for an article would be, “Joseph Rhine and Many Other Researchers
Showed ESP Is Very Probably Real, But Science Culture Refused to
Accept It, Which Shows Science Culture Is Broken.”
Bem's
research was purely experimental. There is an entirely separate
reason for thinking that humans can sometimes sense the future in a
paranormal way: the large body of episodic accounts supporting such a
claim. I will discuss this fascinating evidence in my next post.
Below is a table showing some high points of research into ESP.
Researcher | Procedure | Results | Link
--- | --- | --- | ---
Professor Bernard F. Riess, Hunter College, 1937 | Remote card guessing of 1,850 cards with a woman in another building | 73% accuracy rate, with an expected accuracy rate of 20% | http://futureandcosmos.blogspot.com/2016/02/better-than-smoking-gun-riess-esp-test.html
Professor Joseph Rhine and others, Duke University, 1932 | Card-guessing experiments with Hubert Pearce, 10,300 cards, experimenter and subject in the same room | 36% accuracy rate, with an expected accuracy of 20% | http://futureandcosmos.blogspot.com/2014/12/when-rhine-and-pearce-got-smoking-gun.html
J.G. Pratt, Duke University, 1933-1934 | Card-guessing experiments with Hubert Pearce, 1,850 cards, experimenter and subject in different rooms | 30% accuracy rate, with an expected accuracy of 20% | http://psychicinvestigator.com/demo/ESPdoc.htm
Ganzfeld ESP tests, 1997-2008 | ESP tests under sensory deprivation, various subjects, 1,498 trials | 32% accuracy rate, with an expected accuracy of 25% | http://www.deanradin.com/FOC2014/Storm2010MetaFreeResp.pdf
Rupert Sheldrake, PhD, 2014 | 63 subjects, 570 trials, test of whether subjects could correctly guess a phone caller | 40% accuracy rate, with an expected accuracy of 25% | http://www.sheldrake.org/files/pdfs/papers/ISLIS_Vol32.pdf
Rupert Sheldrake, PhD, 2014 | 50 subjects, 552 trials, test of whether subjects could correctly guess who sent an e-mail | 43% accuracy rate, with an expected accuracy of 25% | http://www.sheldrake.org/files/pdfs/papers/ISLIS_Vol32.pdf
Diane Hennacy Powell MD, 2014 | ESP tests with an autistic child | 100% accuracy on three out of twenty image descriptions containing up to nine letters each, 60% to 100% accuracy on all three of the five-letter nonsense words, and 100% accuracy on two random numbers, one eight digits and the other nine. Data from the second session with Therapist A includes 100% accuracy on six out of twelve equations with 15 to 19 digits each, 100% accuracy on seven out of 20 image descriptions containing up to six letters, and 81% to 100% accuracy on sentences of between 18 and 35 letters. Data from the session with Therapist B showed 100% accuracy with five out of twenty random numbers up to six digits in length, and 100% accuracy with five out of twelve image descriptions containing up to six letters. | http://dianehennacypowell.com/evidence-telepathy-nonverbal-autistic-child/
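As a rough way of seeing how far above chance the table's figures lie, the sketch below applies the standard normal approximation to the binomial to a few of the rows. The hit rates, trial counts, and chance rates come from the table above; this is only an illustrative back-of-the-envelope check, not the analysis used in the cited papers:

```python
from math import sqrt

def z_score(hit_rate, trials, chance):
    """Normal-approximation z-score: how many standard deviations the
    observed hit rate lies above the chance expectation."""
    standard_error = sqrt(chance * (1 - chance) / trials)
    return (hit_rate - chance) / standard_error

# (name, observed hit rate, number of trials, chance hit rate), as quoted above
rows = [
    ("Riess 1937 remote card guessing", 0.73, 1850,  0.20),
    ("Rhine/Pearce card guessing",      0.36, 10300, 0.20),
    ("Pratt/Pearce, separate rooms",    0.30, 1850,  0.20),
    ("Ganzfeld trials 1997-2008",       0.32, 1498,  0.25),
]
for name, rate, trials, chance in rows:
    print(f"{name}: about {z_score(rate, trials, chance):.0f} sigma above chance")
```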