QUANTUM MECHANICS IS NOT SCIENCE

Trevor W. Marshall


Physics is where Science began....

....and most physicists continue that proud tradition. They do so in spite of twentieth-century Physics being rotten at the core. They persist in trying to find causal explanations for natural phenomena, yet the overarching theory of twentieth-century Physics, known as Quantum Mechanics (I shall call it QM), not only claims that causal explanations are a delusion, but also trains our cleverest young brains to produce magical explanations a witch doctor would be proud of.

What is a causal explanation?

It's one which respects the direction of time. No self-respecting biologist or psychologist would try to tell us that something which happens today is caused by something that is going to happen tomorrow. Of course, if we think we can see what might happen tomorrow we might act today in an effort to change it, but that's a different matter.

And QM explains things differently?

The short answer is Yes. Indeed many distinguished exponents of QM tell us that it isn't the job of Science to explain anything at all.

So the Quantum has undermined Science?

No. The problem with QM isn't the Q; it's the M. Its predecessor was called Classical Mechanics, and that wasn't causal either, because it tried to explain the action of one object on another distant one by means of an instantaneous force. That was how the Sun was supposed to act on the Earth. We call it Action at a Distance.

But I thought the alternative to QM was a return to classical determinism....?

Well that's one alternative, but it's a pretty poor one! Classical mechanics had already had its day by the beginning of the nineteenth century, when the wave theory of light supplanted Newton's corpuscular theory. Then Faraday and Maxwell showed that electromagnetic interactions between charged bodies were carried by a physical field, and finally, in 1915, Einstein showed that gravity also propagates, as a physical field, from the Sun to the Earth.

So causality means fields?

Yes, and, as Einstein insisted in a famous article of 1948 (see also the EPR article of 1935 and Bell's inequality article of 1964), fields in turn require something called locality...
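To be concrete, and in the standard textbook notation rather than anything peculiar to me: Bell's 1964 result is that, for a pair of spin-half particles in the singlet state, any local theory gives a correlation function \( P(\vec{a},\vec{b}) \) between measurements along directions \( \vec{a} \) and \( \vec{b} \) satisfying

\[ 1 + P(\vec{b},\vec{c}) \;\ge\; \bigl| P(\vec{a},\vec{b}) - P(\vec{a},\vec{c}) \bigr| , \]

whereas QM predicts \( P(\vec{a},\vec{b}) = -\vec{a}\cdot\vec{b} \), which violates the inequality for suitable choices of the three directions.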

But didn't the Aspect experiment show Einstein was wrong?

No, it didn't. The celebrated experiment of that name was done in 1981 and was a refinement of what Clauser and Freedman did in 1972. Neither of them established that locality is violated. If you think about it, it's just as well they didn't. Because if two "photons" with a common origin were really joined by an umbilical cord, so that an observation of one of them produced a physical change in the other, then you would have to start believing all sorts of other telepathic and psychokinetic mumbo-jumbo.

Surely you aren't suggesting....?

That Aspect, Clauser, Freedman and the rest believe in mumbo-jumbo? No, I am sure they would all emphasize the difference between the apparently capricious microworld and our own macroworld. But the trouble is that two light detectors 5 metres apart look uncomfortably like a macroscopic system. And now there are all sorts of exotic applications of their ideas, such as quantum computers and quantum cryptography, which are being taken very seriously...

And attracting a lot of research funds?

Yes, but that's no guarantee that they'll work. Remember how much research funding Artificial Intelligence gobbled up. AI is a dirty word now!

So what is locality?

It's what you get when you combine rejection of Action at a Distance with Einstein's Relativity. It turns out that a consistent field picture of interactions means that any field must propagate at a speed not greater than that of light.
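Stated a little more formally (this is just the standard relativistic condition, nothing special to my argument): an event at position \( \vec{x}_1 \) and time \( t_1 \) can influence an event at position \( \vec{x}_2 \) and time \( t_2 \) only if

\[ |\vec{x}_2 - \vec{x}_1| \;\le\; c\,(t_2 - t_1), \]

that is, only if a signal travelling no faster than light could connect the two.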

And don't the supporters of QM believe in fields?

Yes, most of them. But I am afraid they have also allowed their brains to be confused by something called Wave-Particle Duality. Light is continuous, but every time you make an observation with a photodetector, it is supposed to collapse into a pointlike object called a photon.

Can you have photons and locality?

It is possible to invent all kinds of corpuscular theories, and indeed all kinds of undulatory theories. No doubt some of them would satisfy Einstein locality, but Science doesn't operate like that. We can consider only precisely formulated photon theories, and there is no doubt at all that the collapse involved in the QM theory of measurement requires nonlocality.

So how do we get out of this mess?

We have to find something better. But let's be clear what that means. As a theory of what actually occurs in the measurement process, QM is useless. Indeed, the only defence offered by its supporters is that the measurement cannot be analysed by any conceivable theory. But as an algorithm for predicting the outcome of an enormous body of spectroscopic and scattering experiments, QM is astoundingly successful. A better theory would reproduce this computational success and at the same time restore locality.

That sounds impossible

Well it is pretty demanding, but we just saw that there is a historical parallel. Newton's theory of gravitation lasted over 200 years, and was an outstanding algorithmic success. Yet it failed Einstein's locality criterion just as spectacularly as QM does. And, throughout those two centuries, there were critics of Newton's theory. Though they did not have the benefit of Einstein's precise formulation, they knew not only that Newton's theory would have to be replaced, but also which feature of that theory needed correcting. They were vindicated in 1915.

Does all that help us at all though?

Not a lot! All the evidence from the history of Physics indicates that you won't get anywhere in constructing a local alternative to QM unless you already possess a fairly formidable background in the subject. That is a necessary, but by no means sufficient condition; we have seen that that very same formidable background may cause you to be blinded by your own algorithmic cleverness. Of course, if you search the Web a bit you'll find a fair number of experts who will impress you both with their formidable background and with their sincere faith that they have found the solution, or at least a large part of it. Actually I am one of them, so read some of my articles if you would like to know more. If you don't find me one hundred percent convincing, then you might try confronting such other experts as you come across with some of the above questions.

Can we get any new science?

Well, challenging the underlying philosophy of contemporary science, which is what I am doing, is not necessarily the quickest way to get new science, but it may get us to better new science. I am myself convinced that the day my scientific colleagues drop what I consider the obsolete concept of the photon, their science will improve. Recently (September 1997) I discovered a situation where they expect photons to be created in pairs, one red and one blue, whereas I expect about 1.03 red photons for every blue one. This leads to a very simple experimental test. Full details are in my article The party may be over, but some are also given on this page under Parametric Down Conversion.

A technical note

This is written for the benefit of any fellow "experts" who may have stumbled onto this page. A good proportion of you will know that my colleagues and I have shown that the so-called "enhancement loophole" in the atomic-cascade experiments permits local realist explanations of their results. We have published this in all the best journals, but it has been largely ignored.

As I indicated in my response to the last question, progress in finding an alternative theory has been painfully slow. Our programme is based on Stochastic Electrodynamics. This sets out to show that the electromagnetic field may be described by the unquantized Maxwell equations; in other words, there are no photons. The data which most people accept as evidence for photons are better interpreted as evidence for a real zeropoint, or vacuum, electromagnetic field. We have recently been able to extend this programme to cover a very wide range of experimental data in the area of Parametric Down Conversion, and have natural explanations for a number of otherwise incomprehensible experiments there.
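For those who like to see a formula, the zeropoint field of Stochastic Electrodynamics is a real, random solution of the free Maxwell equations carrying an average energy of \( \tfrac{1}{2}\hbar\omega \) in each normal mode, which corresponds to the Lorentz-invariant spectral energy density

\[ \rho_0(\omega)\,d\omega \;=\; \frac{\hbar\omega^3}{2\pi^2 c^3}\,d\omega . \]

This is the standard expression used throughout the SED literature; nothing in it is peculiar to our own articles.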

The biggest challenge is in Quantum Electrodynamics (QED). Here we have been able to show that the current is another stochastic field; like the electromagnetic field it has vacuum fluctuations, and the processes of QED, for example the Compton effect, may be understood as the interaction of two "turbulent fields". Here I am using the terminology of Freeman Dyson (Proc. Roy. Soc. A, 207, 395-401 (1951)). This inspired insight of Dyson owed much to the influence of Julian Schwinger, and before him Victor Weisskopf. All of these men clearly pictured the electron as an extended object of size exceeding the Compton wavelength. Their insight was unfortunately outflanked by the technical brilliance of Richard Feynman, who, like Paul Dirac, insisted on the pointlike electron, and it is Feynman's ideas which continue to dominate us. My thesis is that the terrible illness of nonlocality is the price we have paid for accepting pointlike photons and electrons, and that Stochastic Electrodynamics, which is the full maturing of Dyson's insight, will restore locality.
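For orientation, the Compton wavelength in question is

\[ \lambda_C = \frac{h}{m_e c} \approx 2.4\times10^{-12}\ \mathrm{m}, \]

so the extended electron pictured by Weisskopf, Schwinger and Dyson, though very much larger than a point, is still about a hundred times smaller than an atom.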

For further remarks on the more general humanist implications of my argument, see my title page Einstein the enigma.


trevor@ma.man.ac.uk