15

Wolfgang Demtröder writes this in his book on Experimental Physics,

The future destiny of a microparticle is no longer completely determined by its past. First of all, we only know its initial state (location and momentum) within limits set by the uncertainty relations. Furthermore, the final state of the system shows (even for accurate initial conditions) a probability distribution around a value predicted by classical physics.

If the quantum probability distribution always lies near the classical prediction, why do we need quantum mechanics in the first place? According to the Feynman interpretation, if an electron has to go from A to B, it takes all possible paths, but the greatest weight falls on the path predicted by classical mechanics. We know it is unlikely that the electron travels via Mars to get from A to B on Earth. Isn't the path through Mars therefore unnecessary? Shouldn't we, in the spirit of Occam's razor, exclude such things from a theory?

Emilio Pisanty
derint
  • The statement, as is, is false. Consider for example the Aspect experiment on Bell's inequality. The result of the experiment is away from any possible classical predictions by 2-3 sigma. – lcv Jun 02 '20 at 22:06
  • Dark matter: Our review suggests it's time to ditch it in favor of a new theory of gravity https://phys.org/news/2022-07-dark-ditch-favor-theory-gravity.html – Alex Jul 08 '22 at 17:54

10 Answers

34

No. If the classical path were assumed to be the only path, there would be no quantum theory; it would just be classical. And from the need for, and success of, a quantum theory that explains phenomena outside the domain of the classical one, we know the world follows quantum rules.

In Feynman’s highly readable QED, he shows that assuming only the classical path fails to explain reflection from a glass slab. Experimentally, the reflection depends on the thickness of the slab, and he shows how the “all paths” approach explains it.
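This thickness dependence is easy to see in a toy two-path calculation. The sketch below (Python; the single-surface amplitude and wavelength are illustrative values of my choosing, not real glass data) adds a front-surface and a back-surface reflection amplitude and shows the reflection probability oscillating with slab thickness, which a single "classical path" cannot produce:

```python
import cmath
import math

# Toy two-path model of reflection from a glass slab, in the spirit of
# Feynman's QED. The total reflection amplitude is the sum of a
# front-surface term and a back-surface term, which picks up a phase
# 2*k*d from the round trip through a slab of thickness d.
# The amplitude r and the wavelength are illustrative, not real glass data.

r = 0.2                      # single-surface reflection amplitude (assumed)
wavelength = 500e-9          # wavelength of the light, metres (assumed)
k = 2 * math.pi / wavelength

def reflection_probability(d):
    """|front + back|^2 for slab thickness d."""
    amplitude = r + r * cmath.exp(2j * k * d)
    return abs(amplitude) ** 2

# A single-path account would give a thickness-independent answer;
# summing both paths makes the probability oscillate between
# 0 and (2r)^2 = 0.16 as the thickness varies.
thicknesses = [i * wavelength / 199 for i in range(200)]
probs = [reflection_probability(d) for d in thicknesses]
print(min(probs), max(probs))
```

The oscillation between zero and four times the single-surface probability is the experimental fact the "classical path only" account cannot reproduce.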

One needs to be aware of when it makes sense to use Occam’s razor. We can’t rule out a successful theory with a less successful one only because the less successful one is simpler. It must be used when choosing between things that have the same domain of validity. For instance, “the particle takes all paths” vs “the particle takes all paths and god exists.” Here both theories make the same testable predictions but one has an extra untestable factor. Occam’s razor says pick the simpler one.

  • 3
    Honestly, I think that your example is not a very good one from the perspective of science and philosophy. Occam's razor doesn't really say much about whether or not God exists. A better example would be two actual competing physical theories on the same topic which make the same predictions in all current experiments, for example dark matter vs. MOND as an explanation of velocity profiles of galaxies; both are capable of reproducing the data, but dark matter fits within the modern particle physics framework whereas MOND does not, thus most physicists believe dark matter exists. – Kai Jun 02 '20 at 20:03
  • 16
    @Kai Occam's razor does make sense as a choice for which explanation to use, i.e. the physicist "had no need of that hypothesis". As Faye noted (discussed in that link), the point is that God's intercession is an unnecessary ingredient in the explanation, not that the razor tells you to disbelieve in such intercession or God's existence. – J.G. Jun 02 '20 at 20:23
  • I completely agree with you that QED is highly successful and that it shines where classical mechanics fails. My only doubt was this: if quantum mechanics MUST produce results that agree with classical mechanics in the appropriate limit, why should we trouble the electron to take all the paths in the first place? Can there be a theory which works opposite to the conventional Copenhagen interpretation: one that starts with the classical result and agrees with quantum mechanics in the appropriate limit? – derint Jun 02 '20 at 20:24
  • 4
    @derint keep in mind that the "all possible paths" interpretation is fundamentally misleading. It emerges from the path integral formulation of quantum mechanics and fundamentally relies on a picture of "particles" which have "trajectories". This is fundamentally at odds with modern quantum theory, which says that particles are only an approximately relevant concept in low-energy limits. Fundamentally the real degrees of freedom are quantum fields, not particles, and the picture of a particle taking all possible classical trajectories is, clearly, semi-classical and approximate at best. – Kai Jun 02 '20 at 21:15
  • @derint - you focus on that part of quantum mechanics that is "closely related" to classical mechanics ... But how do you explain, say, the tunnel effect in classical terms? In classical terms it would be nonexistent .. what huge luck that the real world DOES rely on quantum mechanics on small scales – eagle275 Jun 03 '20 at 07:55
  • @Kai I think we can be more specific re: dark matter. If we look at galactic rotation curves, "WIMPs" and "MOND" are both quite simple explanations. If we include other observations like lensing effects (e.g. in the bullet cluster), these explanations become "WIMPs" vs "MOND + something else to explain the lensing"; MOND needs extra complications for the extra observations, WIMPs doesn't. Likewise "WIMPs" vs "MOND + something to explain lensing + something to explain galaxy distributions + something to explain CMB anisotropies + etc." shows WIMPs (or some other dark matter theory) are simpler – Warbo Jun 03 '20 at 09:44
  • @Warbo thanks for the extra info, indeed this is why Occam's razor applies to the comparison of WIMPs vs. MOND. – Kai Jun 03 '20 at 13:20
  • @Kai I agree, two competing theories would be better. For example Quantum mechanics predictions only compared to a physical model that makes the same predictions. Which theory would be better? – Bill Alsept Jun 03 '20 at 21:26
  • Can someone link the "readable QED"? – ScottishTapWater Jun 05 '20 at 13:34
  • @Kai, you are correct about the particle picture being an approximation, but the path integral is still fundamental in quantum field theory. It's the difference between imagining a particle taking all possible paths between two points and a field taking all possible evolutions between two field arrangements. – Luke Pritchett Jun 06 '20 at 19:24
  • @LukePritchett I agree with that (although the "space of all possible field configurations" is not necessarily well defined, and thus opens the path integral formalism up to questioning), but in any case in QFT it makes more sense to consider non-local contributions ("the particle travels to mars and back") since a field is not a localized object like a particle is. – Kai Jun 06 '20 at 23:09
16

First of all, it's not true that the weight of the classical path is the highest in the Feynman propagator. The classical path is the one whose contribution doesn't get cancelled out by other paths in the limit where the action is very large compared to $\hbar$.

In all other cases, the non-classical paths play a crucial role in the results of an experiment. Quantum mechanical predictions which deviate from classical predictions are absolutely measurable and are measured all the time; see, for example, the double-slit experiment. Moreover, even if quantum mechanics only adjusted classical mechanics by a little bit, it couldn't be discarded using Occam's razor, because it would still provide more accurate information than is available from classical mechanics. There are also more fundamental reasons why quantum mechanics is unavoidable: for example, you cannot explain the stability of the atom in classical mechanics. Since quantum mechanics does explain this, which classical mechanics cannot, Occam's razor doesn't rule out quantum mechanics at all.
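The measurable deviation is easy to illustrate numerically. The toy two-slit calculation below (Python; the slit geometry and units are illustrative choices of mine) compares "add the amplitudes, then square" with the classical "add the probabilities": the former produces fringes swinging between roughly 0 and 4, the latter is flat at 2:

```python
import cmath
import math

# Two-slit toy model. Each slit contributes an amplitude whose phase is
# set by the path length from that slit to the screen point. Quantum
# mechanics adds the amplitudes first; "classical" reasoning adds the
# probabilities. Geometry (slit separation d, screen distance L) is
# illustrative, in arbitrary units.

wavelength = 1.0
d = 5.0        # slit separation (assumed units)
L = 100.0      # distance from slits to screen (assumed units)

def intensities(x):
    r1 = math.hypot(L, x - d / 2)
    r2 = math.hypot(L, x + d / 2)
    a1 = cmath.exp(2j * math.pi * r1 / wavelength)
    a2 = cmath.exp(2j * math.pi * r2 / wavelength)
    quantum = abs(a1 + a2) ** 2              # interference fringes
    classical = abs(a1) ** 2 + abs(a2) ** 2  # always 2: no fringes
    return quantum, classical

xs = [i * 0.5 for i in range(-40, 41)]
quantum = [intensities(x)[0] for x in xs]
classical = [intensities(x)[1] for x in xs]
```

The fringe contrast, absent from the classical curve, is exactly the kind of deviation that is measured routinely.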

Finally, even setting aside the experimental inadequacies of classical mechanics, there is no clear way to decide whether quantum mechanics or classical mechanics requires fewer assumptions. If anything, since classical mechanics is arrived at by assuming the criteria that make a quantum system approach its classical limit, one can argue that quantum mechanics requires fewer assumptions.

Addendum

Besides, the language used in the paragraph you quote is highly misleading. It's not true that you can't know the initial state of a quantum system precisely; you absolutely can. For example, the state of a spin-half particle is precisely spin-up in a certain direction if I measure it to be spin-up in that direction (and I can do that). The story for position and momentum is a bit different, because there is simply no physical state which has a definite position or momentum. However, you can still specify the initial state (or any subsequent state) of a quantum particle perfectly precisely, either by specifying its wavefunction in a certain basis (which you can do) or by specifying its quantum numbers with respect to some appropriate complete set of commuting operators (which you can also do).
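As a concrete illustration of that last point, here is a minimal bit of qubit bookkeeping (pure Python, real amplitudes only; the example is my own, not from the answer): after a spin-z measurement returns "up", the state is exactly $\vert +z\rangle$, so a repeated z-measurement is certain, even though an x-measurement on that same precisely known state is still genuinely 50/50:

```python
# Toy qubit bookkeeping (real amplitudes suffice here; illustrative example).
# States are written in the z-basis as (amp_up, amp_down).

up_z = (1.0, 0.0)               # |+z>: the state after measuring "up" along z
up_x = (2 ** -0.5, 2 ** -0.5)   # |+x> written in the z-basis

def prob(outcome, state):
    """Born rule |<outcome|state>|^2 for real amplitudes."""
    amp = outcome[0] * state[0] + outcome[1] * state[1]
    return amp ** 2

p_repeat_z = prob(up_z, up_z)   # re-measuring along z: certain
p_measure_x = prob(up_x, up_z)  # measuring along x instead: 50/50
print(p_repeat_z, p_measure_x)
```

Knowing the state perfectly and every observable having a definite value are different things; the first holds here, the second does not.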

  • 1
    the book is translated to English from German. I do not know whether something was lost in the translation. A word here or there makes lots of difference. – derint Jun 02 '20 at 20:29
  • @derint Ah, I see. Especially from a language like German where you have a much richer vocabulary than English, I can certainly imagine. –  Jun 02 '20 at 20:34
10

First of all, it is not true that the maximum of the probability always lies near the classical path; the two-slit experiment and discrete energy levels are proof of that. When it is indeed the case, we call it the quasi-classical approximation.

Secondly, even when the maximum of the probability is close to the classical path, we are still interested in a more precise calculation; the above-mentioned quasi-classical approximation is a method for calculating quantum corrections to classical behavior.

Finally, Occam's razor is an empirical principle, which is grounded in our intuition and on occasion in arguments following from the probability and information theory. Occam's razor is in no way a substitute for or a counter-argument against the experimentally verified laws of physics.

Roger V.
10

Occam's Razor states that “entities should not be multiplied without necessity”. It is necessary for an electron to take multiple paths in order to explain the double slit experiment.1 This gives us the following model:

  • An electron takes all possible paths to the same place.
  • Each path contributes amplitude towards the electron reaching that place, such that the phase of the contributed amplitude is proportional to the path length.

Now, consider the "simpler" model, where long, crazy paths that go through Mars don't happen:

  • An electron takes all possible paths to the same place.
  • Each path contributes amplitude towards the electron reaching that place, such that the phase of the contributed amplitude is proportional to the path length.
  • Long, crazy paths that go through Mars don't happen.

You see that, even though it seems simpler, we actually have to add an extra thing to the theory. It takes more {axioms, information, entities, code, assumptions} [take one] to describe. And so Occam's Razor doesn't favour this one.
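One crude way to make that counting explicit is to compare description lengths directly. In the toy below (Python; the rule strings are paraphrases of the bullet lists above, and character count is only a stand-in for a proper complexity measure such as axioms or bits), the "no Mars" model is strictly longer to state:

```python
# Crude description-length comparison of the two rule sets above.
# Character count is a stand-in for a real complexity measure
# (axioms, bits, Kolmogorov complexity); the rule strings are paraphrases.

all_paths_model = [
    "an electron takes all possible paths to the same place",
    "each path contributes amplitude with phase proportional to path length",
]
no_mars_model = all_paths_model + [
    "long, crazy paths that go through Mars don't happen",
]

def description_length(model):
    return sum(len(rule) for rule in model)  # proxy: characters of rules

print(description_length(all_paths_model), description_length(no_mars_model))
```

Whatever measure one substitutes for character count, an added exclusion clause can only increase the total, which is why the Razor does not favour the "simpler-looking" model.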

I have not yet known Occam's Razor to fail. The simplest explanation consistent with these observations is that Occam's Razor always works… but if it were to fail, it would be necessary to choose an explanation with more entities to account for irritating reality.


1: Strictly speaking, it's just necessary for electrons to follow the Schrödinger equation, which is equivalent to a superset of this simplified explanation… but I'm confident that the argument can be made more rigorous without compromising its conclusion – or else one of the hundreds of university students who tried to find such a Razor violation would've spotted it. It'd just be harder to follow.

wizzwizz4
  • "I have not yet known Occam's Razor to fail." - Occam's Razor cannot "fail", because the only way to fail is for a more complicated theory to prove correct over the simpler one. But to prove the more complicated theory correct is to find evidence that the simpler theory is wrong, and at that point the more complicated theory is now necessary, as per Occam's Razor. This has happened many, many times. – Paul Sinclair Jun 05 '20 at 16:40
  • @PaulSinclair No, you've internalised Occam's Razor, to the point where it just seems like "common sense". (Or maybe Occam's Razor is just a description of part of "common sense" – I only see correlation.) Imagine the Novacula Alphabetum, where the hypothesis that comes first, alphabetically (when written in Classical Latin) is usually correct (which holds true because an abacus drives off abbreviation). Or, say, Chekhov's Razor – the hypothesis with the least narratively-unnecessary fluff at the beginning and the end is most likely, because we're living in a storybook. – wizzwizz4 Jun 05 '20 at 19:31
  • Occam's Razor holds true, near as I can tell, because the universe runs on maths. There are imaginable rules of the universe where it doesn't hold… but Occam's Razor says that they're unlikely. – wizzwizz4 Jun 05 '20 at 19:35
  • Neither of your examples has any bearing on the argument I made at all. The argument is based on the "without necessity" clause in the Razor. Your alternate rules do not invoke necessity. The only way to know that the theory favored by Occam's Razor is wrong is to gather evidence against it. And in the presence of that evidence Occam's Razor by its nature changes its verdict. This does not render the Razor useless. It tells you where it is useful: in deciding between theories that equally fit the evidence. But its decision is not absolute, as no science should ever be. – Paul Sinclair Jun 05 '20 at 19:44
  • @PaulSinclair Novacula Alphabetum: “pages of the dictionary should not be turned without necessity” (translated from Classical Latin). Chekhov's Razor: “narratively unnecessary entities should not be included without necessity.” Why don't we use these Razors, if not that they simply don't work? The only way for them to fail is if a lexicographically later theory / theory with more narrative fluff proves correct over the more Razor-preferred one. – wizzwizz4 Jun 05 '20 at 19:48
  • "Lexicographically earlier in Classical Latin" is a really complicated theory, by Occam's Razor standards; it requires you to provide the entire Classical Latin dictionary, plus a definition of every word, plus the grammar rules that describe valid sentences… Likewise, Chekhov's Razor requires you to bring in the entirety of human storytelling; it's worse! Occam's Razor is a very simple Razor, so – in the absence of any evidence whatsoever that it's worse than another Razor with more entities in it – we should assume it's the Razor to use… by its own standards. It's self-endorsing. – wizzwizz4 Jun 05 '20 at 19:58
  • My question was inspired by Wolfgang Demtröder's comment that "the final state of the system shows (even for accurate initial conditions) a probability distribution around a value predicted by classical physics." As I have learned from the discussion here, that comment was somewhat misleading. But the arguments for a simpler theory still stand. Werner Heisenberg's motivation for introducing matrix mechanics was that we must exclude from a theory those things which cannot be verified experimentally; they become a burden. This is an example of Occam's Razor at work. – derint Jun 07 '20 at 22:47
  • @derint The arguments for a simpler theory do indeed stand. In fact, there are some improvements that can be made to "Schrödinger's equation holds, except when things get big enough at which point the waveform collapses". – wizzwizz4 Jun 08 '20 at 08:21
  • @wizzwizz4 Thank you for the hope. – derint Jun 09 '20 at 12:46
8

The other answers have already pointed out the problems with your question, so I won't rehash those explanations here. But there is something worth saying about a point that treatments of quantum field theory in the path integral formalism often obscure through the copious use of Wick rotations (transforming the complex exponential into a real exponential by a rotation in the complex plane). The "Euclidean" partition function

$$ \mathcal Z = \int D \phi \exp \left(-\frac{S[\phi]}{\hbar} \right)$$

indeed has the property that contributions far away from the minimum of the classical action are suppressed. For a free-particle action $ S[\phi] = \int \dot \phi(t)^2 \, dt $, for instance, given by the usual kinetic term in one dimension, the path integral measure is just a Wiener measure, and correlation functions can be computed by assuming that the sample path of the particle follows Brownian motion. Doing this kind of computation carelessly leads to absurd results, for instance violations of the uncertainty principle.

This is no longer true for the actual partition function

$$ \mathcal Z = \int D \phi \exp \left(\frac{i S[\phi]}{\hbar} \right)$$

for which the amplitudes away from the minimum of the action are not, a priori, suppressed. The absolute value of the amplitude of following any specific path is just $ 1 $, but the phase differences lead to cancellations for sample paths $ \phi $ very far from the classical one due to the fast oscillation of the complex exponential. By a classical argument in complex analysis (computation of Fresnel integrals using contour integration), we can legitimize the use of Wick rotations in quantum field theory, but while they make computations more convenient they can also obscure what the path integral actually means physically.
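The cancellation mechanism can be checked on a one-dimensional Fresnel-type integral, $\int e^{iax^2}\,dx$, a stand-in for the path integral with $x = 0$ playing the role of the classical path. In the numerical sketch below (Python; the cutoffs and grid size are my choices), every contribution has modulus exactly 1, yet truncating the range barely changes the answer, because contributions far from the stationary point cancel by fast oscillation:

```python
import cmath
import math

# Numerical Fresnel integral int_{-X}^{X} exp(i*a*x^2) dx, midpoint rule.
# |exp(i*a*x^2)| = 1 everywhere, but fast oscillation away from the
# stationary point x = 0 makes distant contributions cancel.
# Exact infinite-range value: sqrt(pi/a) * exp(i*pi/4).

a = 1.0

def fresnel(cutoff, n=200000):
    dx = 2 * cutoff / n
    total = 0j
    for k in range(n):
        x = -cutoff + (k + 0.5) * dx
        total += cmath.exp(1j * a * x * x)
    return total * dx

exact = math.sqrt(math.pi / a) * cmath.exp(1j * math.pi / 4)
narrow = fresnel(10.0)   # keep only |x| < 10
wide = fresnel(40.0)     # quadruple the range: the answer barely moves
print(abs(narrow - exact), abs(wide - exact))
```

The extra range contributes almost nothing even though no contribution is individually small; the uncancelled tail falls off only like $1/X$, an oscillatory suppression that the Euclidean picture misrepresents as exponential damping of individual paths.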

This point is of vital importance: what's illustrated in classical setups such as the double-slit experiment is precisely phase cancellation among different sample paths in the path integral, which is what produces the interference pattern on the screen. A Euclidean partition function can't account for such behavior, so it's simply not correct to assume naively that paths will be monotonically suppressed in the quantum theory as one moves further away from the classical trajectory.

There are also cases, as pointed out in the other answers, when the action itself is of comparable magnitude to $ \hbar $ and therefore the quantum corrections become significant. However, quantum effects can be seen easily in the macroscopic world. For example, due to quantum effects, we're not all fried by ultraviolet radiation emitted by the sun (as predicted by the Rayleigh-Jeans law), and also due to quantum effects you can use the computer at your disposal to post this message on this website (without the band gap structure of silicon and other semiconductor materials, which is a quantum effect, the integrated circuits on your processor chips would fail to function).

In all these cases there is something of the microscopic, of course (the Rayleigh-Jeans law only fails at small wavelengths and the band gap structure arises from a very fine scale periodicity in an ion lattice), but these fundamentally microscopic phenomena can easily be amplified into macroscopic behavior given the right conditions.

Ege Erdil
5

I don't know what Demtröder meant but the sentence

Furthermore, the final state of the system shows (even for accurate initial conditions) a probability distribution around a value predicted by classical physics.

is wrong.

Aspect's famous experiment on Bell's inequality gave a result (as predicted by quantum mechanics) five standard deviations away from that of any possible classical description.

Aspect's experiment has been confirmed many times. Here is the link to the original paper (free to read).

lcv
4

Further to the points discussed, the relevant principle when comparing theories of different empirical success isn't Occam's razor, but the correspondence principle. Originally that refers to quantum mechanics recovering classical mechanics in a certain limit, but more generally it means a new theory is accepted when it explains both the successes and failures of an old theory, so it's consistent with all observations and reduces to the old theory in its domain of validity. (For an illustration of when the principle isn't honoured, as a hallmark of pseudoscience, see this.)

J.G.
4

There are a number of misconceptions about the path integral which fuel this question, and I think they are very common and worth discussing. First of all, we are discussing quantum mechanics, which is non-relativistic: it is not fundamentally accurate in relativistic contexts (high energies, short lengths, and/or short times).

Quantum mechanics is built on the assumption that there are fundamental degrees of freedom which are "point particles" (hence "mechanics", in analogy with the classical mechanics of point particles), whose state is described at any instant by a wavefunction. To build quantum mechanics, we can assume there exists a set of position eigenstates, e.g. the state $\vert x \rangle$ represents a particle located at position $x$. The wavefunction then arises from the expansion of a generic state of the system in the position basis, $$\vert \psi \rangle = \int dx\, \vert x \rangle \langle x \vert \psi \rangle = \int dx\, \psi(x) \vert x\rangle$$ The wavefunction can, in principle, extend over (have non-zero support on) all possible position states, i.e. it is not bounded within a finite (local) region of space. The path integral arises when we ask the question: "assuming the particle is at location $x'$ at time $t'$, what is the probability amplitude that it is at location $x''$ at time $t''$?" The "standard" way to answer this is in the Schrödinger picture, $$\langle x'',t'' \vert x',t' \rangle = \langle x'' \vert e^{-i\hat{H}(t''-t')/\hbar} \vert x' \rangle$$ in which we time evolve the wavefunction (or rather the state vector) according to the dynamics given by the Hamiltonian operator $\hat{H}$, with the condition that $\vert \psi(t'=0)\rangle = \vert x'\rangle$, then measure the projection along $\vert x''\rangle$. No reference to classical paths is necessary here.
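For a system small enough to solve by hand, this Schrödinger-picture amplitude can be written down with no path language at all. A minimal sketch (Python, with $\hbar = 1$; the two-site hopping Hamiltonian and its closed-form exponential are my own toy example, not from the answer):

```python
import math

# Two-site "particle" that hops between sites 0 and 1 with amplitude J
# (hbar = 1). For H = J(|0><1| + |1><0|) the matrix exponential has the
# closed form exp(-iHt) = cos(Jt)*I - i*sin(Jt)*(H/J), so the transition
# amplitudes <0|exp(-iHt)|0> and <1|exp(-iHt)|0> follow directly,
# with no sum over paths anywhere.

J, t = 1.0, 0.7

amp_stay = math.cos(J * t)         # <0| e^{-iHt} |0>
amp_hop = -1j * math.sin(J * t)    # <1| e^{-iHt} |0>

# Unitarity: the two outcome probabilities sum to 1.
total_prob = abs(amp_stay) ** 2 + abs(amp_hop) ** 2
print(total_prob)
```

The path integral reproduces these same numbers; it is an alternative bookkeeping for them, not extra physics.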

The path integral is a reformulation which recasts this quantity as an average over all classical trajectories. But we need to keep in mind that, fundamentally, according to quantum mechanics a particle can never be perfectly localized and classical trajectories do not actually exist, outside of a formal mathematical definition. The path integral says that the above amplitude can be rewritten (under certain assumptions about $\hat{H}$) as $$\langle x'',t'' \vert x',t' \rangle = \int_{x(t')=x'}^{x(t'')=x''} \mathcal{D}x \,e^{iS[x(t)]/\hbar}$$ where the integral is over all possible paths with the correct endpoints, and $S$ is the classical action. It is true in simple cases, such as the free particle, that the dominant contribution is from the classical path, and one could, perturbatively, treat the subleading contributions from paths "near" the classical one, i.e. we could rewrite it as $$\langle x'',t'' \vert x',t' \rangle = \int_{\delta x(t')=0}^{\delta x(t'')=0} \mathcal{D} \delta x \, e^{iS[x_c(t) + \delta x(t)]/\hbar}$$ where the integral is over all possible deformations of the classical trajectory with fixed endpoints.

Perturbatively, the leading order contribution is from the classical trajectory, and subleading contributions come from small deformations near the classical trajectory. The integral can be done exactly in very simple cases, but generally would be expanded as a saddle point integral, where we assume $\hbar$ is small (i.e. quantum effects are small). This means that we expand the action in powers of $\sqrt{\hbar}\,\delta x$, $$S[x_c + \delta x] \sim S[x_c] + 0 + \frac{\hbar}{2}\,\left.\frac{\partial^2 S[x_c+\delta x]}{\partial (\delta x)^2}\right|_{\delta x = 0} (\delta x)^2 + \mathcal{O}(\hbar^{3/2})$$ where $\partial$ here denotes a functional derivative, and the first derivative is zero since we are expanding about an extremum of the action. The path integral is then $$\langle x'',t'' \vert x',t' \rangle = e^{iS[x_c(t)]/\hbar} \int_{\delta x(t')=0}^{\delta x(t'')=0} \mathcal{D} \delta x \, e^{i S''[x_c] \delta x^2/2}\, e^{i \mathcal{O}(\hbar^{1/2} \delta x)^3/\hbar} \\ \qquad\qquad\,\,\,\qquad\qquad= e^{iS[x_c(t)]/\hbar} \int_{\delta x(t')=0}^{\delta x(t'')=0} \mathcal{D} \delta x \, e^{i \delta x^2/(2S''[x_c]^{-1})} \left[1 + \mathcal{O}(\hbar^{1/2} \delta x^3)\right]$$ averaging over gaussian fluctuations near the classical extremum of the action. The classical limit is $\hbar \rightarrow 0$, i.e. there are no quantum fluctuations, all of the terms in the integral are zero except the first which is a constant, and the only path that contributes is the classical one, in which case the transition amplitude is simply $\langle x'',t'' \vert x',t'\rangle \propto \exp(iS[x_c]/\hbar)$. 
In the case that $\hbar$ is small, we can say that the dominant quantum corrections come from paths near the classical trajectory, and in fact, we can discard the highly-non-classical ones, such as the one where it visits Mars, which contributes negligibly, very roughly speaking on the order of $\sim\exp(-\delta x_{\mathrm{Mars}}^2)$, $\delta x_{\mathrm{Mars}}$ being the difference between the classical trajectory and the Mars trajectory.
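The quality of this saddle-point logic can be checked on an ordinary one-dimensional integral standing in for the path integral. Below (Python; the quartic "action" $S(x) = x^2/2 + x^4/4$ and the values of $\hbar$ are my choices), the full oscillatory integral $\int e^{iS(x)/\hbar}\,dx$ is compared with its leading Gaussian saddle approximation $\sqrt{2\pi\hbar}\,e^{i\pi/4}$ (using $S''(0) = 1$); the approximation improves as $\hbar$ shrinks, exactly as the expansion above predicts:

```python
import cmath
import math

# Saddle-point check on a 1D stand-in for the path integral:
#   I(hbar) = int exp(i*S(x)/hbar) dx,  with S(x) = x^2/2 + x^4/4.
# The stationary point is x = 0, S''(0) = 1, and the leading Gaussian
# (saddle-point) approximation is sqrt(2*pi*hbar) * exp(i*pi/4).
# The relative error of the approximation is O(hbar).

def S(x):
    return x * x / 2 + x ** 4 / 4

def full_integral(hbar, cutoff=6.0, n=400000):
    dx = 2 * cutoff / n
    total = 0j
    for k in range(n):
        x = -cutoff + (k + 0.5) * dx
        total += cmath.exp(1j * S(x) / hbar)
    return total * dx

def saddle(hbar):
    return math.sqrt(2 * math.pi * hbar) * cmath.exp(1j * math.pi / 4)

err_large = abs(full_integral(0.5) - saddle(0.5))    # hbar not small: poor
err_small = abs(full_integral(0.05) - saddle(0.05))  # hbar small: good
print(err_large, err_small)
```

When $\hbar$ is not small, the far-from-stationary contributions no longer cancel efficiently, which is the one-dimensional shadow of the regime where the classical trajectory stops characterizing the quantum system.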

However, it is not guaranteed that such a perturbative expansion will converge! The path integral is not a cure-all which can always solve any given problem in quantum mechanics. It is possible that quantum fluctuations are so strong (formally, when $\hbar$ is very large) that this perturbative expansion over classical trajectories is ill-defined and the classical path does not significantly characterize the quantum system under study. This would be the case, for example, if we found that the trajectory that visits Mars contributes significantly to the integral. In condensed matter physics, this is closely related to the phenomenon of "quantum melting", and in high-energy physics is related to strong interactions and phenomena such as confinement. These are examples of times where such perturbative approximations fail, because there are non-perturbative aspects of the systems (e.g. topological) which are not captured by any power series expansion (e.g. we have ignored "instanton" contributions to the calculation, terms such as $\exp(-\hbar/f[\delta x])$, which cannot be expanded in a power series in small $\delta x$.).

The fundamental quantum theory which is currently accepted is quantum field theory, which is fully relativistic and in which the fundamental degrees of freedom are not particles but quantum oscillators (fields) that permeate spacetime. At low energies and (relatively) long length scales, the excitations of these fields are "particles". At high energies, short length scales, or in strongly interacting contexts, particles themselves are ill-defined, in the same way as the classical paths are ill-defined when quantum fluctuations are large. The particle picture is useful for certain calculations in a weak-coupling, low-energy limit, but generally fails when considering strong coupling. As a simple example, we are often told a story about how a proton is made of two up quarks and a down quark. While the up and down quarks are fundamental excitations of quark fields, the picture that a proton literally contains two ups and a down (the parton model, also due to Feynman) is imperfect. At best, we can say that the proton contains "on average" two ups and a down, along with a sea of virtual particles coming from strong quantum fluctuations. One can perform diagrammatic expansions in the parton picture to describe the proton, akin to the classical trajectory expansion of the path integral, but this expansion will not capture much of the important physics of the proton, such as the fact that the quarks are in a bound state, or that they are confined in color-neutral combinations.

My point in all of this is, do not put too much weight on the classical trajectory (or virtual particle) interpretation of the path integral, because it is only an approximation which does not necessarily reflect reality. In the virtual particle (Feynman diagram) expansion in QFT, we have to include diagrams that have up to an infinite number of particles being created and annihilated in a simple electron-electron collision. Does that mean that every time two electrons collide an infinite number of particles actually appeared and disappeared? Of course not, it means that we have applied a particular approximation to solve the problem, and interpreting that approximation literally gets us into trouble. The same goes for the classical trajectory picture of the path integral: while the picture it paints in our head is nice for interpreting what it means, the actual physical situation generally does not reflect that picture, because physically there is no such thing as a perfectly localized particle which traces out a classical trajectory in space, and such pictures are only approximately true when quantum effects are weak and the classical contribution dominates.

The path integral is useful for computing averages, but the actual physical picture would involve understanding dynamics, i.e. how the system evolves over time. In the classical path picture, the reason why the path to Mars is included is that the wavefunction can extend all the way to Mars in standard quantum mechanics (ignoring relativity!). Relativistically, the wavefunction spreads over time in a causal manner. The key lesson is this: The path integral is a useful calculational tool, but be very cautious in trying to interpret it as a literal description of reality.

Kai
0

Feynman thought his path integral formulation might be conceptually useful, but it is a matter of opinion whether it actually is (personally I don't find it helpful). If you want to apply Occam's razor, then at least do it to the mathematical structure defined by Dirac and von Neumann. This is expressed in Wikipedia with only three axioms. You could hardly get simpler than that, so (for what it is worth) Occam's razor clearly supports quantum mechanics. If you are talking of interpretations, then again von Neumann's approach could hardly be simpler as I have explained in The Hilbert space of conditional clauses.

Charles Francis
-2

I hate to say it, but if you apply Occam's Razor to Quantum Mechanics you wind up throwing out 90% of Quantum Mechanics. That's because Quantum Mechanics is far too complicated to be the "simple solution" that Occam's Razor is looking for.

A far simpler explanation for small-scale physics is Paul LaViolette's Sub-Quantum Kinetics. If you are truly applying Occam's Razor, then the inevitable conclusion is that Quantum Mechanics is wrong and SQK is right.

But of course people are far too accepting of things like the Copenhagen Interpretation to consider a fresh approach.

If someone asks me nicely I'll post a summary of SQK.


"OK, can you kindly provide a more detailed explanation of SQK as an addendum to your answer? That'd be really helpful. – Dvij D.C."

How could I refuse such a polite request? Okay, here it is. But first a disclaimer and some links:

I personally am not attempting to claim SQK is the right theory, just that it is another theory, and that by being simpler, it is more "Occamable" than QM.

LaViolette's book: https://www.amazon.com/Subquantum-Kinetics-Paul-LaViolette/dp/0964202573/ref=sr_1_4?dchild=1&keywords=paul+laviolette&qid=1591380871&s=books&sr=1-4

One of LaViolette's papers, as published in The International Journal Of General Systems: https://www.tandfonline.com/doi/abs/10.1080/03081078508934920?casa_token=vI8AdTnpSQoAAAAA%3ADmbRr0vVy3wWc3xIXIZjFj1GHHFakCU9BnwwOlqW-yzBM5S4veaxP1IHk8SuY1qbBjb6er0mC6Wt&

SUMMARY OF PAUL LAVIOLETTE'S THEORY OF SUB-QUANTUM KINETICS

as summarized by Jennifer Freeman

The Theory Of Sub-Quantum Kinetics holds that the entire universe (matter and energy) consists of 3 elementary particles called G, X, and Y; and a series of supporting elementaries including A, B, Z, and Omega (hereinafter referred to as O).

The A, B, Z, and O particles do not themselves manifest the physical universe but are necessary to its creation; for as particles interact and transform from one type to another, all matter and energy come to exist.

If physical space is devoid of G, X, and Y particles, or if there's a uniform distribution of each of them throughout space, then there can be no matter or energy. Matter and energy can only exist when there is a nonuniform distribution of G, X, and Y. Fortunately, this is easy.

In LaViolette's "Model G" there are 5 primary reactions that occur among these particles:

                    A    ->  G
                    G    ->  X
                    B+X  ->  Y+Z
                    2X+Y ->  3X
                    Y    ->  O

Each arrow represents a probability that the inputs will spontaneously change to the outputs. (Each reaction can also occur in reverse, but these probabilities are much less.) As a result, these transitions are both continuing and nonuniform. This nonuniformity is the key to creation.
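Treated as ordinary chemical kinetics, the five reactions above can be written as mass-action rate equations and integrated numerically. Here is a minimal sketch of that; every rate constant and the fixed A, B supply levels are illustrative values I chose so that a stable steady state exists (they are not taken from LaViolette), and the much-less-probable reverse reactions are neglected:

```python
# Mass-action rate equations for the five "Model G" reactions,
# integrated with forward Euler. Rate constants k1..k5 and the
# constant A, B supplies are illustrative assumptions, not values
# from LaViolette; reverse reactions are neglected.

def model_g_step(G, X, Y, dt, A=1.0, B=1.0,
                 k1=1.0, k2=1.0, k3=3.0, k4=1.0, k5=1.0):
    """One Euler step of the concentrations G, X, Y under mass action."""
    dG = k1 * A - k2 * G                      # A -> G feeds G; G -> X drains it
    dX = k2 * G - k3 * B * X + k4 * X**2 * Y  # B+X -> Y+Z; 2X+Y -> 3X
    dY = k3 * B * X - k4 * X**2 * Y - k5 * Y  # Y -> O disposal
    return G + dG * dt, X + dX * dt, Y + dY * dt

G = X = Y = 0.0
for _ in range(20000):                        # integrate to t = 20
    G, X, Y = model_g_step(G, X, Y, dt=0.001)

print(round(G, 3), round(X, 3), round(Y, 3))
```

Note that with spatially uniform concentrations like this, the system simply relaxes to a steady state; the nonuniform distributions the theory emphasizes would require adding diffusion terms (making it a reaction-diffusion system), which this sketch omits.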

It is the supply of A and B particles that drives this whole process. These particles are seemingly inexhaustible. LaViolette states that A particles come from A' particles transforming into A, and that A' particles come from A'' particles transforming into A', and so on ad infinitum. There is a similar infinite chain supplying B particles, and infinite disposal chains cleaning up the waste particles, O and Z:

    ...  A''  ->  A'  ->  A  ->  G  ->  X  <-----  Y  ->  O  ->  O'  ->  O''  ...
                                         \->    ->/
                                            \  /
    ...  B''  ->  B'  ->  B  ----------------==-------->  Z  ->  Z'  ->  Z''  ...

One reaction listed above is the conversion of A into G. While A particles are essentially nonmaterial in the usual sense, a nonuniform distribution of G is what creates gravity. Thus a gravitational field can occur without any matter nearby, although it is unstable.

Quite possibly the first stable particle of matter was a proton. A proton consists of a dense concentration of Y, surrounded by a shell of X, surrounded by a shell of Y, surrounded by a shell of X, and so on. While the X's and Y's are particles, the pattern they form is wavelike, giving protons (and other subatomic particles) their wavelike characteristics.

Protons are quite stable, and they have mass, which means they create gravity fields of their own (due to the effect they have on nearby G particles). And gravity fields do encourage particles to remain stable, so the entire unit tends to be self-perpetuating.

Anti-protons, while possible, do not have the stability of protons and so decay quickly. Therefore, the universe is made up primarily of matter and not antimatter.

Protons and their associated gravity fields encourage other particles to form and remain stable. While a proton in no way "gives birth" to other particles, the zone of gravitic stability surrounding them does lead to additional creation.

An electromagnetic wave is nothing more than a periodic variation of X and Y in one place, nudging the same variation to occur in an adjacent place, and so on and so on, causing the wave to "travel".

Creation of matter and energy, in contradiction of the Laws Of Thermodynamics, continues today, and is most intense in areas of strongest gravity. Sub-Quantum Kinetics holds that there was no "big bang", that the universe is not expanding, and that the red shift of light from distant galaxies is due to the light itself changing as it travels through the interstellar void (the "tired light" theory). It also says the universe is not "running down" (entropy can decrease as well as increase), that there is no black hole at the center of our galaxy (only a huge mass where a lot of creation is going on), and that there is no missing "dark matter".

Sub-Quantum Kinetics does not accept the Copenhagen interpretation of the Heisenberg Uncertainty Principle. That is to say, according to LaViolette every particle has a definite position and velocity, which is uncertain (to us) only because of our crude measuring tools. The process of observing an experiment does not cause it to manifest one way or another (although the way we observe it does influence what we can see).

Jennifer
  • 123
  • 4
  • 6
    It's not a useful answer if it name-drops a not-so-well-known proposal without explaining what it entails, especially when not even a Wikipedia page shows up upon a Google search of the name. In addition, PSE exclusively deals with mainstream physics, does the proposal you mention fall under this criterion? –  Jun 03 '20 at 15:38
  • @DvijD.C. Without explaining what it entails? Like I said, I'd be glad to explain it but your comment doesn't qualify as asking me nicely. Furthermore, if QM ever does turn out to be wrong, you'd never know this because you're blind to possible alternatives. – Jennifer Jun 04 '20 at 13:50
  • 4
    If your answer was meant to be a comment that you'd like to provide a comprehensible answer upon nicely worded request, you should have posted it as a comment on the question--not as an "answer". Usually, a well-formulated question post is in itself understood to be a nicely worded request to provide comprehensive answers. –  Jun 04 '20 at 14:12
  • Yeah, this is a bit bizarre. The question is about "reconcile this perceived fracture of Occam's Razor to this aspect of Quantum Mechanics." The answer posted here is, "That's because Quantum Mechanics is wrong, Theory XYZ is correct instead." Okay... except you're not actually giving arguments as to why. Instead, you're denouncing everyone that doesn't already agree with your conclusion as "blind" and "far too accepting." – Kevin Jun 04 '20 at 19:37
  • @DvijD.C. I'm not surprised that I'm being attacked for giving a sensible, if not exactly popular answer. Why bother explaining something if people aren't listening? My offer stands: I'll explain more about SQK as I have understood it if someone asks me nicely. – Jennifer Jun 05 '20 at 13:57
  • @Kevin I never said Quantum Mechanics is wrong. I did say that Occam's Razor works contrary to QM because OR looks for a simple explanation and QM has never been simple. SQK, on the other hand, is simple so there's a good chance it may win out in the end. Supporting QM simply because it is QM and already has a lot of supporters is like supporting the flat-earth theory back in the day. – Jennifer Jun 05 '20 at 14:01
  • OK, can you kindly provide a more detailed explanation of SQK as an addendum to your answer? That'd be really helpful. –  Jun 05 '20 at 14:25
  • 2
    Okay, you got me to bite. Googled the author to see what the heck is going on. In 'Genesis of the Cosmos', he wrote: “…the Tarot metaphorically encodes the same process-based creation metaphysics conveyed in the myth of Osiris… [With an understanding of] the emergence of ordered patterns in non equilibrium systems, we can now for the first time resurrect the Tarot’s ancient wisdom” He rejects relativity, restores absolute Newtonian spacetime, and reintroduces the concept of an 'ether'. Does... SQK have any sort of peer review, reproducibility, or such? Sounds kinda conspiratorial, tbh. – Kevin Jun 05 '20 at 14:55
  • 2
    ... and he claimed he was able to set up artificial gravity fields via 'Electrogravity' in 1993, with the US military using it in the B2 bomber. I'll give him this, though: he's got the most awesome name for a scientific abstract paper that I've ever seen: "The U.S. Antigravity Squadron". – Kevin Jun 05 '20 at 15:27
  • 2
    But, yeah, it's pretty much 'Conspiracy Science' - his book in 2008 was claiming that there's a huge NASA cover-up blocking the knowledge of antigravity propulsion that's thousands of times better than jet engines, which uses Subquantum Kinetics and Electrogravity. In his 2006 book, that pulsars are actually a galactic alien communication network and that crop circles and alien beaming technology are real. – Kevin Jun 05 '20 at 15:27
  • @DvijD.C. How could I refuse such a polite request? Okay, I added it to my post (above). – Jennifer Jun 05 '20 at 18:20
  • @Kevin and others -- I've never been a fan of Genesis Of The Cosmos. A book I liked much better was Secrets Of Antigravity Propulsion where he describes apparatus most reasonably scientifically competent people can build. Then you can see for yourself whether it's possible to have an action without an equal and opposite reaction (oh my gosh! heresy! somebody ban this *****!) But seriously folks, you can try it and see. – Jennifer Jun 05 '20 at 18:40
  • 2
    I think it's kinda ironic that this question is about Occam's Razor, and we've ventured into "A public book holds a schematic for a thousand-fold improvement on engine design that 'any reasonably scientifically competent person can build', but the reason we don't see them in use everywhere is because every single one of thousands of engine manufacturers all decided they hate profits and would rather forego trillions of dollars so they could be in a coverup"... being a more likely theory than "the guy that wrote a book about crop circles, alien messages, and tarot wisdom is wrong." – Kevin Jun 05 '20 at 21:26
  • 1
    Merits of this alternate theory aside, it's unclear how this answers the question in the OP. – Chris Jun 07 '20 at 02:46
  • Need I say it yet again? Occam's Razor says the simplest explanation is most likely the correct one, and Quantum Mechanics is anything but simple. I intended to present Sub-Quantum Kinetics as only being simpler than Quantum Mechanics, but various people challenged me to the point where I had to defend it. I was led down a path that I never intended to travel. My original point stands: if you apply Occam's Razor to Quantum Mechanics, you'll wind up throwing out 90% of Quantum Mechanics. – Jennifer Jun 07 '20 at 04:30