Celebrating David Deutsch’s 70th Birthday. In memory of our mutual supervisor Dennis Sciama and our graduate student colleague John Barrow, latterly Professor of Cosmology at DAMTP.
For all your incredible achievements, David, I have beaten you to one thing, albeit only by a few months: reaching the ripe old age of 70. I have followed your illustrious career since we first got to know each other as graduate students in the Sciama Hut in the 1970s. It’s amazing to reflect on how many of us in the Hut subsequently became Fellows of the Royal Society - our mutual supervisor Dennis Sciama certainly had a golden touch. Indeed this was also a golden age for black hole physics and it was exciting to see ideas around Hawking evaporation and black-hole thermodynamics take shape in real time. We almost wrote a paper together on transfinite causality conditions in relativistic space-times. I suspect the fact we didn’t quite finish the paper reflects badly on me: in truth, I was getting a little outside my zone of mathematical competence. In the end we both completed our theses in somewhat different fields: you on quantum field theory in curved space-time, me on the use of bitensors (more generally tangent-bundle geometry) to formulate a generic quasi-local solution - maybe the first, I’m not sure - to the gravitational energy-momentum problem in general relativity.
Whilst Dennis was always a source of inspiration, I suspect his light-touch approach to me was not dissimilar to that for you - he let us follow our own noses. However, I remember one day when he did try and steer me in a certain direction. He came into my office telling me how frustrated he was that Hawking’s paper on black-hole evaporation was so arcane - all those hypergeometric Green’s functions that somehow magically turn out to describe Planckian radiation when evaluated at future null infinity. There must be a simpler way to show this, he said to me, as I am sure he must have said to you. And then one day, Dennis more or less instructed me to find out if the Principle of Maximum Entropy Production was going to provide the route to a more intuitive understanding of Hawking radiation. I didn’t like to tell him I had never heard of said principle, and disappeared into the Radcliffe Science Library to find some dusty old books on non-equilibrium thermodynamics. I got absolutely nowhere with the problem.
However, this episode teed me up for a life-changing experience. Although I had been successful in applying for a postdoc position to work in Hawking’s group at DAMTP, for a number of reasons I had a slight nagging doubt whether this was the path I should be taking. By complete chance I met Raymond Hide, an ex-President of both the Royal Astronomical Society and the Royal Meteorological Society. I asked him what was new in climate science and he told me about a paper which had excited him, showing how properties of Earth’s climate were derivable from the Principle of Maximum Entropy Production. It felt like some higher power was telling me what to do with my life. So, after much angsting, I turned down the Cambridge offer.
As it happens the Principle of Maximum Entropy Production has had as much influence on my climate career as it had on my doctoral research in GR – zero. However, what has had an influence is chaos theory, which I came to learn a lot about after my switch to climate physics. I feel very strongly [15] that the key discovery of MIT meteorologist Ed Lorenz in the 1960s was not so much the butterfly effect (which everyone knows about now, even Gwyneth Paltrow), but the notion that simple sets of nonlinear differential equations – ones that Isaac Newton would have readily understood – could generate a state-space geometry, fractal geometry, that would have been utterly alien to Newton. Indeed, the state-space geometry of these chaotic systems is formally uncomputable [4] and undecidable propositions like the Halting Problem can be reformulated as problems in fractal geometry [6]. Moreover, p-adic numbers, so vital in number theory these days, are to fractal geometry as real numbers are to Euclidean geometry [10].
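Lorenz’s discovery is easy to demonstrate in code. Here is a minimal sketch (my own illustration, not from the text): a crude Euler integration of the Lorenz-63 equations with the standard parameters, showing a perturbation of one part in 10^8 eventually growing to the size of the attractor.

```python
# Lorenz-63: three elementary nonlinear ODEs whose trajectories trace a fractal
# attractor.  Crude Euler integration, standard parameters -- illustration only.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def max_separation(n, eps=1e-8):
    """Largest distance reached between two trajectories started eps apart."""
    a, b = (1.0, 1.0, 1.0), (1.0 + eps, 1.0, 1.0)
    biggest = 0.0
    for _ in range(n):
        a, b = lorenz_step(a), lorenz_step(b)
        sep = sum((u - v) ** 2 for u, v in zip(a, b)) ** 0.5
        biggest = max(biggest, sep)
    return biggest

# The butterfly effect: the tiny initial difference is amplified exponentially.
print(max_separation(10))     # still tiny shortly after the start
print(max_separation(5000))   # grows to the scale of the attractor itself
```

The equations themselves are ones Newton would have recognised; it is the geometry of the set the trajectories settle onto that is alien.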
In 1987, the 300th Anniversary of Newton’s Principia, Stephen Hawking and Werner Israel edited a book comprising a series of contributions by relativists including one by our mutual Oxford teacher and mentor Roger Penrose. I remember reading Roger’s chapter in Blackwell’s in Oxford. Although I had left the rarefied heights of GR 10 years earlier, Penrose’s article brought back my old world to me in an instant. Roger’s article mentioned Bell’s Theorem, and I remember thinking that that was something I never really got to the bottom of when I was Dennis’s student. I started reading a few pedagogical books and papers on the subject in my spare time, and realised that applying ideas from the fractal geometry of chaos could lead to a completely novel interpretation of Bell’s Theorem. I published a paper in Proc. Roy. Soc. [12] claiming that one could formulate a locally causal model of quantum spin, and got back to my day job.
From time to time I would try to discuss my ideas with quantum foundations experts, and to my surprise I found it very difficult to persuade them that my approach to the interpretation of Bell’s Theorem was viable. This difficulty continues to the present day. Since I have been thinking about it for so long now, I have been able to refine the argument down to a few basic points. Let me try my idea out on you. Tell me what you think!
So, I claim that quantum physics is underpinned by some deterministic locally causal model. You may respond that if that were so, Bell’s inequality would not be experimentally violated. I will reply that this does not follow because another assumption is needed to derive Bell inequalities. This is usually referred to as Statistical Independence. You might respond (as many do) that violating Statistical Independence would conjure into existence some crazy conspiratorial processes (e.g. that the experimenters’ minds are under the control of the particles’ hidden variables) that would effectively signal the end of science as we know it. Well, I’m afraid I have to tell you that I think that argument is complete nonsense and this is why.
First off, I think Statistical Independence is a lousy description of the technical assumption ρ(λ|XY) = ρ(λ) in Bell’s Theorem, which I would rather describe as the Counterfactual Definiteness assumption. Here, as you know, λ is a hidden variable and X, Y ∈ {0, 1} are Alice and Bob’s nominal measurement settings.
To see how Counterfactual Definiteness plays a role in Bell’s Theorem, suppose Alice and Bob estimate experimentally the four individual correlations in (the CHSH version of) Bell’s inequality, on Monday, Tuesday, Wednesday and Thursday respectively. Now consider the question: could Monday’s particles have been measured with Tuesday, Wednesday and Thursday’s polariser settings? Now manifestly you can’t actually perform these measurements since Monday’s particles were absorbed by the measurement devices on Monday - they aren’t available to be measured on Tuesday or later. That is to say, these are counterfactual measurements, and so the answer to the question depends on whether these counterfactual measurements are consistent with the laws of physics, as encoded in one’s putative deterministic locally causal model of quantum physics. Importantly, all three of these counterfactual measurements must be consistent with the putative deterministic locally causal hidden-variable model for the model to satisfy the Bell inequality.
A traditional classical hidden-variable model assumes a formula which, for a given λ, spits out a spin value for any measurement-setting input. For such a model, the outcomes of these counterfactual measurements are indeed necessarily well defined. That’s what most people think must be the case when they consider hidden-variable models. So my concern about the well-definedness of counterfactuals leaves them puzzled. What sort of deterministic model could deny the well-definedness of such counterfactuals?
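For concreteness, here is a toy example of such a traditional model (my own illustrative construction, with an arbitrary sign rule): the hidden variable is a shared phase λ, and a deterministic formula returns ±1 for any setting whatever, so all four CHSH correlations – factual and counterfactual – are automatically well defined, and the CHSH combination can never exceed 2 in magnitude.

```python
import math, random

# Toy 'traditional' deterministic hidden-variable model (illustration only).
# For each hidden variable lam, a fixed sign rule returns +-1 for ANY setting,
# so every counterfactual outcome is well defined.

def outcome(lam, setting):
    return 1 if math.cos(lam - setting) >= 0 else -1

def chsh(n=100_000, seed=1):
    # Settings for the four CHSH correlations; the SAME lam feeds all four,
    # which is exactly the Counterfactual Definiteness assumption.
    a0, a1 = 0.0, math.pi / 2
    b0, b1 = math.pi / 4, 3 * math.pi / 4
    rng = random.Random(seed)
    S = 0.0
    for _ in range(n):
        lam = rng.uniform(0.0, 2.0 * math.pi)   # shared hidden variable
        S += (outcome(lam, a0) * outcome(lam, b0)
              + outcome(lam, a0) * outcome(lam, b1)
              + outcome(lam, a1) * outcome(lam, b0)
              - outcome(lam, a1) * outcome(lam, b1))
    return S / n

S = chsh()
print(S)  # lies in [-2, 2]: the CHSH bound for such models
```

Because each trial evaluates all four products with the same λ, the per-trial CHSH combination is always ±2, so the average can never beat the classical bound of 2 – whereas quantum mechanics reaches 2√2.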
I don’t want to consider a traditional hidden-variable model. Instead I want to consider a non-classical deterministic hidden-variable model. I call this model non-classical because fractal attractors were discovered long after quantum mechanics. So, let’s imagine that the universe is itself a deterministic dynamical system evolving on some fractal chaotic attractor (for the present purposes, a complicated measure-zero dynamically invariant set) – I don’t think there are any cosmological observations that would flat-out contradict such an assumption. If the universe were evolving on such a measure-zero set, then a counterfactual measurement would be inconsistent with the putative laws of physics if the associated counterfactual state of the universe lay in a fractal gap, off the invariant set. What I was able to show with a plausible model consistent with this type of dynamics (see [8, 13, 14]; I am improving the rigour of the argument considerably in a new paper) is that at least one of the three counterfactual measurement scenarios (Monday’s particles – Tuesday, Wednesday or Thursday’s measurement settings) lies off the invariant set and hence is inconsistent with our putative laws of physics.
At this point, you may start to worry whether experimenters have the freedom to choose their measurement settings, according to this model. Experimenters hate to be told they are not free to choose how to do their experiments! As mentioned, they bring up all the old arguments that if they are not free, then the world has to be conspiratorial, which they (rightly of course) find unacceptable. My response to this is: Whoa, hold your horses – who said you are not free to choose? In my model, the nominal measurement settings X, Y are completely under the control of the experimenters: for example they may have engineered pseudo-random number generators with buttons which they can press to reveal either a 0 or a 1, or they can look at the Dow Jones index to determine what values of X and Y to use. I don’t care how the choices are made. However, in my model, it is super-vital to distinguish between nominal measurement settings and exact measurement settings. Suppose Alice and Bob press their buttons to set up a run with the X = 0, Y = 0 measurement settings. Because they are nominal settings, there is for all practical purposes an infinite set of possible exact settings consistent with each of the nominal settings. For any particular run of the experiment, the exact settings will be sensitive to the phase of the moons of Jupiter [2] or to any gravitational waves that might be passing.
In my model, whatever the nominal settings X and Y , the exact settings must satisfy what I call a rationality constraint: specifically that the cosine of the angle between Alice’s exact setting and Bob’s exact setting must be a rational number. Importantly, even though they have control over the nominal settings, Alice and Bob have no control over this rationality constraint; it is an unavoidable consequence of my putative law of quantum physics. This condition is not a manifestation of nonlocality because, since it is intrinsic to the putative geometric laws of physics, the rationality constraint cannot be violated by events which happen in space-time. That is to say, despite the rationality condition, the model satisfies the local causality condition in space-time, whereby Bob’s spin up/down measurement outcome never depends on Alice’s measurement setting X (and vice versa). Instead, the rationality condition describes a global geometric constraint in state space rather than space-time. This does have major implications for how we should formulate our laws of physics, which I discuss below.
In considering the counterfactual measurements where Alice and Bob selected other than they did (i.e. other than X = 0, Y = 0 in the case above), we keep the particles’ hidden variables fixed and vary X and Y . But what are these hidden variables? In my model, the hidden variables for any one of Alice’s particles correspond to one (from an effectively infinite set of) possible pair of exact settings, relative to the nominal settings X = 0 and X = 1. Similarly for Bob’s particles. To repeat, these hidden variables represent degrees of freedom (like the phases of the moons of Jupiter or gravitational waves from a distant black-hole merger) over which the experimenters have no control whatsoever. Just as the phases of the moons of Jupiter do not force Alice and Bob to choose which button to press, neither do the hidden variables.
If this sounds like a strange idea, think about a newborn baby. Will this baby, in later life, climb Mount Everest or win the Nobel Prize? Some of the information needed to answer these questions will be encoded in the baby’s DNA – that’s a bit like a traditional hidden-variable model where the hidden variables are somehow localised to the particle. But most of the information needed to determine the baby’s achievements won’t be so encoded; whether the baby ends up climbing Mount Everest or winning the Nobel Prize will depend on chance encounters in life. All one can say (assuming a relativistic deterministic world) is that the information needed to answer these questions about the baby lies in the intersection of a spacelike hypersurface going through the birth event, with the interior of the past light cone of the baby’s death event. Similarly here: the support of a particle’s hidden variables at the time the particle is created lies in the intersection of a spacelike hypersurface going through the creation event with the interior of the past light cone of the particle’s measurement event. Again, all locally causal and hunky dory (and, by the way, not requiring any notion of retrocausality).
We now come to a non-trivial pivotal result which is a consequence of number theory, specifically Niven’s Theorem [11], which states that the cosine of a rational angle (in degrees) is almost always irrational. From this and the rationality constraint, we can prove the following: take four points on the sphere which we will call 0A, 0B, 1A and 1B and join the four pairs (0A,0B), (0A,1B), (1A,0B) and (1A,1B) by four great circles. Here 0A corresponds to the exact setting consistent with Alice’s nominal measurement setting X = 0 and a given fixed value of λ, etc. By Niven’s Theorem, it is impossible for the cosines of the four angular distances all to be rational. Just to be clear, it is easy to find three pairs of points (one actual and two counterfactual) which satisfy the rationality constraint. But not four pairs of points. From this we can deduce that at least one of the three counterfactual pairs of measurements that contribute to the CHSH inequality for a fixed λ is inconsistent with the rationality constraint and hence with our putative deterministic locally causal laws of physics. Because of this, our deterministic locally causal theory is not bound to satisfy the Bell inequality. Once again, I stress that in this model, the selection of the nominal measurement settings is under the full control of the experimenters. There simply are no weird ‘alien mind control’ conspiracies which prevent them from choosing as they like.
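How rarely a rational-degree angle has a rational cosine is easy to check numerically. The following sketch (an illustration under the stated tolerance and denominator bound, not a proof) scans the integer-degree angles and finds that the only ones whose cosine is close to a small-denominator rational are those with cosine 0, ±1/2 or ±1 – exactly the exceptional cases of Niven’s Theorem:

```python
from fractions import Fraction
from math import cos, radians

# Numerical illustration of Niven's Theorem: among angles that are a whole
# number of degrees, the only rational values of the cosine are 0, +-1/2, +-1.
# 'Rational' is tested as: within 1e-9 of a fraction with denominator <= 50.

def near_rational(x, max_den=50, tol=1e-9):
    f = Fraction(x).limit_denominator(max_den)
    return f if abs(f - x) < tol else None

hits = {}                       # rational cosine value -> angles (in degrees)
for deg in range(361):
    f = near_rational(cos(radians(deg)))
    if f is not None:
        hits.setdefault(f, []).append(deg)

print(sorted(hits))  # only -1, -1/2, 0, 1/2 and 1 appear
```

Every other integer-degree angle has an irrational cosine, which is what makes it impossible to satisfy the rationality constraint for all four great-circle distances simultaneously.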
Now you may ask, yes this is all very well, but does your model satisfy the Tsirelson bound? I will answer yes it does, and say that the rationality constraint arises when you discretise complex Hilbert space in a certain way consistent with my invariant set model. That is to say, in my model, the complex Hilbert space of quantum theory arises as an approximation. Experimentally it is a good approximation – indeed, as good as you like, by making the discretisation scale sufficiently small. However, theoretically, continuum Hilbert space is a singular limit [3] – and not a smooth limit – as the discretisation scale goes to zero. I won’t dwell on this here as it is something I am trying to write up in the coming weeks. But the bottom line is: yes, my model satisfies the experimentally determined Tsirelson bound.
So what’s the big message behind this interpretation of the violation of Bell inequalities? It is not that the world is indeterministic or nonlocal (i.e. not locally causal). And we don’t need wormholes to somehow short-circuit long-distance correlations. The implications are even more astonishing than ‘mere’ indeterminism, nonlocality, retrocausality or wormholes, since they have major implications for how we should be looking for a theory of quantum gravity. The implication is that what I would call ‘spatial reductionism’ – the idea that to get a more fundamental perspective on the laws of physics, we must look at processes on smaller and smaller scales – may actually be wrong, even though this principle has stood us in good stead over past centuries of scientific research. For example, there is a belief that once we can probe the Planck scale experimentally, we’ll finally be able to understand quantum gravity. I don’t believe this. It may instead be that the fundamental laws of quantum gravity are based as much on equations for the state-space geometry of the universe as a whole as on physics at the Planck scale. In some sense the Planck scale may simply be the Yang to the Yin of the universe as a whole. Put another way, the laws of physics may ultimately turn out to be as much top-down as bottom-up [14]. Julian Barbour [1] and George Ellis [7] think similarly about this (even though they may not endorse the details of my invariant set model) – what is it about us old timers?! (Gosh, how I would love to know how these ideas fit into your Constructor framework. From what I have read and heard, I think there must be some links.)
Looking back at the rather disparate topics that I have worked on over my scientific career, I think the one thing that they have in common is nonlinearity. Whether working on gravitational energy-momentum in GR, discovering the world’s largest (Rossby) breaking waves in the stratosphere, developing ensemble-based estimates of weather and climate predictability, or proposing this particular interpretation of Bell’s Theorem, nonlinearity has been the common thread. I do believe strongly that nonlinearity makes things conceptually simple (even though it may make things more computationally difficult).
However, regarding Bell’s Theorem, you and I may disagree. You are a proponent of the Everettian interpretation [5], which takes the linear Schrödinger equation quite literally. There are some things about Everett which appeal to me. One area where I agree with you 100% is that the physical resource that quantum computers tap into to get their exponential advantage over classical computers is the physical reality of parallel worlds. However, you may argue that since the Schrödinger equation is linear and is well verified by experiment, how can I claim that the quantum world is nonlinear? I would respond that in classical chaos theory, the Liouville equation for the evolution of probability is precisely linear in probability density (as it must be if probability is conserved), even though the probabilities are themselves generated by ensembles of states which individually evolve under nonlinear deterministic dynamics. The close formal similarity between the classical Liouville equation and the Schrödinger equation (for Hamiltonian systems at least) screams out to me that there must be nonlinear deterministic dynamics underpinning the Schrödinger equation. For that reason, I think that the Everettian interpretation is ultimately wrong: the Schrödinger equation simply cannot be the last word on the subject and should not be taken literally. For example, in invariant set theory, there is no splitting or branching of worlds; they merely diverge exponentially on the invariant set as a result of what we might call decoherence.
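The Liouville point can be illustrated with a discrete analogue (my own sketch, using the logistic map rather than any model from the text): although the map itself is nonlinear, the transfer (Frobenius–Perron) operator that evolves probability densities under it is exactly linear.

```python
import random

# The map x -> 4x(1-x) is nonlinear, but the operator that evolves probability
# densities under it is linear -- a discrete analogue of the Liouville equation
# being linear in probability density.  Build the transfer matrix on 50 bins
# of [0,1] by Monte Carlo, then verify linearity directly.

N = 50

def build_transfer(samples_per_bin=2000, seed=0):
    rng = random.Random(seed)
    T = [[0.0] * N for _ in range(N)]
    for i in range(N):
        for _ in range(samples_per_bin):
            x = (i + rng.random()) / N      # random point in source bin i
            y = 4.0 * x * (1.0 - x)         # the nonlinear map
            j = min(int(y * N), N - 1)      # destination bin
            T[j][i] += 1.0 / samples_per_bin
    return T

def apply(T, p):
    return [sum(T[j][i] * p[i] for i in range(N)) for j in range(N)]

T = build_transfer()
p = [1.0 / N] * N                           # uniform density
q = [0.0] * N
q[10] = 1.0                                 # density concentrated in one bin

# Linearity: evolving a mixture of densities equals the mixture of evolutions.
mix = [0.3 * u + 0.7 * v for u, v in zip(p, q)]
err = max(abs(u - v) for u, v in
          zip(apply(T, mix),
              [0.3 * u + 0.7 * v for u, v in zip(apply(T, p), apply(T, q))]))
print(err)  # zero up to floating-point rounding
```

Each individual trajectory evolves nonlinearly, yet the density evolution is a plain matrix multiplication: linearity of the probability equation implies nothing about linearity of the underlying dynamics.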
After our PhDs (D.Phils) our paths diverged. Your pioneering work on quantum algorithms has created a new field of technology. My work in weather and climate physics, whilst not as groundbreaking, has led to a new way of making predictions which is having an influence around the world in the way humanitarian agencies respond to possible extreme weather events. By having ensemble-based quantitative estimates of forecast uncertainty, they can now decide when to take Anticipatory Action, sending food, medicine, shelter and finance ahead of a natural event hitting some region. This is so much better than the old days when these agencies would simply wait for the weather event to hit (because single deterministic forecasts were too unreliable). I describe this, along with my interpretation of Bell’s Theorem, in my popular book The Primacy of Doubt [15], which I hope will be successful, but will never rival the runaway success of The Fabric of Reality [5].
But we may be converging again – I seem to be returning to my graduate-student roots as I get older and older. Didn’t Shakespeare have something to say about that in his ages-of-man speech? Since our days in the Sciama Hut, the fields of foundations of quantum physics, and arguably quantum gravity, have advanced only modestly. A few weeks ago I watched a YouTube video made in 1986 where Dennis Sciama and Ed Witten were discussing the then nascent string theory (Witten had just won the Dirac medal at ICTP). Dennis was sceptical that advances in basic physics could be made based purely on mathematical elegance – a theme developed by Sabine Hossenfelder [9] – and reminded Witten that the original basis for GR was not pseudo-Riemannian geometry but thinking about hypothetical elevators in deep space. Ed clearly felt differently and expressed the hope that quantum theory would somehow be emergent from the elegant mathematics of string theory. Well that hasn’t happened! I think Dennis was bang on the money. I’m glad I made the switch to climate science when I did, as I suspect, had I gone to Cambridge, I would have got embroiled in the mathematical nitty-gritty of string theory’s predecessor, supergravity, or something like that, and I would not have made any significant progress at all. After all, I didn’t know anything about fractal geometry in those days – it took a switch to an applied-science field to learn it. You have been successful by also decoupling yourself from mainstream academia. Good on you! Is there something to learn from this?
I’ll leave you with one of my favourite quotes from the always-inspirational Roger Penrose [16] - who, as you well know, became a relativist (and as a result won the Nobel Prize) by meeting Dennis in Cambridge:
My own view is that to understand [so-called] quantum non-locality we shall require a radical new theory. This theory will not just be a slight modification of quantum mechanics but something as different from standard quantum mechanics as General Relativity is different from Newtonian gravity.
Yep! I agree with that. You? Happy Birthday, David.
References
[1] J. Barbour. Quantum without Quantum. This Volume, 2023.
[2] J.S. Bell. Free variables and local causality. Dialectica, 39:103, 1985.
[3] M. V. Berry. Singular limits. Physics Today, 55:10–11, 2002.
[4] L. Blum, F.Cucker, M.Shub, and S.Smale. Complexity and Real Computation. Springer, 1997.
[5] D. Deutsch. The Fabric of Reality. Penguin Books, 1998.
[6] S. Dube. Undecidable problems in fractal geometry. Complex Systems, 7:423–444, 1993.
[7] G.F.R. Ellis. Top-down causation and quantum physics. Proceedings of the National Academy of Sciences, 115:11661–11663, 2018.
[8] Jonte R. Hance, Sabine Hossenfelder, and Tim N. Palmer. Supermeasured: Violating Bell-statistical independence without violating physical statistical independence. Foundations of Physics, 52(4):81, Jul 2022.
[9] S. Hossenfelder. Lost in Math. Basic Books, 2018.
[10] S. Katok. p-adic Analysis compared with Real. American Mathematical Society, 2007.
[11] I. Niven. Irrational Numbers. The Mathematical Association of America, 1956.
[12] T.N. Palmer. A local deterministic model of quantum spin measurement. Proc. Roy. Soc., A451:585–608, 1995.
[13] T.N. Palmer. Discretization of the Bloch sphere, fractal invariant sets and Bell’s theorem. Proc. Roy. Soc., https://doi.org/10.1098/rspa.2019.0350, arXiv:1804.01734, 2020.
[14] T.N. Palmer. Bell’s theorem, non-computability and conformal cyclic cosmology: A top-down approach to quantum gravity. AVS Quantum Sci., https://doi.org/10.1116/5.0060680, 2021.
[15] T.N. Palmer. The Primacy of Doubt. Oxford University Press, 2022.
[16] R. Penrose. The Large, the Small and the Human Mind. Cambridge University Press, 1997.
For all your incredible achievements, David, I have beaten you to one thing, albeit only by a few months: reaching the ripe old age of 70. I have followed your illustrious career since we first got to know each other as graduate students in the Sciama Hut in the 1970s. It’s amazing to reflect on how many of us in the Hut subsequently became Fellows of the Royal Society - our mutual supervisor Dennis Sciama certainly had a golden touch. Indeed this was also a golden age for black hole physics and it was exiting to see ideas around Hawking evaporation and black-hole thermodynamics take shape in real time. We almost wrote a paper together on transfinite causality conditions in relativistic space-times. I suspect the fact we didn’t quite finish the paper reflects badly on me: in truth, I was getting a little outside my zone of mathematical competence. In the end we both completed our theses in somewhat different fields: you on quantum field theory in curved space- time, me on the use of bitensors (more generally tangent-bundle geometry) to formulate a generic quasi-local solution - maybe the first, I’m not sure - to the gravitational energy momentum problem in general relativity.
Whilst Dennis was always a source of inspiration, I suspect his light-touch approach to me was not dissimilar to that for you - he let us follow our own noses. However, I remember one day when he did try and steer me in a certain direction. He came into my office telling me how frustrated he was that Hawking’s paper on black-hole evaporation was so arcane - all those hypergeometric Green’s functions that somehow magically turn out to describe Planckian radiation when evaluated at future null infinity. There must be a simpler way to show this, he said to me, as I am sure he must have said to you. And then one day, Dennis more or less instructed me to find out if the Principle of Maximum Entropy Production was going to provide the route to a more intuitive understanding of Hawking radiation. I didn’t like to tell him I had never heard of said principle, and disappeared into the Radcliffe Science Library to find some dusty old books on non-equilibrium thermodynamics. I got absolutely nowhere with the problem.
However, this episode teed me me up for a life-changing experience. Although I had been successful in applying for a postdoc position to work in Hawking’s group at DAMTP, for a number of reasons I had a slight nagging doubt whether this was the path I should be taking. By complete chance I met Raymond Hide, an ex President of both the Royal Astronomical Society and the Royal Meteorological Society. I asked him what was new in climate science and he told me about a paper which had excited him, showing how properties of Earth’s climate were derivable from the Principle of Maximum Entropy Production. It felt like some higher power was telling me what to do with my life. So, after much angsting, I turned down the Cambridge offer.
As it happens the Principle of Maximum Entropy Production has had as much influence on my climate career as it had on my doctoral research in GR – zero. However, what has had an influence is chaos theory, which I came to learn a lot about after my switch to climate physics. I feel very strongly [15] that the key discovery of MIT meteorologist Ed Lorenz in the 1960s was not so much the butterfly effect (which everyone knows about now, even Gwyneth Paltrow), but the notion that simple sets of nonlinear differential equations – ones that Isaac Newton would have readily understood – could generate a state-space geometry, fractal geometry, that would have been utterly alien to Newton. Indeed, the state-space geometry of these chaotic systems is formally uncomputable [4] and undecidable propositions like Hilbert’s Halting Problem can be reformulated as a problem in fractal geometry [6]. Indeed p-adic numbers, so vital in number theory these days, are to fractal geometry as real numbers are to Euclidean geometry [10].
In 1987, the 300th Anniversary of Newton’s Principia, Stephen Hawking and Werner Israel edited a book comprising a series of contributions by relativists including one by our mutual Oxford teacher and mentor Roger Penrose. I remember reading Roger’s chapter in Blackwell’s in Oxford. Although I had left the rarified heights of GR 10 years earlier, Penrose’s article brought back my old world to me in an instant. Roger’s article mentioned Bell’s Theorem, and I remember thinking that that was something I never really got to the bottom of when I was Dennis’s student. I started reading a few pedagogical books and papers on the subject in my spare time, and realised that applying ideas from the fractal geometry of chaos could lead to a completely novel interpretation of Bell’s Theorem. I published a paper in Proc. Roy. Soc. [12] claiming that one could formulate a locally causal model of quantum spin, and got back to my day job.
From time to time I would try to discuss my ideas with quantum foundations experts, and to my surprise I found it very difficult to persuade them that my approach to the interpretation of Bell’s Theorem was viable. This difficulty continues to the present day. Since I have been thinking about it for so long now, I have been able to refine the argument down to a few basic points. Let me try my idea out on you. Tell me what you think!
So, I claim that quantum physics is underpinned by some deterministic locally causal model. You may respond that if that were so, Bell’s inequality would not be experimentally violated. I will reply that this does not follow because another assumption is needed to derive Bell inequalities. This is usually referred to as Statistical Independence. You might respond (as many do) that violating Statistical Independence would signal into existence some crazy conspiratorial processes (e.g. that the experimenters’ minds are under the control of the particles’ hidden variables) that would effectively signal the end of science as we know it. Well, I’m afraid I have to tell you that I think that argument is complete nonsense and this is why.
First off, I think Statistical Independence is is a lousy description of the technical assumption ρ(λ|XY ) = ρ(λ) in Bell’s Theorem, which I would rather describe as the Counterfactual Definiteness assumption. Here, as you know, λ is a hidden variable and X, Y ∈ {0, 1} are Alice and Bob’s nominal measurement settings.
To see how Counterfactual Definiteness plays a role in Bell’s Theorem, suppose Alice and Bob estimate experimentally the four individual correlations in (the CHSH version of) Bell’s inequality, on Monday, Tuesday, Wednesday and Thursday respectively. Now consider the question: could Monday’s particles have been measured with Tuesday, Wednesday and Thursday’s polariser set- tings? Now manifestly you can’t actually perform these measurements since Monday’s particles were absorbed by the measurement devices on Monday - they aren’t available to be measured on Tuesday or later. That is to say, these are counterfactual measurements, and so the answer to the question depends on whether these counterfactual measurements are consistent with the laws of physics, as encoded in one’s putative deterministic locally causal model of quantum physics. Importantly, it must be that all three of these counterfactual measurements are consistent with the putative deterministic locally causal hidden-variable model, for the model to satisfy the Bell inequality.
A traditional classical hidden-variable model assumes a formula which, for a given λ, spits out a spin value for any measurement-setting input. For such a model, the outcomes of these counterfactual measurements are indeed necessarily well defined. That’s what most people think must be the case when they consider hidden-variable models. So my concern about the well-definedness of counterfactuals leaves them puzzled. What sort of deterministic model could deny the well-definedness of such counterfactuals?
I don’t want to consider a traditional hidden-variable model. Instead I want to consider a non-classical deterministic hidden-variable model. I call this model non-classical because fractal attractors were discovered long after quantum mechanics. So, let’s imagine that the universe is itself a deterministic dynamical system evolving on some fractal chaotic attractor (for the present purposes, a complicated measure-zero dynamically invariant set) – I don’t think there are any cosmological observations that would flat-out contradict such an assumption. If the universe were evolving on such a measure-zero set, then a counterfactual measurement would be inconsistent with the putative laws of physics if the associated counterfactual state of the universe lay in a fractal gap, off the invariant set. What I was able to show with a plausible model consistent with this type of dynamics (see [8, 13, 14]; I am improving the rigour of the argument considerably in a new paper) is that at least one of the three counterfactual measurement scenarios (Monday’s particles – Tuesday, Wednesday or Thursday’s measurement settings) lies off the invariant set and hence is inconsistent with our putative laws of physics.
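To make the fractal-gap idea concrete, here is a toy sketch using the middle-thirds Cantor set as a stand-in for the invariant set (my simplification, chosen only because membership is easy to test; the actual model involves fractal attractors in the state space of the universe): a state on the set has generic small perturbations - the counterfactuals - lying off it, in a gap.

```python
def on_cantor_set(x, depth=40):
    """Approximate membership test for the middle-thirds Cantor set:
    x is (numerically) on the set iff its first `depth` ternary digits
    avoid the digit 1."""
    for _ in range(depth):
        x *= 3
        digit = int(x)
        if digit == 1:
            return False  # landed in a 'fractal gap'
        x -= digit
    return True

actual = 0.75  # ternary 0.2020..., a point of the Cantor set
print(on_cantor_set(actual))  # True: the actual state is on the set

# A generic small counterfactual perturbation of the state falls in a
# gap, off the invariant set:
counterfactual = actual + 1e-6
print(on_cantor_set(counterfactual))  # False
```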
At this point, you may start to worry whether experimenters have the freedom to choose their measurement settings, according to this model. Experimenters hate to be told they are not free to choose how to do their experiments! As mentioned, they bring up all the old arguments that if they are not free, then the world has to be conspiratorial, which they (rightly, of course) find unacceptable. My response to this is: Whoa, hold your horses – who said you are not free to choose? In my model, the nominal measurement settings X, Y are completely under the control of the experimenters: for example, they may have engineered pseudo-random number generators with buttons which they can press to reveal either a 0 or a 1, or they can look at the Dow Jones index to determine what values of X and Y to use. I don’t care how the choices are made. However, in my model, it is super-vital to distinguish between nominal measurement settings and exact measurement settings. Suppose Alice and Bob press their buttons to set up a run with the X = 0, Y = 0 measurement settings. Because they are nominal settings, there is for all practical purposes an infinite set of possible exact settings consistent with each of the nominal settings. For any particular run of the experiment, the exact settings will be sensitive to the phase of the moons of Jupiter [2] or to any gravitational waves that might be passing.
In my model, whatever the nominal settings X and Y, the exact settings must satisfy what I call a rationality constraint: specifically, that the cosine of the angle between Alice’s exact setting and Bob’s exact setting must be a rational number. Importantly, even though they have control over the nominal settings, Alice and Bob have no control over this rationality constraint; it is an unavoidable consequence of my putative law of quantum physics. This condition is not a manifestation of nonlocality: since it is intrinsic to the putative geometric laws of physics, the rationality constraint cannot be violated by events which happen in space-time. That is to say, despite the rationality condition, the model satisfies the local causality condition in space-time, whereby Bob’s spin up/down measurement outcome never depends on Alice’s measurement setting X (and vice versa). Instead, the rationality condition describes a global geometric constraint in state space rather than space-time. This does have major implications for how we should formulate our laws of physics, which I discuss below.
In considering the counterfactual measurements where Alice and Bob selected settings other than they did (i.e. other than X = 0, Y = 0 in the case above), we keep the particles’ hidden variables fixed and vary X and Y. But what are these hidden variables? In my model, the hidden variables for any one of Alice’s particles correspond to one possible pair of exact settings (from an effectively infinite set), relative to the nominal settings X = 0 and X = 1. Similarly for Bob’s particles. To repeat, these hidden variables represent degrees of freedom (like the phase of the moons of Jupiter, or gravitational waves from a distant black-hole merger) over which the experimenters have no control whatsoever. Just as the phase of the moons of Jupiter does not force Alice and Bob to choose which button to press, neither do the hidden variables.
If this sounds like a strange idea, think about a newborn baby. Will this baby, in later life, climb Mount Everest or win the Nobel Prize? Some of the information needed to answer these questions will be encoded in the baby’s DNA – that’s a bit like a traditional hidden-variable model, where the hidden variables are somehow localised to the particle. But most of the information needed to determine the baby’s achievements won’t be so encoded; whether the baby ends up climbing Mount Everest or winning the Nobel Prize will depend on chance encounters in life. All one can say (assuming a relativistic deterministic world) is that the information needed to answer these questions about the baby lies in the intersection of a spacelike hypersurface going through the birth event with the interior of the past light cone of the baby’s death event. Similarly here: the support of a particle’s hidden variables at the time the particle is created lies in the intersection of a spacelike hypersurface going through the creation event with the interior of the past light cone of the particle’s measurement event. Again, all locally causal and hunky dory (and, by the way, not requiring any notion of retrocausality).
We now come to a non-trivial pivotal result which is a consequence of number theory, specifically Niven’s Theorem [11], which states that the cosine of an angle that is a rational number of degrees is almost always irrational. From this and the rationality constraint, we can prove the following: take four points on the sphere, which we will call 0A, 0B, 1A and 1B, and join the four pairs (0A,0B), (0A,1B), (1A,0B) and (1A,1B) by four great circles. Here 0A corresponds to the exact setting consistent with Alice’s nominal measurement setting X = 0 and a given fixed value of λ, etc. By Niven’s Theorem, it is impossible for the cosines of the four angular distances all to be rational. Just to be clear, it is easy to find three pairs of points (one actual and two counterfactual) which satisfy the rationality constraint. But not four pairs of points. From this we can deduce that at least one of the three counterfactual pairs of measurements that contribute to the CHSH inequality for a fixed λ is inconsistent with the rationality constraint, and hence with our putative deterministic locally causal laws of physics. Because of this, our deterministic locally causal theory need not satisfy the Bell inequality. Once again, I stress that in this model, the selection of the nominal measurement settings is under the full control of the experimenters. There simply are no weird ‘alien mind control’ conspiracies which prevent them from choosing as they like.
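Niven’s Theorem itself is easy to probe numerically. The sketch below (a heuristic check, not a proof; the tolerance and denominator bound are my arbitrary choices) scans integer-degree angles and flags those whose cosine is numerically indistinguishable from a small-denominator rational, recovering exactly the exceptional angles 0°, 60°, 90°, 120° and 180°.

```python
from fractions import Fraction
import math

def looks_rational(x, max_den=1000, tol=1e-12):
    """Heuristic numerical test: is x within tol of a rational with
    denominator at most max_den? (A stand-in for an exact symbolic
    check; max_den and tol are arbitrary choices.)"""
    approx = Fraction(x).limit_denominator(max_den)
    return abs(x - float(approx)) < tol

# Niven's theorem: if an angle is a rational number of degrees, its
# cosine is irrational except when it equals 0, +/-1/2 or +/-1.
rational_cosines = [d for d in range(181)
                    if looks_rational(math.cos(math.radians(d)))]
print(rational_cosines)  # [0, 60, 90, 120, 180]
```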
Now you may ask, yes this is all very well, but does your model satisfy the Tsirelson bound? I will answer yes it does, and say that the rationality constraint arises when you discretise complex Hilbert Space in a certain way consistent with my invariant set model. That is to say, in my model, the complex Hilbert Space of quantum theory arises as an approximation. Experimentally it is a good approximation – indeed, as good as you like, by making the discretisation scale sufficiently small. However, theoretically, continuum Hilbert Space is a singular limit [3] – and not a smooth limit – as the discretisation scale goes to zero. I won’t dwell on this here, as it is something I am trying to write up in the coming weeks. But the bottom line is: yes, my model satisfies the experimentally determined Tsirelson bound.
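For reference, the quantum singlet correlation E(a, b) = −cos(a − b) saturates the Tsirelson bound at the standard optimal settings, as this few-line check (textbook CHSH, nothing specific to my model) confirms:

```python
import math

# Quantum singlet correlation E(a, b) = -cos(a - b), evaluated at the
# standard CHSH-optimal measurement angles.
def E(a, b):
    return -math.cos(a - b)

a0, a1 = 0.0, math.pi / 2              # Alice's two settings
b0, b1 = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

S = E(a0, b0) - E(a0, b1) + E(a1, b0) + E(a1, b1)
print(abs(S))  # 2*sqrt(2) ~ 2.828, the Tsirelson bound
```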
So what’s the big message behind this interpretation of the violation of Bell inequalities? It is not that the world is indeterministic or nonlocal (i.e. not locally causal). And we don’t need wormholes to somehow short-circuit long-distance correlations. The implications are even more astonishing than ‘mere’ indeterminism, nonlocality, retrocausality or wormholes, since they have major implications for how we should be looking for a theory of quantum gravity. The implication is that what I would call ‘spatial reductionism’ – the idea that to get a more fundamental perspective on the laws of physics, we must look at processes on smaller and smaller scales – may actually be wrong, even though this principle has stood us in good stead over past centuries of scientific research. For example, there is a belief that once we can probe the Planck scale experimentally, we’ll finally be able to understand quantum gravity. I don’t believe this. It may instead be that the fundamental laws of quantum gravity are as much based on equations for the state-space geometry of the universe as a whole as on physics at the Planck scale. In some sense, the Planck scale may simply be the Yang to the Yin of the universe as a whole. Put another way, the laws of physics may ultimately turn out to be as much top-down as bottom-up [14]. Julian Barbour [1] and George Ellis [7] think similarly about this (even though they may not endorse the details of my invariant set model) – what is it about us old timers?! (Gosh, how I would love to know how these ideas fit into your Constructor framework. From what I have read and heard, I think there must be some links.)
Looking back at the rather disparate topics that I have worked on over my scientific career, I think the one thing that they have in common is nonlinearity. Whether working on gravitational energy-momentum in GR, discovering the world’s largest (Rossby) breaking waves in the stratosphere, developing ensemble-based estimates of weather and climate predictability, or proposing this particular interpretation of Bell’s Theorem, nonlinearity has been central to my research career. I do believe strongly that nonlinearity makes things conceptually simple (even though it may make things more computationally difficult).
However, regarding Bell’s Theorem, you and I may disagree. You are a proponent of the Everettian interpretation [5], which takes the linear Schrödinger equation quite literally. There are some things about Everett which appeal to me. One area where I agree with you 100% is that the physical resource that quantum computers tap into to get their exponential advantage over classical computers is the physical reality of parallel worlds. However, you may argue that since the Schrödinger equation is linear and is well verified by experiment, how can I claim that the quantum world is nonlinear? I would respond that in classical chaos theory, the Liouville equation for the evolution of probability is precisely linear in probability density (as it must be if probability is conserved), even though the probabilities are themselves generated by ensembles of states which individually evolve under nonlinear deterministic dynamics. The close formal similarity between the classical Liouville equation and the Schrödinger equation (for Hamiltonian systems at least) screams out to me that there must be a nonlinear deterministic dynamic underpinning the Schrödinger equation. For that reason, I think that the Everettian interpretation is ultimately wrong: the Schrödinger equation simply cannot be the last word on the subject and should not be taken literally. For example, in invariant set theory, there is no splitting or branching of worlds; they merely diverge exponentially on the invariant set as a result of what we might call decoherence.
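The point about the Liouville equation can be made in a few lines: discretise the density evolution of the logistic map into a transfer matrix (a standard construction; the bin count and map are my arbitrary choices) and the push-forward of densities is exactly linear, even though each individual trajectory obeys nonlinear deterministic dynamics.

```python
import numpy as np

# The logistic map x -> 4x(1-x): individual trajectories evolve under
# nonlinear deterministic dynamics...
def f(x):
    return 4 * x * (1 - x)

# ...yet the induced evolution of probability densities (a crude discrete
# analogue of the Liouville/transfer operator) is a linear map: a matrix
# P pushing bin probabilities forward one time step.
n = 200
edges = np.linspace(0, 1, n + 1)
centres = 0.5 * (edges[:-1] + edges[1:])
P = np.zeros((n, n))
for j, c in enumerate(centres):
    i = min(int(f(c) * n), n - 1)  # bin that f maps bin j into
    P[i, j] = 1.0

rng = np.random.default_rng(0)
rho1 = rng.random(n); rho1 /= rho1.sum()
rho2 = rng.random(n); rho2 /= rho2.sum()

# Linearity in the density, despite the nonlinear underlying dynamics:
lhs = P @ (0.3 * rho1 + 0.7 * rho2)
rhs = 0.3 * (P @ rho1) + 0.7 * (P @ rho2)
print(np.allclose(lhs, rhs))  # True
```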
After our PhDs (D.Phils) our paths diverged. Your pioneering work on quantum algorithms has created a new field of technology. My work in weather and climate physics, whilst not as groundbreaking, has led to a new way of making predictions which is having an influence around the world in the way humanitarian agencies respond to possible extreme weather events. By having ensemble-based quantitative estimates of forecast uncertainty, they can now decide when to take Anticipatory Action, sending food, medicine, shelter and finance ahead of a natural event hitting some region. This is so much better than the old days, when these agencies would simply wait for the weather event to hit (because single deterministic forecasts were too unreliable). I describe this, along with my interpretation of Bell’s Theorem, in my popular book The Primacy of Doubt [15], which I hope will be successful, but will never rival the runaway success of The Fabric of Reality [5].
But we may be converging again – I seem to be returning to my graduate-student roots as I get older and older. Didn’t Shakespeare have something to say about that in his ages-of-man speech? Since our days in the Sciama Hut, the fields of foundations of quantum physics, and arguably quantum gravity, have advanced only modestly. A few weeks ago I watched a YouTube video made in 1986 where Dennis Sciama and Ed Witten were discussing the then nascent string theory (Witten had just won the Dirac medal at ICTP). Dennis was sceptical that advances in basic physics could be made based purely on mathematical elegance – a theme developed by Sabine Hossenfelder [9] – and reminded Witten that the original basis for GR was not pseudo-Riemannian geometry but thinking about hypothetical elevators in deep space. Ed clearly felt differently and expressed the hope that quantum theory would somehow be emergent from the elegant mathematics of string theory. Well that hasn’t happened! I think Dennis was bang on the money. I’m glad I made the switch to climate science when I did, as I suspect, had I gone to Cambridge, I would have got embroiled in the nitty mathematical gritty of string theory’s predecessor supergravity, or something like that, and I would not have made any significant progress at all. After all, I didn’t know anything about fractal geometry in those days – it took a switch to an applied-science field to learn it. You have been successful by also decoupling yourself from mainstream academia. Good on you! Is there something to learn from this?
I’ll leave you with one of my favourite quotes from the always-inspirational Roger Penrose [16] - who, as you well know, became a relativist (and as a result won the Nobel Prize) by meeting Dennis in Cambridge:
My own view is that to understand [so-called] quantum non-locality we shall require a radical new
theory. This theory will not just be a slight modification of quantum mechanics but something as
different from standard quantum mechanics as General Relativity is different from Newtonian gravity.
Yep! I agree with that. You? Happy Birthday, David.
References
[1] J. Barbour. Quantum without Quantum. This Volume, 2023.
[2] J.S. Bell. Free variables and local causality. Dialectica, 39:103, 1985.
[3] M. V. Berry. Singular limits. Physics Today, 55:10–11, 2002.
[4] L. Blum, F. Cucker, M. Shub, and S. Smale. Complexity and Real Computation. Springer, 1997.
[5] D. Deutsch. The Fabric of Reality. Penguin Books, 1998.
[6] S. Dube. Undecidable problems in fractal geometry. Complex Systems, 7:423–444, 1993.
[7] G.F.R. Ellis. Top-down causation and quantum physics. Proceedings of the National Academy of Sciences, 115:11661–11663, 2018.
[8] J.R. Hance, S. Hossenfelder, and T.N. Palmer. Supermeasured: Violating Bell-statistical independence without violating physical statistical independence. Foundations of Physics, 52(4):81, 2022.
[9] S. Hossenfelder. Lost in Math. Basic Books, 2018.
[10] S. Katok. p-adic Analysis compared with Real. American Mathematical Society, 2007.
[11] I. Niven. Irrational Numbers. The Mathematical Association of America, 1956.
[12] T.N. Palmer. A local deterministic model of quantum spin measurement. Proc. Roy. Soc., A451:585–608, 1995.
[13] T.N. Palmer. Discretization of the Bloch sphere, fractal invariant sets and Bell’s theorem. Proc. Roy. Soc., https://doi.org/10.1098/rspa.2019.0350, arXiv:1804.01734, 2020.
[14] T.N. Palmer. Bell’s theorem, non-computability and conformal cyclic cosmology: A top-down approach to quantum gravity. AVS Quantum Sci., https://doi.org/10.1116/5.0060680, 2021.
[15] T.N. Palmer. The Primacy of Doubt. Oxford University Press, 2022.
[16] R. Penrose. The Large, the Small and the Human Mind. Cambridge University Press, 1997.