The latest news on superdeterminism


Postby gill1109 » Sun Aug 22, 2021 10:40 am

https://arxiv.org/abs/2108.07292
Supermeasured: Violating Statistical Independence without violating statistical independence

J.R. Hance, S. Hossenfelder, T.N. Palmer
Bell's theorem is often said to imply that quantum mechanics violates local causality, and that local causality cannot be restored with a hidden-variables theory. This however is only correct if the hidden-variables theory fulfils an assumption called Statistical Independence. Violations of Statistical Independence are commonly interpreted as correlations between the measurement settings and the hidden variables (which determine the measurement outcomes). Such correlations have been discarded as "finetuning" or a "conspiracy". We here point out that the common interpretation is at best physically ambiguous and at worst incorrect. The problem with the common interpretation is that Statistical Independence might be violated because of a non-trivial measure in state space, a possibility we propose to call "supermeasured". We use Invariant Set Theory as an example of a supermeasured theory that violates the Statistical Independence assumption in Bell's theorem without requiring correlations between hidden variables and measurement settings.

Re: The latest news on superdeterminism

Postby Joy Christian » Sun Aug 22, 2021 10:45 am

.
Are you trying to spam the forum? You already started a thread on this paper just a few days ago. Or are you becoming forgetful in your old age?
.

Re: The latest news on superdeterminism

Postby gill1109 » Sun Aug 22, 2021 11:07 am

Joy Christian wrote:.
Are you trying to spam the forum? You already started a thread on this paper just a few days ago. Or are you becoming forgetful in your old age?

I forget lots of things. It means I can keep re-watching old movies.

I wanted to discuss this paper again since I'm beginning to understand what these folk are doing. There was a lot of discussion about it on Facebook today and quite a few people got pretty upset. It's intriguing. I'm hoping someone here can explain "invariant set theory" for me. But I also think there is a fundamental misunderstanding of Bell's theorem in the paper. So I do have some new "news" on the subject. Maybe tomorrow.

Re: The latest news on superdeterminism

Postby Justo » Sun Aug 22, 2021 11:49 am

gill1109 wrote:I forget lots of things. It means I can keep re-watching old movies.

I wanted to discuss this paper again since I'm beginning to understand what these folk are doing. There was a lot of discussion about it on Facebook today and quite a few people got pretty upset. It's intriguing. I'm hoping someone here can explain "invariant set theory" for me. But I also think there is a fundamental misunderstanding of Bell's theorem in the paper. So I do have some new "news" on the subject. Maybe tomorrow.

Richard, I know you have made important contributions to the field. That is really surprising given that you do not understand the very easy physical principles underlying the inequality. I guess that shows the power of mathematics.
I am sure you can still contribute to the field if you just recognize you were mistaken about those physical principles instead of stubbornly insisting on nonsense. I guess that is the c***pot side of you that is betraying you.
Last edited by FrediFizzx on Sun Aug 22, 2021 11:53 am, edited 1 time in total.
Reason: We don't use that term on this forum

Re: The latest news on superdeterminism

Postby gill1109 » Sun Aug 22, 2021 8:37 pm

Justo wrote:Richard, I know you have made important contributions to the field. That is really surprising given that you do not understand the very easy physical principles underlying the inequality. I guess that shows the power of mathematics.
I am sure you can still contribute to the field if you just recognize you were mistaken about those physical principles instead of stubbornly insisting on nonsense. I guess that is the c***pot side of you that is betraying you.

Justo, of course I understand the three physical principles behind CHSH. These authors *change* one of the three principles by altering the definition of statistical independence, and hence CHSH no longer has to be true. They seem to think that their changed principle is just as good as the original. However, I think it has no physical justification whatsoever. It is an artificial mathematical variation on the usual conditions which makes the theorem untrue, but so what?

I think they are confused about statistical independence. They have forgotten the distinction between a probability measure and a probability density. A factorising probability density implies statistical independence only if the dominating measure is a product measure. But they use dominating measures which depend on *both* settings.
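To spell out the distinction (the setting-dependent dominating measure $\mu_{a,b}$ below is my notation, not theirs):

$$P(\lambda \in \Lambda \mid a, b) = \int_\Lambda \rho(\lambda \mid a, b)\, d\mu_{a,b}(\lambda).$$

Statistical Independence is a statement about the left-hand side. A setting-independent density $\rho(\lambda \mid a, b) = \rho(\lambda)$ guarantees it only when $\mu_{a,b} = \mu$ is one fixed measure; if $\mu_{a,b}$ itself depends on the settings, the factorisation of $\rho$ tells you nothing about independence.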

So the first part of the paper is just muddled. The second part, the mathematically advanced part, is very difficult and I would like to understand it. However, I do not think it is relevant to Bell's theorem because, by using their trick to have non-local dominating measures, their final model is not local. They have a joint probability distribution of four variables: two settings and two outcomes. It is constructed using some very advanced mathematics, which I want to understand. It has no relevance to Bell experiments when inputs and outputs are subject to the standard spatio-temporal restrictions.

Re: The latest news on superdeterminism

Postby gill1109 » Mon Aug 23, 2021 1:42 am

Their text is fascinating: "Measure theory is not usually discussed in physics textbooks. However, a variety of measures make their appearance in physics nevertheless. The most widely used one is the Lebesgue measure on R^n and (pseudo-)Riemannian manifolds. On fractals it can be generalised to the Hausdorff measure. In the context of Hamiltonian dynamical systems, a non-trivial measure on state-space arises in the theory of symplectic manifolds (leading, for example, to the Gromov non-squeezing theorem). In Section IV A, we discuss non-trivial invariant measures associated with chaotic attractors. The measure of "script S_math" appears in the calculation of any expectation value and therefore should enter the derivation of Bell’s theorem together with the probability-distribution ρ. Since these two functions always appear together, it is tempting to simply combine them into one ρ_Bell(λ, X) := ρ(λ, X)µ(λ, X), where we use the index “Bell” to emphasise that this is the quantity that really enters Bell’s theorem."

Yes! It is indeed the quantity that really enters Bell's theorem. It is not only tempting, it is necessary! Bell even said himself that they should really be combined; he was just using the common notation of physicists who don't know measure theory. Bell explicitly allows any fancy state-space too: "lambda" doesn't have to be localized anywhere and can be as weird and abstract as you like. Bell said all these things 40 years ago, but it seems that many contrary minds read only his first one or two papers and then stop, because they are already sure he is wrong, because they did not understand what they read, because they are trapped in old-fashioned ways of thinking, using inadequate notations, unaware of the progress of mathematics in the last hundred years. Kolmogorov made probability a serious part of mainstream hard-core mathematics in 1933. He needed Western cash to renovate his dacha, so he published his little book in German, thereby answering one of Hilbert's problems. He could do this thanks to the Radon-Nikodym theorem, which allowed him to put conditional probability firmly into mathematics too. Borel and Lebesgue and others had already done a great deal thirty years before that, but the Radon-Nikodym theorem made Kolmogorov's breakthrough possible.

Notice that their terminology is wrong: Bell's ρ is not a probability distribution, it is a probability density.
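Written out, with $X = (a, b)$ for the pair of settings as in their paper, the correlation that enters Bell's argument is (my notation, treating their $\mu$ as a density):

$$E(AB \mid a, b) = \int A(a, \lambda)\, B(b, \lambda)\, \rho(\lambda, X)\, \mu(\lambda, X)\, d\lambda = \int A(a, \lambda)\, B(b, \lambda)\, \rho_{\mathrm{Bell}}(\lambda, X)\, d\lambda.$$

Statistical Independence in Bell's sense is the requirement that the whole density appearing here, $\rho_{\mathrm{Bell}}(\lambda, X)$, does not depend on $X$, not merely that $\rho$ doesn't.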

Re: The latest news on superdeterminism

Postby Justo » Mon Aug 23, 2021 5:48 am

gill1109 wrote:Yes! It is indeed the quantity that really enters Bell's theorem. It is not only tempting, it is necessary! [...] Notice that their terminology is wrong: Bell's ρ is not a probability distribution, it is a probability density.


Richard, when I said that you do not understand the physics involved in the Bell theorem, I meant that someone who derives it from CFD (as, unfortunately, most physicists do, except perhaps experts in foundations) cannot understand it.
I'll give you one clue: using CFD you can derive the inequality without hidden variables. Hidden variables are the main physical idea involved in all this; what does it mean to have a Bell inequality without hidden variables?
There is no way out of the irrelevance of the CFD assumption because, even assuming it makes sense, you can completely ignore it and derive the inequality from physically meaningful assumptions, i.e., Local Causality (LC) and Statistical Independence (SI). The issue is so simple that even someone like me, who does not know probability theory, can do it. All you need to know is the intuitive meaning of probability as a relative frequency.
Since Bell left all these trivialities implicit because he concentrated on the important points, I give a detailed explanation of how LC and SI naturally describe the Bell experiment without introducing metaphysics.
All you need is to count events, record them, and evaluate relative frequencies. That explains why the results of four different sets of experiments can be reduced under the same sum with equal hidden variables.
If you say I am wrong then I guess you would agree with @minkwe. Basically, he says that I am wrong about the following: suppose we have a great number of cards with 16 different values 1, 2, ..., 16 (for the sake of simplicity, say there are equally many cards of each value). My claim is that when you extract one card at a time (with replacement) more than 16 times, the values will necessarily start to repeat, and after a great number of trials the relative frequency of each extracted value should be approximately 1/16.
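Since it is that trivial, here is the check in a dozen lines of Python (the trial count is arbitrary; nothing here is specific to Bell):

Code:
import random
from collections import Counter

N = 160_000
values = list(range(1, 17))           # 16 card values, equally many cards of each
draws = random.choices(values, k=N)   # extraction with replacement
freq = Counter(draws)

for v in sorted(freq):
    print(v, round(freq[v] / N, 4))   # each relative frequency comes out near 1/16 = 0.0625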

That is all the Bell theorem is about. A freaking triviality!
Some time ago I read a paper in the prestigious journal PRA uttering so many absurdities about the Bell theorem that I could not help writing a comment. To my surprise, the comment was accepted a while ago, yet it still has not appeared as published. I won't be surprised if they never publish it.

Re: The latest news on superdeterminism

Postby gill1109 » Mon Aug 23, 2021 7:56 am

Justo: I agree the Bell theorem is trivial maths. My favourite proof of it is my own proof in this paper: https://arxiv.org/abs/quant-ph/0204169
Comment on "Exclusion of time in the theorem of Bell" by K. Hess and W. Philipp
R.D. Gill, G. Weihs, A. Zeilinger, M. Zukowski

A recent Letter by Hess and Philipp claims that Bell's theorem neglects the possibility of time-like dependence in local hidden variables, hence is not conclusive. Moreover the authors claim that they have constructed, in an earlier paper, a local realistic model of the EPR correlations. However, they themselves have neglected the experimenter's freedom to choose settings, while on the other hand, Bell's theorem can be formulated to cope with time-like dependence. This in itself proves that their toy model cannot satisfy local realism, but we also indicate where their proof of its local realistic nature fails.

Journal reference: Europhys. Lett. (2003) 61, 282-283
DOI: 10.1209/epl/i2003-00230-6

In that paper I make an explicit assumption which I call *realism*. In other papers, following Tsirelson and others, I call it "counterfactual definiteness". The modern statistical study of causality uses the same concept to great effect. Have you studied Judea Pearl's book yet?

I think that locality plus CFD (my notion) is mathematically equivalent to LHV (local hidden variables).
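To make the triviality concrete, here is a sketch (mine, not the proof in the paper): locality plus CFD means every run has four simultaneously well-defined outcomes A1, A2, B1, B2 = ±1, i.e. one row of a 4xN spreadsheet, and then the CHSH average cannot exceed 2 however the table is filled.

Code:
import random

N = 100_000
total = 0
for _ in range(N):
    # one row of the 4xN table: all four counterfactual outcomes exist at once
    A1, A2, B1, B2 = (random.choice((-1, 1)) for _ in range(4))
    # A1*(B1 + B2) + A2*(B1 - B2) is +2 or -2 for every row, since one bracket is 0
    total += A1 * B1 + A1 * B2 + A2 * B1 - A2 * B2

print(abs(total) / N)   # at most 2, for this or any other way of filling the table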

It seems to me we merely disagree concerning use of some words. CFD means something different to you than it means to me and to the authors whom I know and respect.

Perhaps you have a different notion of “random variable” to me. For me, it is a measurable function from a probability space (Omega, F, P) to the real line endowed with the Borel sigma algebra. It’s a mathematical object in a mathematical model.

I think physicists confuse mathematical models of reality with reality itself, since they already use mathematics to describe reality. Statisticians know that "all models are wrong, some are useful". Poor physicists… Then there are the poor philosophers, who are mostly discussing words.

Re: The latest news on superdeterminism

Postby Justo » Mon Aug 23, 2021 8:39 am

gill1109 wrote:Perhaps you have a different notion of “random variable” to me. For me, it is a measurable function from a probability space (Omega, F, P) to the real line endowed with the Borel sigma algebra. It’s a mathematical object in a mathematical model.

I think physicists confuse mathematical models of reality with reality itself, since they already use mathematics to describe reality. Statisticians know that "all models are wrong, some are useful". Poor physicists… Then there are the poor philosophers, who are mostly discussing words.

It's not enough to try to ridicule physicists and philosophers. I'm not a physicist, nor a philosopher, nor a mathematician. The man on the street with a basic education can understand the Bell theorem.
So you have missed my point. You do not need mathematically rigorous concepts, like a formal definition of a random variable or of probability, to understand the Bell theorem. In fact, the probability concept proved useful long before its very late formalization in the 1930s.

Re: The latest news on superdeterminism

Postby Austin Fearnley » Mon Aug 23, 2021 10:44 am

Richard has mentioned a paper about time with respect to LHVs. I had been thinking about this some months back and came to a conclusion. Not sure that I like the conclusion, though. In a recent paper I showed that one can model Malus's law using hidden variables which are more than the simple polarisation vectors. Consider Feynman's analogy of particles having individual clocks representing phase angles. The total time on the clock is less important than the phase: 2 a.m. and 2 p.m. are equivalent on a 12-hour analogue clock dial. In other words, it's a modular system. This may be similar to the need for modular p-adic rationals in Palmer's superdeterminism, but who knows?

Anyway, to continue: Feynman showed that the phases were important in determining where light waves cancelled and reinforced, and IMO that means the phases would be important in determining Alice's measurement outcomes. Phase is clearly time-related, given the clock analogy. My earliest LHVs were simple vectors corresponding to polarisation vectors. I could not model Malus's law using polarisation vectors alone as LHVs; that corresponds to Alice's measurements being a function of a and p, where p is a particle's fixed polarisation vector. In this scenario one has neither CFD nor determinism. But that is because polarisation vectors on their own are insufficient to act as LHVs.

Susskind said in online lectures that nature is deterministic. So clearly we need to aim for LHVs which allow full determinism: we need Bob's measurement outcomes to be a function of (say) b, p and t, where t is time. This function worked to model Malus's law successfully. What I do not like is that this appears, according to Sabine Hossenfelder, to be a determinism which denies the existence of free will! Bring on the block universe! (Time in my function is periodic/modular, like the phase clock.)
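My actual (b, p, t) function is in the paper; the following is only a generic sketch of the simpler point that one extra hidden variable beyond the polarisation vector is enough to recover Malus's law (the uniform threshold u below is an illustrative stand-in, not my modular t):

Code:
import math
import random

def outcome(a, p, u):
    # deterministic in (p, u): the photon passes iff u falls below cos^2(a - p)
    return u < math.cos(a - p) ** 2

a, p = 0.0, math.radians(30)
N = 200_000
rate = sum(outcome(a, p, random.random()) for _ in range(N)) / N
print(rate, math.cos(a - p) ** 2)   # both close to 0.75, as Malus's law requires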

My simulations normally use only a=0 and b=45 degrees. Very simple. In my simple scenario, CFD does not really apply but in my context I can use ‘deterministic’ instead of ‘CFD’.

Next, on to Justo's point. I cannot make a fair forwards-in-time simulation using the above LHV functions of (a, p, t) and (b, p, t). But I can make a very unfair forwards-in-time simulation which gives a Bell simulated correlation of 0.707: let half of the p polarisation vectors equal +a and half equal -a. This Bell simulation then defaults to a Malus-type simulation, which also gives the Bell correlation of 0.707. But it is a complete cheat, equivalent to data pruning. The p values ought to be distributed in the simulation to match what we think happens in nature.

I did eventually obtain the 0.707 Bell correlation using a fair distribution of p values, plus modular t, plus retrocausal effects on the positrons.

Re: The latest news on superdeterminism

Postby Justo » Mon Aug 23, 2021 4:17 pm

@Austin. You seem to be claiming to have a counterexample to the Bell theorem. Are you aware of a recently published paper by Eugen Muchowski claiming the same?
With respect to time dependence, I used to believe that the inclusion of time could be dismissed on the assumption of time homogeneity, and that explicit inclusion of time dependence would violate the BI. Later I realized that is not the case: it is almost impossible to spoil the inequality without violating one of its hypotheses.

Re: The latest news on superdeterminism

Postby minkwe » Mon Aug 23, 2021 10:10 pm

Justo wrote:There is no way out of the irrelevance of the CFD assumption because, even assuming it makes sense, you can completely ignore it and derive the inequality from physically meaningful assumptions, i.e., Local Causality (LC) and Statistical Independence (SI). The issue is so simple that even someone like me, who does not know probability theory, can do it. All you need to know is the intuitive meaning of probability as a relative frequency.

Actually, you can't. I just showed you that all derivations go through a 4xN spreadsheet. You either arrive at the spreadsheet through CFD, as Gill does, or arrive at it through an assumption that the data from the experiment can be reordered and reduced. Bell's theorem is caught between a rock and a hard place. No escape.

Justo wrote:Since Bell left all these trivialities implicit because he concentrated on the important points, I give a detailed explanation of how LC and SI naturally describe the Bell experiment without introducing metaphysics.

I think you are reading much more into Bell's thoughts and feelings than is in his papers. It's like you want him to be right.

Justo wrote:All you need is to count events, record them, and evaluate relative frequencies. That explains why the results of four different sets of experiments can be reduced under the same sum with equal hidden variables.

They can't be reduced. I've explained to you why they can't, and you understand it. Until you have a good argument against my explanation, it's not on the up-and-up to keep repeating the claim. It's a false claim. Saying we agree to disagree, when you haven't explained anything to even be disagreed with, is not on the up-and-up either.

Justo wrote:If you say I am wrong then I guess you would agree with @minkwe. Basically, he says that I am wrong about the following: suppose we have a great number of cards with 16 different values 1, 2, ..., 16 (for the sake of simplicity, say there are equally many cards of each value). My claim is that when you extract one card at a time (with replacement) more than 16 times, the values will necessarily start to repeat, and after a great number of trials the relative frequency of each extracted value should be approximately 1/16.

Are you serious? This is absolutely not what I'm saying. You are absolutely correct about the relative frequencies above. You need to go back and read the other thread. Pay particular attention to the repeated mention that the Fair Sampling assumption is granted. I assume you understand what that means.

Here is what I'm saying, using a similar analogy. We have two boxes of cards, labelled "1" and "2". Each box has the same distribution of card values. There are two methods of picking cards, "a" and "b"; each method is biased in a different way. Obviously, using the same method to pick N cards from the same box will yield samples with practically identical distributions of values if N is large.

Now perform an experiment with 4 couples. Each person is assigned a box and a method to use in picking cards. The couples pick pairs of cards, with replacement each time. Each person always picks from the same assigned box using the same assigned method. Each couple records its values in a 2xN spreadsheet.
The assigned boxes and methods for the four couples are (a1,b1), (a1,b2), (a2,b1) and (a2,b2), where a1 means picking from box "1" using method "a". Obviously, the distributions of values in the a1 columns are almost identical.
That is the Fair Sampling assumption, absolutely not at issue. But the exact sequences of values in the a1 columns are different.
Now please go read my argument in the other thread. You should be able to complete the argument from this point.

What I'm saying is that the data from this experiment cannot be reordered and reduced from four independent 2xN spreadsheets into one 4xN spreadsheet. This is a simple exercise that anyone with Excel or OpenOffice can verify. The reordering is required in order for the derivation in your paper to proceed. It is an implicit assumption in the derivation, an assumption that turns out to be false.
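Here is that exercise as a sketch (the deck and the two biases are illustrative stand-ins; any biases lead to the same conclusion):

Code:
import random
from collections import Counter

random.seed(1)
N = 10_000
deck = list(range(1, 11))        # both boxes have the same composition of card values
boxes = {'1': deck, '2': deck}

def pick(method, box):
    # one draw with replacement; methods 'a' and 'b' are biased in different ways
    card = random.choice(box)
    if method == 'a':
        return card if card > 2 else random.choice(box)   # 'a' re-draws low cards once
    return card if card < 9 else random.choice(box)       # 'b' re-draws high cards once

# four couples, four independent 2xN spreadsheets: (a1,b1), (a1,b2), (a2,b1), (a2,b2)
pairs = [('a1', 'b1'), ('a1', 'b2'), ('a2', 'b1'), ('a2', 'b2')]
runs = {(x, y): [(pick(x[0], boxes[x[1]]), pick(y[0], boxes[y[1]])) for _ in range(N)]
        for x, y in pairs}

col1 = [row[0] for row in runs[('a1', 'b1')]]   # the a1 column of the first spreadsheet
col2 = [row[0] for row in runs[('a1', 'b2')]]   # the a1 column of the second spreadsheet
print(Counter(col1))                            # the two distributions are practically
print(Counter(col2))                            # identical (fair sampling), ...
print(col1[:10])                                # ... but the exact sequences differ, so the
print(col2[:10])                                # four 2xN sheets cannot be merged row by row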

Re: The latest news on superdeterminism

Postby Justo » Tue Aug 24, 2021 1:36 am

@minkwe, it would be nice if you wrote a comment on my paper explaining its mistakes, thus proving that CFD is the only way the inequality can be derived.
One author I cited has already emailed me saying that he will do so. I told him that I would welcome his comment. I don't know why people take criticism personally, at least when it is objective and does not include personal attacks.

Re: The latest news on superdeterminism

Postby Austin Fearnley » Tue Aug 24, 2021 3:26 am

Hi Justo

No, I do not claim to have a counterexample to Bell's theorem (though the use of 'counterexample' seems vague to me). If I had one, I would have tried to claim Richard's prize money (the forum thread for that prize is now on the second page of this site). I discussed my retrocausal method with Richard on that 'prize' thread. I never intended to make a claim and only used that thread because it was convenient for me at the time. I can get the 0.707 correlation using retrocausality, but my model bypasses the Bell inequalities and does not break them. Clearly the use of retrocausality is, quite understandably, not likely to persuade Richard to empty his pockets ... even of loose change. But IMO retrocausality is what is happening in real experiments.

Likewise, I do not believe that superdeterminism can break the Bell inequalities. It may be able to get the 0.707 correlation, though I have not seen evidence of that. But IMO it will need to bypass the Bell inequalities by using unfair sampling of particle LHVs. On the other hand, the universe should determine what a fair sampling is, and if the universe is not providing a random allocation of possible LHV values, then who can say that a simulation is not using a fair sample? AFAIK no computer simulations have been offered for superdeterminism.

The paper by Eugen (aka Esail) is discussed on a nearby thread and I have already made comments there although not on the recent pages. I also have the opinion that Esail uses non-local formulae.

Re: The latest news on superdeterminism

Postby gill1109 » Wed Aug 25, 2021 9:52 am

Austin Fearnley wrote:No, I do not claim to have a counterexample to Bell's theorem. [...] But IMO retrocausality is what is happening in real experiments.

Here’s an interesting paper on retrocausality:
https://arxiv.org/abs/1508.01140
Disentangling the Quantum World

Huw Price, Ken Wharton

Correlations related to quantum entanglement have convinced many physicists that there must be some at-a-distance connection between separated events, at the quantum level. In the late 1940s, however, O. Costa de Beauregard proposed that such correlations can be explained without action at a distance, so long as the influence takes a zigzag path, via the intersecting past lightcones of the events in question. Costa de Beauregard's proposal is related to what has come to be called the retrocausal loophole in Bell's Theorem, but -- like that loophole -- it receives little attention, and remains poorly understood. Here we propose a new way to explain and motivate the idea. We exploit some simple symmetries to show how Costa de Beauregard's zigzag needs to work, to explain the correlations at the core of Bell's Theorem. As a bonus, the explanation shows how entanglement might be a much simpler matter than the orthodox view assumes -- not a puzzling feature of quantum reality itself, but an entirely unpuzzling feature of our knowledge of reality, once zigzags are in play.

Journal reference: Entropy, v17, 7752-7767 (2015)
DOI: 10.3390/e17117752

Re: The latest news on superdeterminism

Postby Austin Fearnley » Thu Aug 26, 2021 6:06 am

I do not like the paper!

IMO zigzag motion of particles is not happening, but it does represent the apparent motion. In my preon model, positrons always travel backwards in time and electrons always travel forwards in time.

In this wiki diagram:
https://commons.wikimedia.org/wiki/File:Feynmann_Diagram_Gluon_Radiation.svg (Joel Holdsworth, public domain, via Wikimedia Commons)
the appearance of a bounce back in time is unfortunate, IMO. The wavy line is a photon which carries away some preons from both the electron and the positron, so the positron is not simply an electron bouncing back in time. Unfortunately, according to my preon model, the diagram is wrong. There is a similar problem with the emission of a photon from an electron, as weak isospin is not conserved in that emission; at least, it is not conserved in a Feynman diagram treated as a particle-path flowchart. QFT presumably gives a better treatment by taking the weak isospin into account as a field effect. I have been putting this off, but it needs me to write another paper in my preon paper series. It is very important for retrocausality, as I need to explain that 'antiphotons' can travel backwards in time as well as positrons. That removes the need, in the Price and Wharton paper, to worry about asymmetry between Alice and Bob.

As a taster, what really happens IMO in electron/positron pair creation is:
Photon + Higgs (field effect) -> electron + positron
where either the photon or the Higgs is going backwards in time.

And for photon emission:
Electron + Higgs -> electron + photon
where the photon and Higgs are either both forwards in time or both backwards.

The section on Alice in the mirror is awful IMO. It is far too complicated for what is a very simple matter. Retrocausality is almost trivially simple.

The attempt to get symmetry for Alice and Bob is unnecessary in my model, as there is asymmetry in nature: only the positron travels backwards. But Alice and Bob receive equal numbers of positrons, so they do find an overall balancing out. It is different when using photons, though: in a photon pair, one of them travels backwards in time.

No-signalling? I tried to find a signalling method with my model but failed to achieve it. Not absolutely sure that it is impossible.

Superdeterminism? There are commonalities between retrocausality and superdeterminism. Superdeterminism needs a transaction between the emission of a particle here and now and 'permission' to interact somewhere (somehow agreed) in the future. Retrocausality is much simpler, but it requires a similar transaction: a positron travels backwards in time from Alice to, in this case, the oven, where it interacts to cause the emission of its electron partner.

I am an amateur at physics, but there is a lot in Feynman about negotiations between the start (past) and end (future) points of a particle path.

Enough for now.

Re: The latest news on superdeterminism

Postby minkwe » Thu Aug 26, 2021 7:59 am

There is content there that has nothing to do with "retrocausality" or "superdeterminism". As usual, the authors get drawn into the mysticism and present perfectly reasonable ideas in ways that detract from the content. This paper could have been written without mentioning superdeterminism or retrocausality at all; it would surely have been harder to publish, but it would have been more useful.

