99 posts
• Page **2** of **5** • 1, **2**, 3, 4, 5

For completeness, I have added the CHSH inequality: taking a million random amplitudes, in ~0.5% of cases the value of 2 is exceeded, which is impossible classically.

Mathematica file: https://www.dropbox.com/s/a0n0kb8cqazgz2j/CHSH.nb

This is for MERW ( https://en.wikipedia.org/wiki/Maximal_E ... andom_Walk ): assuming a uniform (or Boltzmann) distribution among paths, which can be seen as QM in imaginary time.

It has the Born rule: Pr ~ psi^2, where one psi comes from the ensemble of past paths and the other from the ensemble of future paths (this is a time-symmetric model).

To violate Bell-like inequalities (derived without such a square), we need to design situations where the amplitudes of unmeasured variables are first added, and only then squared.

Here is such a construction for the P(A=B) + P(B=C) + P(A=C) >= 1 inequality, from page 9 of the updated https://arxiv.org/pdf/0910.2724

- Jarek
**Posts:**116**Joined:**Tue Dec 08, 2015 1:57 am

Jarek, maybe you would like to raise these issues at the Google-Group on Bell inequalities.

There is another thread on the present forum giving some more information about it.

- gill1109
- Mathematical Statistician
**Posts:**1479**Joined:**Tue Feb 04, 2014 10:39 pm**Location:**Leiden

Sure, but I have a feeling that the same people are there as here, so we may as well discuss it in this open forum.

To summarize, we have two philosophies:

1) Standard, intuitive: the probability of an alternative of disjoint events is the sum of their probabilities,

2) Born rule (QM, MERW): the probability of an alternative of disjoint events is proportional to the sum of the squares of their amplitudes.

Bell-like inequalities are derived using 1), hence it shouldn't be a surprise that models using 2) can violate them.

While we don't have a complete understanding of QM, we need simpler models with its crucial properties, especially the Born rule leading to Bell violation.

MERW is such a trivial model - just a uniform distribution among paths, that's all: Feynman path integrals after Wick rotation. It already gives a clear intuition of where the Born rule comes from - time symmetry (analogously to https://en.wikipedia.org/wiki/Two-state ... _formalism ) - and of Anderson localization (standard diffusion models only approximate the maximal entropy principle required by statistical physics models). I would gladly discuss it here or somewhere else.
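The difference between the two philosophies is just the order of summing and squaring. A one-line numerical illustration (the amplitude values here are arbitrary, chosen only to show the effect):

```python
a, b = 1.0, 2.0   # amplitudes of two disjoint events (arbitrary example values)

# 1) Both events measured: square each amplitude, then add the probabilities.
p_measured = a**2 + b**2        # 5.0

# 2) The distinguishing variable unmeasured: amplitudes add first, then square.
p_unmeasured = (a + b)**2       # 9.0

# The mismatch is the interference (cross) term 2*a*b, which Bell-type
# derivations implicitly assume away.
print(p_unmeasured - p_measured)  # 4.0 = 2*a*b
```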

- Jarek
**Posts:**116**Joined:**Tue Dec 08, 2015 1:57 am

PS. The entire derivation of the Born rule in MERW - just a uniform distribution of paths (on a graph with adjacency matrix M) - is on slide 7 here, or at https://en.wikipedia.org/wiki/Maximal_E ... andom_Walk :

In rho(x) ~ psi(x)^2, one psi comes from the ensemble of past half-paths, the second from the ensemble of future half-paths.

Like unitary evolution in QM or CPT symmetry, this is a time-symmetric model - which is very nonintuitive: we think in an asymmetric past->future causality setting, which leads us to Bell-like inequalities ... but these can be violated, as we live in CPT-symmetric physics instead.

- Jarek
**Posts:**116**Joined:**Tue Dec 08, 2015 1:57 am

But as we told you before, you are not really violating the Bell inequalities because it is mathematically impossible for anything to violate them. If you are exceeding the bounds of the inequalities, then you are using a different inequality with a higher bound.

- FrediFizzx
- Independent Physics Researcher
**Posts:**1559**Joined:**Tue Mar 19, 2013 7:12 pm**Location:**N. California, USA

FrediFizzx, please elaborate.

So let's take the simplest such inequality - drawing three coins, at least two give the same value:

P(A=B) + P(B=C) + P(A=C) >= 1

Absolutely obvious, and it looks impossible to get anything else ... but there is a QM setting that allows getting a sum of 0.75 instead ( https://arxiv.org/pdf/1212.5214 ).

For CHSH there are experimentally realized tests showing that physics does not satisfy a similar, properly derived inequality.

So where exactly is the problem? Are there errors in the derivations?

Why does physics not always satisfy such obvious-looking inequalities?
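The classical bound itself is pure counting - all eight assignments of three binary values can be enumerated directly (a minimal sketch):

```python
from itertools import product

# For every classical assignment of three coins A, B, C, count how many
# of the three pairs agree: at least one pair always must.
for a, b, c in product([0, 1], repeat=3):
    agreements = (a == b) + (b == c) + (a == c)
    assert agreements >= 1   # hence P(A=B) + P(B=C) + P(A=C) >= 1
                             # for ANY distribution over the 8 cases
```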

- Jarek
**Posts:**116**Joined:**Tue Dec 08, 2015 1:57 am

Jarek wrote:So let's take the simplest such inequality - drawing three coins, at least two give the same value:

P(A=B) + P(B=C) + P(A=C) >=1

That relationship is only valid if you draw three coins. Explain to me how you would apply it to an experiment in which you drew three separate pairs of coins. Would you say this new experiment "violates" the first relationship?

- minkwe
**Posts:**1019**Joined:**Sat Feb 08, 2014 10:22 am

Such inequalities should be understood as: perform the experiment independently many times and count frequencies.

Classically, whatever probability distribution among the 2^3 = 8 possibilities you choose, this inequality has to be satisfied.

We indeed measure only 2 out of 3, but classically it doesn't matter - the third coin has some value, we just ignore it.

To violate it in QM or MERW, it becomes essential that we measure only 2 out of 3 - the unmeasured coin literally has no value; otherwise the inequality has to be satisfied.

So one question is how to realize measuring exactly 2 out of 3 - such that the third one doesn't merely have an unknown value, and is not just unflipped, but literally has no value ... like a coin still spinning in the air.

Another question is under which conditions the inequality is violated - numerically, taking random amplitudes, such inequalities are usually satisfied, with the classical threshold exceeded in only <1% of cases.

What other models allowing one to exceed this threshold are known?
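The random-amplitude experiment mentioned above can be sketched as follows. This is my reconstruction of the procedure (draw random real amplitudes over the 2x2x2 outcomes, sum over the unmeasured coin, then square), not the author's Mathematica code, and the violation fraction depends on the sampling distribution:

```python
import numpy as np

rng = np.random.default_rng(0)

def agree_prob(psi, unmeasured_axis):
    """Born-rule probability that the two measured coins agree:
    sum amplitudes over the unmeasured coin, THEN square."""
    amp = psi.sum(axis=unmeasured_axis)   # 2x2 amplitudes for the measured pair
    p = amp**2 / (amp**2).sum()
    return p[0, 0] + p[1, 1]

def S(psi):
    # P(A=B) + P(B=C) + P(A=C), with C, A, B respectively unmeasured
    return agree_prob(psi, 2) + agree_prob(psi, 0) + agree_prob(psi, 1)

samples = [S(rng.normal(size=(2, 2, 2))) for _ in range(100_000)]
frac_violating = np.mean(np.array(samples) < 1.0)
print(f"fraction below the classical bound: {frac_violating:.4%}")
```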

- Jarek
**Posts:**116**Joined:**Tue Dec 08, 2015 1:57 am

Jarek wrote:To violate it in QM or MERW, it becomes essential that we measure only 2 out of 3 - the unmeasured coin literally has no value; otherwise the inequality has to be satisfied.

So one question is how to realize measuring exactly 2 out of 3 - such that the third one doesn't merely have an unknown value, and is not just unflipped, but literally has no value ... like a coin still spinning in the air.

This is the mystery of quantum mechanics.

I'm at Växjö where 50+ people have been discussing this question for a week.

I presented novel statistical analyses of the data of the famous Bell-inequality experiments of 2015 and 2016: Delft, NIST, Vienna and Munich. Every statistical analysis relies on statistical assumptions. I made the traditional, but questionable, i.i.d. assumptions. They justify a novel (?) analysis which is both simple and (close to) optimal.

It enables us to fairly compare the results of the two main types of experiments: the NIST and Vienna CH-Eberhard “one-channel” experiments, with target settings and state chosen to optimise the handling of the detection loophole (detector efficiency > 66.7%); and the Delft and Munich CHSH “two-channel” experiments based on entanglement swapping, with the target state and settings which achieve the Tsirelson bound (detector efficiency ≈ 100%).

One cannot say which type of experiment is better without agreeing on how to compromise between the desires to obtain high statistical significance and high physical significance. Moreover, robustness to deviations from traditional assumptions is also an issue.

I also discussed my current opinions on the question: what should we now believe about locality, realism, and the foundations of quantum mechanics? My provisional conclusion is "exquisite/angelic spukhafte Fernwirkung" ... but tempered with a quantum-Buddhist point of view: *nothing* is real, words are inadequate because each word is already a "model". All models are false, though some are useful...

https://www.slideshare.net/gill1109/yet-another-statistical-analysis-of-the-data-of-the-loophole-free-experiments-of-2015-revised

- gill1109
- Mathematical Statistician
**Posts:**1479**Joined:**Tue Feb 04, 2014 10:39 pm**Location:**Leiden

This is the mystery of quantum mechanics.

Indeed, that's the main question here.

Could you maybe comment on this simple realization (page 9 of https://arxiv.org/pdf/0910.2724 ) of a violation of the

Pr(A=B) + Pr(A=C) + Pr(B=C) >= 1

inequality from a uniform distribution among paths (MERW)?

So the considered space is the graph on the left, with all 8 values of ABC: in 000 and 111 we have to stay; from the remaining vertices we can jump to a neighbor.

The presented measurement at time 0 ignores C - we have 4 possible outcomes (red squares), determining AB exactly.

Assuming a uniform probability distribution among paths (from -infinity to +infinity in time), we get Pr(A=B) = (1^2 + 1^2) / (1^2 + 2^2 + 2^2 + 1^2) = 2/10.

Analogously for the remaining pairs, so finally we get

Pr(A=B) + Pr(A=C) + Pr(B=C) = 6/10

It is able to violate the inequality thanks to:

- maintaining the unmeasured third value,

- using an ensemble of complete paths, which leads to the Born rule: the probability of an alternative of disjoint events is proportional to the sum of the squares of their amplitudes.

Considering a path ensemble toward only one time direction (past or future), we would get the first power instead of the square.

What other models violating such an inequality are considered?
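The arithmetic above can be checked directly. The amplitude vector below is my reconstruction consistent with the stated outcome amplitudes (1, 2, 2, 1): amplitude 0 on the trapping vertices 000 and 111, and 1 on the six remaining vertices.

```python
import numpy as np

# psi[a, b, c]: reconstructed amplitude 0 at 000 and 111, 1 elsewhere
psi = np.ones((2, 2, 2))
psi[0, 0, 0] = psi[1, 1, 1] = 0.0

def agree_prob(unmeasured_axis):
    # add amplitudes over the unmeasured coin, then square (Born rule)
    amp = psi.sum(axis=unmeasured_axis)   # e.g. ignoring C: (1, 2, 2, 1)
    p = amp**2 / (amp**2).sum()           # squares: (1, 4, 4, 1) / 10
    return p[0, 0] + p[1, 1]              # = 2/10 for each measured pair

total = agree_prob(2) + agree_prob(0) + agree_prob(1)
print(total)   # ≈ 0.6 < 1: the classical bound is not satisfied
```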

- Jarek
**Posts:**116**Joined:**Tue Dec 08, 2015 1:57 am

Jarek wrote:Could you maybe comment on this simple realization (page 9 of https://arxiv.org/pdf/0910.2724 ) of a violation of the

Pr(A=B) + Pr(A=C) + Pr(B=C) >= 1

inequality from a uniform distribution among paths (MERW)?

So the considered space is the graph on the left, with all 8 values of ABC: in 000 and 111 we have to stay; from the remaining vertices we can jump to a neighbor.

The presented measurement at time 0 ignores C - we have 4 possible outcomes (red squares), determining AB exactly.

Assuming a uniform probability distribution among paths (from -infinity to +infinity in time), we get Pr(A=B) = (1^2 + 1^2) / (1^2 + 2^2 + 2^2 + 1^2) = 2/10.

Analogously for the remaining pairs, so finally we get

Pr(A=B) + Pr(A=C) + Pr(B=C) = 6/10

It is able to violate the inequality thanks to:

- maintaining the unmeasured third value,

- using an ensemble of complete paths, which leads to the Born rule: the probability of an alternative of disjoint events is proportional to the sum of the squares of their amplitudes.

Considering a path ensemble toward only one time direction (past or future), we would get the first power instead of the square.

What other models violating such an inequality are considered?

I don't think that this is a *simple* realisation of a violation of the inequality. But it is very nice that you can "emulate" some quantum mechanical predictions with classical stochastic processes.

If, in Schrödinger's equation, you replace the time variable "t" by "sqrt -1 times t", you find an analytical formula for Brownian motion (if I remember this correctly!). Indeed, analytical results about QM can sometimes be obtained from corresponding analytical results about BM by doing analytic continuation. I also know that there is a quantum probability approach which connects Gaussian, Poisson and Bernoulli processes ... by wave-particle duality, you can extract both Brownian motion and the continuous time Poisson process from one quantum stochastic process.
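The substitution can be written out explicitly - a short check, in my notation (free-particle Schrödinger equation, Wick-rotated time τ):

```latex
% Schrödinger equation (free particle):
i\hbar\,\partial_t \psi = -\frac{\hbar^2}{2m}\,\partial_x^2 \psi
% Wick rotation t = -i\tau, so \partial_t = i\,\partial_\tau:
i\hbar\,(i\,\partial_\tau \psi) = -\frac{\hbar^2}{2m}\,\partial_x^2 \psi
\quad\Longrightarrow\quad
\partial_\tau \psi = \frac{\hbar}{2m}\,\partial_x^2 \psi
```

The result is the heat/diffusion equation governing Brownian motion, which is the analytic-continuation connection described above.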

I'm a mathematician, not a physicist, so I can't comment on the "physics" aspects of your work. On the other hand, I find the mathematical side very hard to "decode".

I notice that your arXiv paper has gone through three revisions and is apparently not yet published in a regular journal. Have other people studied "maximal entropy random walk"? In particular, is it studied in probability theory? Can you explain, to a mathematician who knows no physics, how your mapping between quantum mechanics and random walk theory is constructed and why it works? Is it "just" the trick which I just described?

You might be interested in also joining the Google-group on Bell inequalities and foundations of quantum mechanics which was the subject of another thread on this forum.

- gill1109
- Mathematical Statistician
**Posts:**1479**Joined:**Tue Feb 04, 2014 10:39 pm**Location:**Leiden

MERW ( https://en.wikipedia.org/wiki/Maximal_E ... andom_Walk ) chooses the random walk maximizing entropy production for a given graph - or equivalently: it assumes a uniform(/Boltzmann) probability distribution among paths. There are now >100 citations of our 2009 introductory PRL "Localization of MERW" paper.

QM Feynman path integrals are similar after Wick rotation - this is called e.g. Euclidean quantum mechanics and is widely used in numerical calculations to find the ground state. However, besides focusing on the continuous case, its original propagator does not preserve the normalization of probability distributions (to 1) - which is repaired in MERW.

Anyway, such ensembles nicely show where the squares in the Born rule rho ~ psi^2 come from - as in https://en.wikipedia.org/wiki/Two-state ... _formalism :

- one psi comes from the past: the ensemble of past paths, the propagator from -infinity to now,

- the second psi comes from the future: the ensemble of future paths, the propagator from +infinity to now.
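For a concrete definition, here is a minimal numerical sketch using the standard MERW formulas from the Wikipedia article: transition probabilities S_ij = (M_ij/λ)(ψ_j/ψ_i) built from the dominant eigenpair (λ, ψ) of the adjacency matrix, with stationary density ρ_i ∝ ψ_i²:

```python
import numpy as np

# Adjacency matrix of a small example graph: the 3-vertex path 0-1-2.
M = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])

# Dominant eigenvalue and (Perron-Frobenius) eigenvector of M.
eigvals, eigvecs = np.linalg.eigh(M)
lam = eigvals[-1]
psi = np.abs(eigvecs[:, -1])

# MERW transition matrix: S_ij = (M_ij / lam) * psi_j / psi_i.
S = (M / lam) * psi[None, :] / psi[:, None]

# The stationary distribution obeys the Born-like rule rho ~ psi^2.
rho = psi**2 / (psi**2).sum()

assert np.allclose(S.sum(axis=1), 1.0)   # S is stochastic
assert np.allclose(rho @ S, rho)         # rho is indeed stationary
```

For this graph, lam = sqrt(2) and rho = (0.25, 0.5, 0.25).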

MERW here is just a simplified model showing that time symmetry can lead to a nonstandard type of probability:

1) Standard, intuitive: the probability of an alternative of disjoint events is the sum of their probabilities,

2) Born rule (QM, MERW): the probability of an alternative of disjoint events is proportional to the sum of the squares of their amplitudes.

Bell-like inequalities are derived assuming 1), hence models using 2) instead can exceed their threshold.

The diagram above is such a simple example (I also have one for CHSH): we need eight vertices for ABC, the graph is chosen to produce amplitudes that allow violation, and the measurement is exactly what we want: of 2 out of 3 values.

Assuming a uniform distribution of paths toward only one time direction (past or future), we would get the first power as in 1), and the inequality would be satisfied.

But thanks to using an ensemble of full trajectories, we get the squares of 2), which allow exceeding the classical threshold.

Here is the stationary distribution in the [0,1] infinite well from the perspective of 3 philosophies of stochastic models:

I see MERW mainly as an educational model here - for understanding the source of the Born rule, and earlier of Anderson localization, which standard diffusion models fail to obtain because they only approximate the maximal entropy principle required by statistical physics models.

I haven't found the self-confidence to even try to publish it, but I am open to collaboration if somebody sees it as publishable.

Sure, I can discuss it on the Google group if I get an invitation (dudajar@gmail.com), but it will be similar to here.

- Jarek
**Posts:**116**Joined:**Tue Dec 08, 2015 1:57 am

Jarek wrote:Sure, I can discuss it on the Google group if I get an invitation (dudajar@gmail.com), but it will be similar to here.

New thread has been started there.

- gill1109
- Mathematical Statistician
**Posts:**1479**Joined:**Tue Feb 04, 2014 10:39 pm**Location:**Leiden

gill1109 wrote:Jarek wrote:Sure, I can discuss it on the Google group if I get an invitation (dudajar@gmail.com), but it will be similar to here.

New thread has been started there.

Yes, I saw. It is kind of hard to follow. I hear the word entropy and for some unknown reason, I shut down.

- FrediFizzx
- Independent Physics Researcher
**Posts:**1559**Joined:**Tue Mar 19, 2013 7:12 pm**Location:**N. California, USA

Please write if I can explain something.

The simplest way to see (Shannon's) entropy is through the asymptotic behavior of the binomial coefficient (number of combinations):

binomial(n, p*n) ~ 2^{n*h(p)}

for h(p) = -p lg(p) - (1-p) lg(1-p)

It is, for example, at the heart of data compressors - to store symbols from a {p, 1-p} probability distribution, you asymptotically need h(p) bits/symbol: https://en.wikipedia.org/wiki/Asymmetri ... opy_coding

MERW is the random walk chosen according to Jaynes' principle of maximum entropy ( https://en.wikipedia.org/wiki/Principle ... um_entropy ): the probability distribution which best represents the current state of knowledge is the one with the largest entropy.

For example, having n white or black balls, knowing nothing more and asking for the percentage p of white balls, the safest assumption is p = 1/2, as it maximizes h(p) - this choice asymptotically dominates the number of possibilities, since h(p) is the exponent in binomial(n, p*n) ~ 2^{n*h(p)}.
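A quick numerical check of this asymptotic relation (a sketch; the values of n and p are arbitrary):

```python
from math import comb, log2

n, p = 1000, 0.3
h = -p * log2(p) - (1 - p) * log2(1 - p)   # binary entropy, ~0.8813 bits

# bits per symbol needed to index one of the binomial(n, p*n) sequences
k = round(p * n)
bits_per_symbol = log2(comb(n, k)) / n

# the gap shrinks like log(n)/n, as in the asymptotic formula
print(h, bits_per_symbol)
```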

A uniform distribution maximizes entropy - it is the safest choice when not knowing anything more (otherwise the Boltzmann distribution is used, which also minimizes mean energy).

For MERW this means assuming a uniform/Boltzmann distribution among entire paths.

Standard diffusion models (GRW) assume a uniform distribution among single steps - maximizing entropy locally, which often turns out suboptimal globally, i.e. when looking at the average entropy per step, averaged over the probability distribution of position.

Slides: https://www.dropbox.com/s/prwvp0tfbv3yy ... sem_UJ.pdf

- Jarek
**Posts:**116**Joined:**Tue Dec 08, 2015 1:57 am

Manfried Faber has just written a paper with an analogous Bell-violation construction, but using only paths up to a given moment:

https://arxiv.org/pdf/1907.00175

- Jarek
**Posts:**116**Joined:**Tue Dec 08, 2015 1:57 am

Jarek wrote:Manfried Faber has just written a paper with an analogous Bell-violation construction, but using only paths up to a given moment:

https://arxiv.org/pdf/1907.00175

Thanks, this is interesting. Of course he hasn't disproved Bell's theorem. But if you like, you can think of Bell and all that as having to do with Fourier analysis, and that has everything to do with deep connections in mathematics between the discrete and the continuous, between probability and complex analysis. If a physicist can get "intuition" about these deep mathematical connections, and use it creatively to do physics, well, good for them!

Exciting developments concerning the Riemann hypothesis are going on at the moment, thanks to a deep connection with random Gaussian matrix theory. Everything is connected! Probability is more and more gaining a respected place at the heart of mathematics.

- gill1109
- Mathematical Statistician
**Posts:**1479**Joined:**Tue Feb 04, 2014 10:39 pm**Location:**Leiden

Richard, while I can analogously violate the original Bell or CHSH inequalities, the main reason to use the "P(A=B) + P(B=C) + P(A=C) >= 1" inequality - "tossing 3 coins, at least 2 are equal" - is that it is absolutely obvious: it leaves no place for hiding in sophisticated mathematics, even Fourier analysis.

This is just trivial combinatorics, which the QM formalism nevertheless allows to violate.

While you can try to argue about other inequalities, this one leaves no place for that - you can practically only use combinatorics to explain how physics can violate it ... and the simplest option, a uniform path ensemble, allows for such an explanation.

It is nonlocal in the standard sense, but it is "4D local": in space-time.

Please comment - do you agree with this Born-rule/Bell-violation construction? If not, explain why, and suggest an alternative explanation of how physics can violate such an obvious inequality.

- Jarek
**Posts:**116**Joined:**Tue Dec 08, 2015 1:57 am

Jarek wrote:Richard, while I can analogously violate the original Bell or CHSH inequalities, the main reason to use the "P(A=B) + P(B=C) + P(A=C) >= 1" inequality - "tossing 3 coins, at least 2 are equal" - is that it is absolutely obvious: it leaves no place for hiding in sophisticated mathematics, even Fourier analysis.

This is just trivial combinatorics, which the QM formalism nevertheless allows to violate.

While you can try to argue about other inequalities, this one leaves no place for that - you can practically only use combinatorics to explain how physics can violate it ... and the simplest option, a uniform path ensemble, allows for such an explanation.

It is nonlocal in the standard sense, but it is "4D local": in space-time.

Please comment - do you agree with this Born-rule/Bell-violation construction? If not, explain why, and suggest an alternative explanation of how physics can violate such an obvious inequality.

I know the inequality is a trivial inequality in a certain context. In experiments where it is violated, the context must be false. You should ask yourself what the *physical reasons* were for supposing that, still, the inequality should hold. The *physical reasons* are physics assumptions called "local realism". The reason that certain experiments can exhibit an apparent violation of these inequalities must be that *local realism* is not true.

In view of the 2015 and later very rigorous and carefully executed experiments, I conclude that local realism is false. In particular, this means that I'm inclined to believe in irreducible randomness as part of the very fabric of reality. Our brains can't "understand" it because of what "understanding" means to our human brains. They have been programmed by evolution to do their job, and in this case, the software/hardware is not up to the job. It's "spooky".

But you don't have to "understand" it. We know very well how to do the calculations which, so far, give fantastically accurate answers. Your approach does not give understanding of what is actually going on. It just gives alternative ways to calculate.

- gill1109
- Mathematical Statistician
**Posts:**1479**Joined:**Tue Feb 04, 2014 10:39 pm**Location:**Leiden

Indeed, while our intuition demands it, there is now no doubt that "local realism" is incorrect.

The question is how to repair it without resorting to magic - e.g. by building models based on physical assumptions which are also able to violate such inequalities, thereby pointing out where our "local realism" intuition goes wrong.

My point here is that the missing assumption is time/CPT symmetry, which is at the heart of the (Lagrangian-mechanics) theories we use at all scales - from QFT and unitary QM to GRT - but goes against our natural intuition, which is very time-asymmetric.

Using time-symmetric "4D local realism" instead - in space-time - the basic object is no longer a particle but its trajectory, continuous for 4D locality.

Ensembles of such objects - Feynman's in QM, or Boltzmann's in statistical physics and MERW - exhibit, for example, strong Anderson localization, like real physics and unlike standard diffusion, which is local in the standard sense and often very wrong. MERW is nonlocal in the standard sense, but local in the 4D sense: it is just a path ensemble.

And the constructions here show that we also get other nonintuitive quantum properties from "4D local realism" with path ensembles: the Born rule and the resulting Bell violation.

What do you think about this "4D local realism: path ensembles" explanation of quantum "spookiness"? Do you know a better explanation?

- Jarek
**Posts:**116**Joined:**Tue Dec 08, 2015 1:57 am


Return to Sci.Physics.Foundations
