by gill1109 » Tue Feb 04, 2014 11:37 pm
I think that Fred has not understood the transition from Section 2 to Section 3 of my paper.
Section 2 is about some elementary facts concerning an Nx4 spreadsheet of numbers +/-1. The columns are labelled A, A', B, B'. In each row, AB + AB' + A'B - A'B' = A(B + B') + A'(B - B'); since exactly one of B + B' and B - B' equals 0 while the other equals +/-2, it is easy to check that each row's value can only be -2 or +2. The average of (AB + AB' + A'B - A'B') over the rows therefore lies between -2 and +2. But of course it also equals ave(AB) + ave(AB') + ave(A'B) - ave(A'B').

Now suppose that for each row, all independently of one another, we completely randomly pick just one of the four products AB, AB', A'B, A'B'. Average the values of AB over the rows for which we chose AB, and so on. We get four new averages, each based on a random selection of about a quarter of the whole table. Each of these four averages will be close to the corresponding average over the whole table. Hence with large probability, sample_ave(AB) + sample_ave(AB') + sample_ave(A'B) - sample_ave(A'B') will lie in a slightly larger interval, from just below -2 to just above +2.
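The spreadsheet argument is easy to check numerically. Here is a minimal sketch (the table entries are just random signs for illustration; any +/-1 entries whatsoever would do):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

# An N x 4 spreadsheet of +/-1 values; columns play the roles of A, A', B, B'.
table = rng.choice([-1, 1], size=(N, 4))
A, Ap, B, Bp = table.T

# Per-row value of AB + AB' + A'B - A'B'. It factors as A(B+B') + A'(B-B'),
# and exactly one of B+B', B-B' is 0 while the other is +/-2: each row gives +/-2.
s = A*B + A*Bp + Ap*B - Ap*Bp
assert set(np.unique(s)) <= {-2, 2}

# Whole-table combination of averages; it equals mean(s), so it lies in [-2, +2].
full = (A*B).mean() + (A*Bp).mean() + (Ap*B).mean() - (Ap*Bp).mean()

# Independently for each row, pick just one of the four products at random,
# and average each product only over the rows on which it was chosen.
pick = rng.integers(0, 4, size=N)
pairs = [(0, 2), (0, 3), (1, 2), (1, 3)]   # AB, AB', A'B, A'B'
sample_ave = [(table[:, i] * table[:, j])[pick == k].mean()
              for k, (i, j) in enumerate(pairs)]
sampled = sample_ave[0] + sample_ave[1] + sample_ave[2] - sample_ave[3]

print(full, sampled)   # the sampled combination stays close to the full one
```

Each sample average is based on roughly N/4 rows, so its deviation from the whole-table average is of order 1/sqrt(N/4), which is where the "slightly larger interval" comes from.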
So the question is, does this have anything to do with quantum mechanics?
The answer is: not directly, but it would apply to a rigorous event-based local-realistic computer simulation of a Bell-CHSH experiment.
What do I mean by that?
There are going to be N runs: N times a pair of particles is created and sent to two distant locations (N is, say, 10,000). At those two locations, two observers randomly and independently choose a setting (angle) on a measurement device. Alice chooses every time randomly between settings a and a'; Bob between b and b'. The setting goes into the measurement device, the particle arrives at the measurement device, and out comes an outcome +/-1.
Simulate this on a network of computers. N times, the following happens. Hidden variables from the source arrive at Alice's measurement computer; Alice doesn't see them but hits button 1 or button 2 to choose between a and a', i.e., to choose between observing A and A'. Far away, Bob does the same thing: he hits button 1 or button 2 to choose between b and b', i.e., between observing B and B'. The computers print out the setting and the outcome. (Alice and Bob choose their settings by tossing fair coins.)
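The structure of one run can be sketched in a few lines. The particular source distribution and measurement rule below are placeholders of my own invention (any local-realistic rule would do); the only point is the architecture: each outcome depends on the local setting and the shared hidden variable alone.

```python
import math
import random

def source():
    # Hidden variable sent to both wings: here just a random angle
    # (an illustrative assumption, not a specific physical model).
    return random.uniform(0.0, 2.0 * math.pi)

def measure(setting, hidden):
    # Deterministic local measurement function: the outcome depends only
    # on the local setting and the hidden variable (a toy rule).
    return 1 if math.cos(2.0 * (setting - hidden)) >= 0 else -1

# Illustrative setting choices for the two wings.
a, a_prime = 0.0, math.pi / 4
b, b_prime = math.pi / 8, 3 * math.pi / 8

N = 10_000
records = []
for _ in range(N):
    lam = source()
    alice_setting = a if random.random() < 0.5 else a_prime  # fair coin
    bob_setting = b if random.random() < 0.5 else b_prime    # fair coin
    records.append((alice_setting, measure(alice_setting, lam),
                    bob_setting, measure(bob_setting, lam)))
# Each record holds (Alice's setting, her outcome, Bob's setting, his outcome),
# just as the two computers would print them out.
```

Note that `measure` never sees the distant setting, which is exactly the locality constraint the simulation must respect.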
The point is that although Alice only gets to make one choice in each run, we can perfectly well define both outcomes she would have observed, had she hit either button 1 or button 2. They are just the results of running a certain computer program with input 1 and the hidden variables from the source, or with input 2 and the same hidden variables. Thus both A and A' "exist" - she just chooses randomly which one to actually observe.
If we suppose that the computer simulation does not exploit the memory loophole, so that each run can be considered completely separately from the previous runs, we see that we can use the computer programs to generate the Nx4 spreadsheet I talked about before. (In other work I used martingale theory to extend the results from memory-less systems to systems with memory.)
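Because each wing's outcome is a deterministic function of the local setting and the hidden variables, the same programs can be run with both inputs to fill in all four counterfactual outcomes per run, which is exactly the Nx4 spreadsheet. A sketch, again using a toy measurement rule and illustrative settings of my own (not the actual model under discussion):

```python
import math
import random

def measure(setting, hidden):
    # Same toy deterministic local measurement rule as before.
    return 1 if math.cos(2.0 * (setting - hidden)) >= 0 else -1

a, a_prime = 0.0, math.pi / 4
b, b_prime = math.pi / 8, 3 * math.pi / 8

N = 10_000
rows = []
for _ in range(N):
    lam = random.uniform(0.0, 2.0 * math.pi)
    # All four counterfactual outcomes are well defined for the same hidden
    # variable, even though only one setting per wing is chosen in a real run.
    rows.append((measure(a, lam), measure(a_prime, lam),
                 measure(b, lam), measure(b_prime, lam)))

# Whole-table averages of the four products: the CHSH combination is bounded
# by 2 in absolute value, since every row contributes exactly +/-2.
ave = lambda i, j: sum(r[i] * r[j] for r in rows) / N
chsh = ave(0, 2) + ave(0, 3) + ave(1, 2) - ave(1, 3)
assert -2.0 <= chsh <= 2.0
print(chsh)
```

Whatever local rule is plugged into `measure`, the assertion can never fail: that is the content of the spreadsheet argument applied to a memory-less event-based simulation.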
From a suite of computer programs simulating a Bell-CHSH experiment one generalizes to the metaphysical concept of a local realistic physical theory underlying the QM of that experiment. Or if you like, to the concept of a local hidden variables model underlying QM.
One may now ask: how come Joy Christian claims that his model has been implemented in Python, Java, Mathematica ... and allows a perfect local-realistic event-based (i.e., particle-by-particle) simulation which violates the CHSH inequality?
If anyone is interested I can answer that question too, but maybe we should start a new thread for that topic.