In the small town near Milano where I live they are building condos with solar panels on their roofs, both thermal and photovoltaic, and with heating and cooling drawn from the ground. I think there is a European directive saying that from 2020 all new buildings must be nearly self-sufficient in energy terms. This is a way to reduce both imports of methane gas from Russia and air pollution.
Tullio

Here in California it's possible to make your electric meter spin backwards (well, the newer meters are digital) by generating your own electricity – if you generate more than you use, the excess flows back through the meter to help supply the grid, so instead of getting a bill for electricity, you get a check. It's called Net (Energy) Metering.
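In case it helps, the billing idea can be sketched in a few lines of Python - the rate and kWh figures here are made up for illustration, and real tariffs (tiers, time-of-use, credit rollover) are more involved:

```python
# Net metering in its simplest form: the bill is consumption minus
# generation over the billing period, priced at one flat rate.
# The $0.25/kWh rate and the kWh figures are made-up examples.

def net_bill(kwh_used, kwh_generated, rate_per_kwh=0.25):
    """Negative result means the utility owes you a credit/check."""
    return (kwh_used - kwh_generated) * rate_per_kwh

print(net_bill(500, 650))   # -37.5 : a credit for the surplus 150 kWh
```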

Another interesting possible route to fusion that wasn't covered in the previous link is Bubble fusion, a type of inertial confinement. There are several types of confinement, most of which are either inertial or magnetic. The conditions required to produce enough energy from the reaction for it to sustain itself are given in a general way with the Lawson criterion – and at the bottom of that page are links to many different experiments of the various methods.

Mike, thanks for the link to the James Binney lectures (in one of the other threads). They've been a big help for becoming more familiar with the 'bra-ket' notation and especially with how to perform calculations. I thought it would be a good idea to watch each one the first time through, just following along in my head. But in one of the lectures (#6, I think), after referring to notes to help find an omitted sigma-squared term a few steps back, he says, "I was trying to do some of this in my head, which is dangerous." :)

After watching the lectures on potential wells and tunneling, still thinking dangerously, I thought I'd try googling 'fusion wave functions' and I ran across some physics on the infamous 'cold fusion': Fleischmann Pons fusion – the website looks like a work in progress but what's there so far sounds plausible, especially since muon-catalyzed fusion occurs. Actually, from reading the Wiki article on cold fusion, I wonder - could the intermittent heating results (and 'heat-after-death') have been caused by muons from cosmic rays? I'm guessing most are moving too fast but I'm pretty sure there's a range of velocities from near-light speed to zero (where a muon would come to a complete stop at a nucleus somewhere in the middle of the calorimeter).

To give a rough idea of how many muons may have been zipping through their experiments, watch the MINOS-Live Event Viewer for a couple minutes and count how many cosmic muons are detected - and this is a (neutrino) detector operating in an old iron mine about 700 meters underground! In an overview of the experiment it's stated that the cosmic ray rate at this detector (the far one) is about once a second, while at the near detector (nearer to Fermilab), closer to the earth's surface (depth of only 100 meters), the cosmic ray rate is about 270 per second. So multiply what you watched by a factor of about 300 for a cold fusion experiment on the earth's surface, would that be right?
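For a back-of-envelope check on the surface rate: the oft-quoted rule of thumb is about one cosmic-ray muon per square centimetre per minute at sea level (the exact flux varies with latitude, altitude, and detector geometry), so the count through a benchtop cell depends mostly on its cross-section. A rough sketch, with a made-up cell size:

```python
# Rough cosmic-ray muon count through a horizontal surface at sea level,
# using the textbook rule of thumb of ~1 muon per cm^2 per minute.
SEA_LEVEL_FLUX = 1.0 / 60.0          # muons per cm^2 per second (approx.)

def muon_rate(area_cm2, flux=SEA_LEVEL_FLUX):
    """Approximate muons per second through a horizontal area in cm^2."""
    return area_cm2 * flux

rate = muon_rate(10 * 10)            # a hypothetical 10 cm x 10 cm cell top
print(f"~{rate:.1f} muons/s, ~{rate * 3600:.0f} per hour")
```

So even a small calorimeter would see muons at a rate not far off the MINOS near detector - thousands per hour - though how many actually stop in the cell is another question entirely.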

There's also a link to the latest events of each type (here) and if you scroll half way down you'll see the event classification called â€œLast high multiplicity multiple muonâ€ - I think these happen several times a day. Scroll all the way down to see the â€œLast ultra high multiplicity multiple muonâ€ event, from back in 2005. Gotta wonder what got hit by what to produce that!

Quote:

Mike, thanks for the link to the James Binney lectures (in one of the other threads). They've been a big help for becoming more familiar with the 'bra-ket' notation and especially with how to perform calculations. I thought it would be a good idea to watch each one the first time through, just following along in my head. But in one of the lectures (#6, I think), after referring to notes to help find an omitted sigma-squared term a few steps back, he says, "I was trying to do some of this in my head, which is dangerous." :)

Yup, I thought James Binney's approach was quite good. Quantum mechanics is such a subtle topic that I don't think it hurts to gain different or fresh views.

Historically QM had two lines of development : wave mechanics and matrix mechanics. One emphasized the wave-function/differential-equation evolution aspect, the other state transitions via operators. Both were eventually reconciled as equivalent. Not being especially well versed in matrices at the time, Heisenberg more-or-less inadvertently re-invented a lot of matrix mathematics when producing his matrix version. So a 'ket' is a specific vector/list of complex numbers ( components resolved along some basis set ) that fully specify the state of a system, say its energy spectrum. Collectively all such kets form a vector space with familiar linear rules of combination. A 'bra' is a function that acts on kets eventually yielding, using other rules as well, physical predictions. The totality of bras also forms a different vector space entirely.
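For a toy two-state system that bookkeeping can be played with directly - nothing here is specific to any lecture, just the generic rules: a ket is a column of complex components, the matching bra is the complex conjugate acting from the left, and squared amplitudes give probabilities:

```python
from math import sqrt

# A ket as a list of complex components in some chosen basis; the matching
# bra conjugates its components, and <a|b> is the resulting dot product.
def braket(a, b):
    """<a|b> for two kets given as component lists (conjugate the first)."""
    return sum(x.conjugate() * y for x, y in zip(a, b))

ket_psi = [1 / sqrt(2), 1j / sqrt(2)]   # a normalised 2-state ket |psi>
ket_phi = [1, 0]                        # a basis ket |phi>

print(abs(braket(ket_psi, ket_psi)))       # ~1.0 : normalisation <psi|psi>
print(abs(braket(ket_psi, ket_phi)) ** 2)  # ~0.5 : a transition probability
```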

It's a neat system/notation and requires careful thought for sure, but it's the best found so far to deal with the tricky-dicky probability aspect. That's why most bemoan the anti-intuitive nature of QM : you crunch the machine and the right answer comes out even if it doesn't feel right.

And don't get me started on 'spin'!! Why can't you know all the directional components at once ?! You can't even get two out of three directions. One can resolve along some single preferred direction, but as soon as you do you then lose the projections onto the perpendicular planes. But it is so real. Whack some suitable stuff through a Stern-Gerlach apparatus and lo and behold the beam splits into discrete streams. Chain S-G apparati together, with different relative orientations for each of their preferred directions, and the 'hidden' degrees scramble. It comes down to being in a 3-D world and non-commutation, and it's just not fair ..... ;-) :-)

Cheers, Mike.

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

For all interested in QM, the QuantumFire project, based in Oxford University, is trying to support the De Broglie-Bohm version of QM with its pilot wave and against the standard Copenhagen interpretation. I am taking part in that project.
Tullio

Quote:

And don't get me started on 'spin'!! Why can't you know all the directional components at once ?! You can't even get two out of three directions. One can resolve along some single preferred direction, but as soon as you do you then lose the projections onto the perpendicular planes. But it is so real. Whack some suitable stuff through a Stern-Gerlach apparatus and lo and behold the beam splits into discrete streams. Chain S-G apparati together, with different relative orientations for each of their preferred directions, and the 'hidden' degrees scramble. It comes down to being in a 3-D world and non-commutation, and it's just not fair ..... ;-) :-)

Cheers, Mike.

Hmm, instead of chaining them, how about "co-locating" them but rotated about the particle beam path?
(Click thumbnails for full size image)
Would you get the same number of spots in the target area as there are poles? (i.e., six spots for this type) Or would you get some kind of mix? What if the modified apparatus was made to rotate?

Quote:

For all interested in QM, the QuantumFire project, based in Oxford University, is trying to support the De Broglie-Bohm version of QM with its pilot wave and against the standard Copenhagen interpretation. I am taking part in that project.
Tullio

Thanks Tullio, that sounds very interesting! Very cool overview in the 'Quantum Foundations and Pilot Wave Theory' link. I recall asking (years ago, in some thread) about 'hidden variables', and Bell's inequality was one of the responses. Later I found a Wikipedia article on pilot wave theory ...

So I'm less hesitant to mention some trouble I'm having with the lecture on Bell's inequality (Binney, #17 I think). It's one I had to watch twice before going on, and I still have questions: like, it's a great idea of Bell's, introducing a variable to account for 'hidden variables', but in calculations of the probabilities it gets "averaged" out of the equation? How does that not defeat the purpose of introducing it? But more to the point, don't the rules, mechanics, and kets of QM constitute a set of 'hidden variables' that should have been accounted for by the variable Bell introduced specifically for that purpose?

Quote:

Hmm, instead of chaining them, how about "co-locating" them but rotated about the particle beam path?

What I mean is : you split the beam, select one of the sub-beams and use a rotated S-G on that. Grab one of its sub-beams and apply yet another S-G using the original orientation. You get just as many sub-beams again. The only difference is intensity as you've been throwing out lots of beams.

You thought you selected, say, the z-component of spin first up. And you did. But that doesn't persist. As soon as you select another direction with the second S-G, that first selection gets lost. Subsequent analysis in the z-direction with the third S-G is just like the first. Indeed the beams are lines, not spots, discretely spaced with respect to an S-G's selecting direction, but spread orthogonal to that.
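That 'selection gets lost' behaviour can be checked numerically with just the spin-1/2 overlap amplitudes - no field modelling, only the state bookkeeping (probability = squared overlap):

```python
from math import sqrt

# Sequential Stern-Gerlach selections for spin-1/2, as overlap amplitudes.
# States are 2-component complex lists; probability = |<a|b>|^2.
def prob(a, b):
    amp = sum(x.conjugate() * y for x, y in zip(a, b))
    return abs(amp) ** 2

z_up = [1, 0]                       # selected by the first (z) S-G
x_up = [1 / sqrt(2), 1 / sqrt(2)]   # selected by the rotated (x) S-G

print(prob(x_up, z_up))   # ~0.5 : half of the z+ beam passes the x+ port
print(prob(z_up, x_up))   # ~0.5 : re-analysing in z splits 50/50 again,
                          #        as if the first z selection never happened
```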

So what is labeled as the total spin vector never gets measured at any instant. For a spin 1/2 it is deemed, via Pythagoras, to have a total magnitude equal to the longest diagonal of a cube [ sqrt(3) times a single side length ]. But you only measure along one cube edge at a time, not then knowing about the others. So you know only which face of the cube you are on, and you are welcome to change faces as much as you like! We hypothecate a 'normal' 3-D vector but find it doesn't work how we 'expect'.
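The cube picture checks out against the standard formulas: each measured component is s·ħ = ħ/2 (one cube edge), while the total magnitude is √(s(s+1))·ħ, and the ratio is indeed √3:

```python
from math import sqrt

# Spin-1/2 in units of hbar: component s = 1/2 per cube edge,
# total magnitude sqrt(s*(s+1)) along the cube's long diagonal.
s = 0.5
component = s
magnitude = sqrt(s * (s + 1))

print(magnitude)              # ~0.866, i.e. sqrt(3)/2 in units of hbar
print(magnitude / component)  # ~1.732, the sqrt(3) of the cube diagonal
```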

Co-locating doesn't work because one then spoils any preferred directions .... or put another way, the magnetic fields add linearly so you wind up with a single direction anyway - and it selects on that. Catch-22 .... :-)

Cheers, Mike.

( edit ) This is a generic issue with what are stated as conjugate variables in QM. Measuring one worsens the measurement of another. So in the x-direction, say, momentum and distance are like that. Select or narrow the beam, via a slit say, and it sprays out beyond that point - diffraction. It is as though there is some 'vector' in action space ( i.e. with units of angular momentum thus of Planck's constant ) that plays hide-n-seek with its components. Can't measure both at one event. As spin is angular momentum then exact knowledge of one component completely blows off the other two. So that's why you only get one of three, and not two of three spin components. If you did know two and then applied conservation of angular momentum one could deduce the third you see ...... and the demons of QM won't allow that! :-)
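The non-commutation behind all this can be verified with the spin-1/2 component matrices themselves (in units where ħ = 1, so each matrix is half a Pauli matrix) - the commutator of two components is not zero but proportional to the third:

```python
# Non-commutation of spin-1/2 components, in units where hbar = 1.
# 2x2 complex matrices as nested lists; Sx, Sy, Sz are half the Paulis.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

Sx = [[0, 0.5], [0.5, 0]]
Sy = [[0, -0.5j], [0.5j, 0]]
Sz = [[0.5, 0], [0, -0.5]]

# [Sx, Sz] = Sx Sz - Sz Sx ... which works out to -i*Sy, not zero,
# so a z measurement genuinely scrambles any prior x knowledge.
comm = [[matmul(Sx, Sz)[i][j] - matmul(Sz, Sx)[i][j] for j in range(2)]
        for i in range(2)]
print(comm)
```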

( edit ) The 'normal' 3-D vector approach to thinking about spin isn't totally hopeless though. With spin and angular momentum involved one can arrange matters to cause precession about some axis. That doesn't yield a full 3-D direction at any moment in time, but by assuming that there is such a thing ( even though we can't measure it from one instant to the next ) gives a frequency of said precession that agrees well with experiment. Mind you we had to give up the idea of exact orbits à la Newton when considering electrons in atoms, and that did go very well in the end. So perhaps it's better to say that the QM truth which is beyond our macroscopic realm of measurement is real, and our efforts are approximate. So the wave function is deterministic - it's just the measurement which fails! Welcome to the QM tarpit. :-)


Quote:

So I'm less hesitant to mention some trouble I'm having with the lecture on Bell's inequality (Binney, #17 I think). It's one I had to watch twice before going on, and I still have questions: like, it's a great idea of Bell's, introducing a variable to account for 'hidden variables', but in calculations of the probabilities it gets "averaged" out of the equation? How does that not defeat the purpose of introducing it? But more to the point, don't the rules, mechanics, and kets of QM constitute a set of 'hidden variables' that should have been accounted for by the variable Bell introduced specifically for that purpose?

The Bell construct is like a proof by contradiction .... you introduce a concept to show the difficulties occurring because you did that. Ergo the concept is false, or unnecessary in this case. This is reminiscent of Einstein's comment on the aether - he didn't originally say it was or wasn't there, but that it was superfluous. So there's an implicit invocation of Occam's Principle too.

What Bell showed was that having hidden variables does not add to predictions - results are the same in either case - so why have them? As for the other QM components you mention, they are present because they do influence predictions. You change a ket or somesuch and the outcomes change ( measurements alter ). The truly annoying bit is that outcome here means 'probability distribution' or 'expected value' in some statistical sense.
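To make 'results are the same in either case, until they're not' concrete: the CHSH form of Bell's inequality bounds a certain combination of correlations by 2 for any local hidden-variable account, while the QM singlet-state correlation E(a,b) = -cos(a-b) pushes it out to 2√2. A quick numerical check with the standard choice of analyser angles:

```python
import math

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b'),
# with the QM singlet correlation E(a,b) = -cos(a - b).
# Local hidden variables require |S| <= 2; QM reaches 2*sqrt(2).
def E(a, b):
    return -math.cos(a - b)

a, ap = 0.0, math.pi / 2
b, bp = math.pi / 4, 3 * math.pi / 4
S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(S)   # ~ -2.83, i.e. |S| = 2*sqrt(2), beyond the classical bound of 2
```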

This is not a measurement issue in the 'old' style meaning of a crude device that needs adjusting to refine accuracy, or someone bumped the equipment, BUT that the variable in question did-not/does-not/never-will possess an exact value. Some explanations of Heisenberg's Uncertainty Principle are misleading in that they focus the discussion on interaction energies ( finer distance resolution needs higher momenta, with said momenta overtly altering the paths etc ) - all statements true and correct - BUT that doesn't get to the heart of the conundrum.

If an electron leaves an electron gun and later is discovered to hit a distant screen - the intervening space containing a wall with two slits - then analysis of counts over a large number of electrons leads to an irresistible conclusion that each electron interfered with itself and we cannot even say which slit any of them took. If we plonk a 'which way' device to catch them at one or other slit we change the experiment to a different one altogether : one where interference patterns on the distant screen do not occur!!
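The 'interfered with itself' statement is just complex amplitudes adding before being squared. A one-point illustration, with a hypothetical half-wavelength path difference between the two slits at some screen position:

```python
import cmath

# At one screen position, suppose slit B's path is half a wavelength
# longer than slit A's (a made-up geometry, purely for illustration).
a = 1 + 0j                          # amplitude via slit A
b = cmath.exp(1j * cmath.pi) * a    # via slit B: phase-shifted by pi

coherent   = abs(a + b) ** 2        # amplitudes add first: a dark fringe
incoherent = abs(a) ** 2 + abs(b) ** 2   # 'which way' known: no fringe

print(coherent, incoherent)         # ~0.0 versus 2.0
```

The counts on a real screen follow the coherent sum; put in the 'which way' device and they follow the incoherent one.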

Thus you can't talk emphatically of what you can't measure. You can't say the electron took slit A or slit B if no measurement supports that. You can't say a spin-half particle has a total spin vector of certain magnitude and direction; we thus defer to statements of measurable projections or components. You can't say an electron orbits with a so-and-so shaped path of given dimensions in an atom when a single interaction ionises it. The 'old' quantum theory of Bohr with Sommerfeld talked of electromagnetically bound solar-system type orbits : radii, energies, eccentricities etc. It sort of fit some results but left a vast amount of experimental data unexplained.

Thus outcomes morph to probability. For measurement this means/implies several things :

- you can only reliably talk of data in groups. One point does not a curve make. Even worse .....

- a 'full' measurement has infinite points. So if your calculation leads to some mathematical distribution then I can only approximate that distribution in the lab with a finite run of results. So if we get a Higgs detection at the LHC that really means we have data grouped as Higgs detections - plural. The Higgs mass, say, will be derived from a mean of a series of points. I think even the width of such a distribution tells us other stuff too - decay modes perhaps?

- more data means a better answer, providing each point is likewise obtained. So if I pass a succession of photons through an optical system, to be rigorous I have to hold circumstances constant. Or if I throw a bucket of electrons in then I want each to start their journey in the same state. There was an experimenter who did a two slit job with photons from a source of such a low intensity that he went sailing for a few months while the interference curve 'cooked'. Interference did not require any two photons to be around at about the same time - hence the statement 'it interferes with itself'.

- all states suffer probability. What I think is an initial preparation has variance, a 'which way' slit gadget does too, as well as the final screen. So a set of electrons emitted from some gun are never all the same - they're not initially exact with variations only creeping in later on.

- in practice the probability widths & variances rapidly go to effectively zero ( or below resolution limits ) with scale. Classical limit, Correspondence Principle or effective theory et al. Put another way : h-bar is really small in human scale units.
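On the width remark a few points up: for a resonance, the measured width Γ is the total decay rate, tied to the lifetime by τ = ħ/Γ, and the branching fractions carve that width up among the decay modes. Taking the Standard Model prediction of roughly 4 MeV for the Higgs total width as a rough input:

```python
# Lifetime of a resonance from its width: tau = hbar / Gamma.
HBAR_GEV_S = 6.582e-25     # hbar in GeV*s
GAMMA_GEV  = 4.1e-3        # ~4.1 MeV, roughly the predicted SM Higgs width

tau = HBAR_GEV_S / GAMMA_GEV
print(f"tau ~ {tau:.1e} s")   # on the order of 1e-22 seconds
```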

I suppose another alternative is to 'blame' the environment for the probability. So the photon/electron is a definite thing but goes/travels by some path because of its circumstances. Which is true in that opening and closing slits etc will change matters. But here we are just labeling. What is the wall and slit but yet another collection of electrons and the like? Do we have the wave function for the surrounds included in our kets? As separated terms, no. What we have is a description that includes the differences in surroundings - [ wall with 2 slits open ] minus [ wall with one slit open ]. Or [ two slit setup with 'which way' device ] minus [ two slit setup without 'which way' device ]. Can you say what [ wall ] or [ two slit setup ] mean by themselves?

Cheers, Mike.

( edit ) So I guess the kets |1> and |2> for an electron going thru slit 1 and 2 respectively each incorporate the 'environs' assumptions of the experiment : a gun here, a wall there, a slit and then another slit in given positions, with a screen thus far away etc.

( edit ) Yeah, you'd have a 'dark current' assumption too. Meaning what does the screen show when the electron/photon gun is 'off' ?


Quote:

Co-locating doesn't work because one then spoils any preferred directions .... or put another way, the magnetic fields add linearly so you wind up with a single direction anyway - and it selects on that. Catch-22 .... :-)

What about a cylindrical version of a Halbach array? It would provide a field which is symmetrical in the plane orthogonal to the beam path, and for the cylindrical variety where k > 2 it's guaranteed to be an inhomogeneous field. I understand what happens when S-G's are linked one after another. Why not try a simultaneous measure of at least two axes? How do you know that spin occurs in more than one axis, anyway? Maybe the fundamental wave/particle geometry, being half-integer amounts of spin to begin with, is only able to spin in one axis at a time? Maybe they can be treated as point-like, but are they 1-dimensional segments or 2-dimensional loops/disks or what? I was thinking that could be determined using sections of a cylindrical Halbach array that vary in their radius, length along the beam path, and vary in k-number from k = 3 to however many are possible. And additionally, using the output of an S-G as the input to these sections would be a good way to be certain of the initial states, right?

-edit: thanks for getting to the question on Bell's inequality, will digest it as soon as I can :)

The problem I have is with the Standard Model... Where is all this anti-matter? The world spends a lot of money looking for particles based on this model.

Thank you, Sir! I think the world might be missing something by focusing all attention on unifying the gravitational force with the other three fundamental forces (strong, weak and electromagnetic). Antimatter does exist: in a nutshell, the positron is the antiparticle of the electron, with the same mass but opposite charge and its own distinct spin. My conclusion is that no time is wasted exploring other avenues, so long as no harm is done to E=mc².

NB: Maybe it could be as elusive as "I ask a friend of mine where is up, he pointed to the Sky. And where is down, he pointed to the Earth".

I think I made a mistake, citing Oxford as the location of the QuantumFire project. In fact it is the Cavendish Laboratory of Cambridge. Sorry.

Tullio

This is an article on Bell inequalities I found in "Nature" magazine:

Bell inequalities

Tullio


- all states suffer probability. What I think is an initial preparation has variance, a 'which way' slit gadget does too, as well as the final screen. So a set of electrons emitted from some gun are never all the same, initially exact with variations creeping in later on.

- in practice the probability widths & variances rapidly go to effectively zero ( or below resolution limits ) with scale. Classical limit, Correspondence Principle or effective theory et al. Put another way : h-bar is really small in human scale units.

I suppose another alternative is to 'blame' the environment for the probability. So the photon/electron is a definite thing but goes/travels by some path because of it's circumstances. Which is true in that opening and closing slits etc will change matters. But here we are just labeling. What is the wall and slit but yet another collection of electrons and the like? Do we have the wave function for the surrounds included in our kets? As separated terms, no. What we have is a description that includes the differences in surroundings - [ wall with 2 slits open ] minus [ wall with one slit open ]. Or [ two slit setup with 'which way' device ] minus [ two slit setup without 'which way' device ]. Can you say what [ wall ] or [ two slit setup ] mean by themselves?

Cheers, Mike.

( edit ) So I guess the kets |1> and |2> for an electron going thru slit 1 and 2 respectively each incorporate the 'environs' assumptions of the experiment : a gun here, a wall there, a slit and then another slit in given positions, with a screen thus far away etc.

( edit ) Yeah, you'd have a 'dark current' assumption too. Meaning what does the screen show when the electron/photon gun is 'off' ?

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

## RE: Co-locating doesn't

)

What about a cylindrical version of a Halbach array? It would provide a field which is symmetrical in the plane orthogonal to the beam path, and for the cylindrical variety where k > 2 it's guaranteed to be an inhomogeneous field. I understand what happens when S-G's are linked one after another. Why not try a simultaneous measure of at least two axes? How do you know that spin occurs in more than one axis, anyway? Maybe the fundamental wave/particle geometry, being half-integer amounts of spin to begin with, is only able to spin in one axis at a time? Maybe they can be treated as point-like, but are they 1-dimensional segments or 2-dimensional loops/disks or what? I was thinking that could be determined using sections of a cylindrical Halbach array that vary in their radius, length along the beam path, and vary in k-number from k = 3 to how ever many is possible. And additionally using the output of an S-G as the inputs to these sections would be a good way to be certain of the initial states, right?

-edit: thanks for getting to the question on Bell's inequality, will digest it as soon as I can :)

## RE: The problem I have is

)

Thank you,Sir! I think the world might be missing something focusing all attention on unifying the gravitational force with the other three fundamental forces(Strong,Weak and Electromagnetic forces).There are antimatters. On a nutshell, positron being an antiparticle of electron with its district spin, +1/3 charge and -2/3 mass of electron yet of variation in density. My conclusion is that, there is no time wasted exploiting other avenues without dealing harm to E=mcÂ².

NB: Maybe it could be as elusive as "I ask a friend of mine where is up, he pointed to the Sky. And where is down, he pointed to the Earth".

http://teamnaija.freeforums.org/what-can-be-faster-than-the-speed-of-light-t4.html