The modern program of science ( QM particularly ) is that one can only constrain a theory's components by their success in predicting observables. Thus non-observable components are unconstrained and can take any form, provided the predictions agree with experiment.

[ Simplicity is also preferred, and perhaps some elegance too ... ]

Is this not where we invoke the Heisenberg (uncertainty) principle to predict the limits of what we can hope to detect with presently available tools?

Quote:

Take a photon passing from point A to point B. Then who is to know what happened in between?

Indeed so, unless we can reliably and repeatedly intercept the photon (or multiple identical photons) at sample points between A and B. Hence experiments such as Young's slits and its variations.
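As an aside, the two-slit fringe pattern itself follows from just adding two phasors. A minimal sketch, with illustrative wavelength and slit spacing (not taken from any particular experiment):

```python
import cmath
import math

def two_slit_intensity(theta, d, wavelength):
    """Intensity (arbitrary units) from two ideal point slits.

    Each slit contributes a unit phasor; the path difference d*sin(theta)
    sets the relative phase. |sum|^2 gives the familiar cos^2 fringes.
    """
    phase = 2 * math.pi * d * math.sin(theta) / wavelength
    amplitude = 1 + cmath.exp(1j * phase)   # phasor sum over the two slits
    return abs(amplitude) ** 2

# Illustrative numbers only: 500 nm light, 10 micron slit spacing.
lam, d = 500e-9, 10e-6
central = two_slit_intensity(0.0, d, lam)                          # constructive
first_null = two_slit_intensity(math.asin(lam / (2 * d)), d, lam)  # destructive
print(central, first_null)
```

The central maximum comes out at four times a single slit's intensity, and the first null sits where the path difference is half a wavelength.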

Some suspicions for that example:

Rather than an individual photon 'instantaneously exploring' all possible paths, could the photon energy not simply be following a potential gradient? Just as for gravitational fields, the "photon potential field" is 'already there', so there is no requirement for superluminal 'probing' of all possible paths;

We had for many years the 'controversy' of unexpectedly low neutrino counts from our sun until the discovery that neutrinos oscillate between different types of neutrino... Is there any evidence for or against the photon similarly perpetually oscillating between an electron-positron pair and a photon?

Quote:

[...]
With the energy form of the uncertainty relations :

dE * dt ≳ ħ

[ dE = energy uncertainty, dt = time uncertainty, ħ = reduced Planck constant ]

then I can make particles with any total energy ( dE ) I like ( from 'empty' vacuum if I'm keen ) provided dt is short enough. ...

The Casimir effect...

The putative Hawking radiation at black hole event horizons follows a similar pattern, except that one of a particle pair gets irretrievably lost 'over the fence' while the other escapes into the distance to be seen. Loss of energy = loss of mass -> the hole evaporates... eventually :-)
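For a rough feel of the numbers: taking dt ~ ħ/dE for the rest energy of an electron-positron pair gives the fleeting timescale such a 'borrowed-energy' fluctuation is allowed. A back-of-envelope sketch, order of magnitude only:

```python
# Order-of-magnitude lifetime of a 'borrowed-energy' electron-positron pair,
# using dt ~ hbar / dE from the uncertainty relation quoted above.
HBAR = 1.054571817e-34        # J*s
EV = 1.602176634e-19          # J per eV
ELECTRON_REST = 0.51099895e6  # eV

dE = 2 * ELECTRON_REST * EV   # energy to 'borrow' for an e-/e+ pair
dt = HBAR / dE                # allowed duration of the fluctuation
print(f"dE = {dE:.3e} J, dt ~ {dt:.1e} s")
```

That comes out around 10^-22 seconds, which is why such pairs are never directly caught in the act.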

Yes, and that's a fascinating aspect.

Are we perhaps seeing the aliasing effects of quantized dimensions?

For example, when a photon is in one stable moment of time, we 'see' it as a photon. Whilst transitioning to the next stable moment, we then 'see' it as an electron-positron pair until they both coalesce into the same point in time and space as a photon once more... Or?

We could invoke the Feynman diagrams and include transitory pink elephants!

Fine on the abstractions that give very good statistical predictions. But what might the 'classical' physical mechanisms be? Is causality really a probability distribution where anything can be possible if rare enough?

Quote:

We had for many years the 'controversy' of unexpectedly low neutrino counts from our sun until the discovery that the neutrinos oscillate through different types of neutrino... Is there any evidence for or against the photon similarly perpetually oscillating between positron-electron and photon?

I recall reading that the neutrinos only oscillate in the vicinity of other matter (which I thought sounded strange/interesting), and from reading elsewhere recently I learned that the primary interaction between a high-energy photon (like a gamma ray) and an atom is usually by the photon first changing to an electron/positron pair (rather than the photon being absorbed by one of the atom's electrons). It's like everything is waves (or energy thereof) except where the waves mix/interact, which then occurs as a limited number of distinctly recognizable particles – so maybe particles are just peaks/valleys in the interaction of waves where some threshold value determines the transition from wave energy to point-like concentrations of mass/charge/color ... ?

Quote:

We had for many years the 'controversy' of unexpectedly low neutrino counts from our sun until the discovery that the neutrinos oscillate through different types of neutrino... Is there any evidence for or against the photon similarly perpetually oscillating between positron-electron and photon?

I recall reading that the neutrinos only oscillate in the vicinity of other matter (which I thought sounded strange/interesting), and from reading elsewhere recently I learned that the primary interaction between a high-energy photon (like a gamma ray) and an atom is usually by the photon first changing to an electron/positron pair (rather than the photon being absorbed by one of the atom's electrons).

Ouch! Now my head does really hurt... But that does give an interesting angle for leverage...

Quote:

It's like everything is waves (or energy thereof) except where the waves mix/interact, which then occurs as a limited number of distinctly recognizable particles – so maybe particles are just peaks/valleys in the interaction of waves where some threshold value determines the transition from wave energy to point-like concentrations of mass/charge/color ... ?

Could multiple photons be forced to coalesce/combine in a constricted waveguide for example? Would multiple electron-positron pairs be simultaneously released?

Are there any two 'particles' that have exactly the same energy but different characteristics?

... I've played with prompt gamma spectroscopy and thought that it was rather too convenient that energy levels uniquely identify all atomic elements... (Even if carbon is a little 'difficult' ;-) )

Also consider that each atom in our universe gravitationally attracts all other atoms in our universe... And does so apparently 'instantaneously' (present location rather than a light-speed retarded location).

However, that is not so for the other forces, which are shielded by other matter...

Quote:

Could multiple photons be forced to coalesce/combine in a constricted waveguide for example? Would multiple electron-positron pairs be simultaneously released?

Tullio mentioned earlier that the exclusion principle doesn't apply to photons since they're bosons, and hence can occupy the same cell in phase space, which suggests that they can pass through one another without combining. I don't think a laser would function the way it does if this weren't true. And isn't there a preferential probability for bosons to occupy the same space?

Tullio also mentioned that it's an 'energetic' photon that can turn into an e-/p+ pair (I had failed to make the distinction regarding a photon that's less energetic), and so now I'm wondering about things like spin/rotation and frequency, like: are the e-/p+ going round each other at a rate proportional to the frequency of the photon? – and another thing – what keeps an electron from colliding with a proton, and why doesn't that also keep an electron from colliding with a positron?

Quote:

Also consider that each atom in our universe gravitationally attracts all other atoms in our universe... And does so apparently 'instantaneously' (present location rather than a light-speed retarded location).

What makes you think that? If it did then the present locations (as seen with light) would be affected by the future positions (future relative to light travel-time, or relative to present locations determined using light) – I don't think things would move the way they're observed to move, but I may not have thought it through properly ....

Quote:

Could multiple photons be forced to coalesce/combine in a constricted waveguide for example? Would multiple electron-positron pairs be simultaneously released?

Tullio mentioned earlier that the exclusion principle doesn't apply to photons since they're bosons, and hence can occupy the same cell in phase space, which suggests that they can pass through one another without combining. I don't think a laser would function the way it does if this weren't true. And isn't there a preferential probability for bosons to occupy the same space?

Tullio also mentioned that it's an 'energetic' photon that can turn into an e-/p+ pair (I had failed to make the distinction regarding a photon that's less energetic), and so now I'm wondering about things like spin/rotation and frequency, like are the e-/p+ going round each other proportional to the frequency of the photon? – and another thing – what keeps an electron from colliding with a proton and why doesn't that also keep an electron from colliding with a positron?

Yes on all counts, and I'm wondering why also.

Might space-time have energy-level-dependent (non-linear?) properties? As you suggest, we may just be 'imagining' 'particles' where there are certain confluences of energy poking above a threshold level... Or perhaps just distortions in the 'space-time fabric'...?

And yet the 'ether' was disproved long ago...

What is it that physically makes the distinction between a fermion(ic action) and boson(ic action)?

Quote:

Quote:

Also consider that each atom in our universe gravitationally attracts all other atoms in our universe... And does so apparently 'instantaneously' (present location rather than a light-speed retarded location).

What makes you think that? If it did then the present locations (as seen with light) would be affected by the future positions (future relative to light travel-time, or relative to present locations determined using light) – I don't think things would move the way they're observed to move, but I may not have thought it through properly ....

Think further...

If you simulate orbits using the light-retarded positions for the gravitational attraction between the orbiting bodies in, for example, the solar system, the solar system soon flies apart. The effect is even more pronounced when simulating a galaxy. To make it all 'work' as we observe, you must assume instantaneous positions for all the bodies for the action of gravitational attraction. (There is more to the story ;-) )
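A toy numerical sketch of that claim (assumptions: an equal-mass binary in units G = m = 1, and an artificially slow 'light speed' to exaggerate the effect; this is not a realistic solar-system model, and real general relativity contains velocity-dependent terms that largely cancel the aberration): attracting each body toward the other's light-retarded position pumps the orbit outward, while the instantaneous version stays put.

```python
import math

def simulate(c=None, dt=2e-3, periods=2.0):
    """Equal-mass binary (G = m = 1, separation 1) with velocity Verlet.
    c=None: force uses instantaneous positions. Otherwise each body is
    attracted toward the other's light-retarded position, with an
    artificially slow 'light speed' c to exaggerate the effect."""
    v = math.sqrt(0.5)                       # circular orbital speed per body
    pos = [[0.5, 0.0], [-0.5, 0.0]]
    vel = [[0.0, v], [0.0, -v]]
    period = 2.0 * math.pi * 0.5 / v
    history = [[], []]                       # (t, x, y) samples per body

    def retarded(j, t, observer):
        # Newest-to-oldest scan for the first sample old enough that light
        # from it could have reached the observer; fall back to the oldest.
        for (th, x, y) in reversed(history[j]):
            if (t - th) * c >= math.hypot(observer[0] - x, observer[1] - y):
                return (x, y)
        if history[j]:
            return (history[j][0][1], history[j][0][2])
        return (pos[j][0], pos[j][1])

    def accel(t):
        out = []
        for i in (0, 1):
            j = 1 - i
            src = (pos[j][0], pos[j][1]) if c is None else retarded(j, t, pos[i])
            dx, dy = src[0] - pos[i][0], src[1] - pos[i][1]
            r = math.hypot(dx, dy)
            out.append((dx / r ** 3, dy / r ** 3))   # G*m/r^2, pointing at src
        return out

    t = 0.0
    a = accel(t)
    for _ in range(int(periods * period / dt)):
        for i in (0, 1):
            history[i].append((t, pos[i][0], pos[i][1]))
            pos[i][0] += vel[i][0] * dt + 0.5 * a[i][0] * dt * dt
            pos[i][1] += vel[i][1] * dt + 0.5 * a[i][1] * dt * dt
        t += dt
        a_new = accel(t)
        for i in (0, 1):
            vel[i][0] += 0.5 * (a[i][0] + a_new[i][0]) * dt
            vel[i][1] += 0.5 * (a[i][1] + a_new[i][1]) * dt
        a = a_new
    return math.hypot(pos[0][0] - pos[1][0], pos[0][1] - pos[1][1])

sep_instant = simulate(c=None)    # stable: separation stays near 1
sep_retarded = simulate(c=3.5)    # aberrated force pumps the orbit outward
print(sep_instant, sep_retarded)
```

The retarded force acquires a component along each body's velocity, feeding in energy and angular momentum every orbit, which is the essence of the old Laplace argument against a simply lagged Newtonian force.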

Quote:

Rather than an individual photon 'instantaneously exploring' all possible paths, can not the photon energy be simply following a potential gradient? Just as for gravitational fields, the "photon potential field" is 'already there' and so there is then no requirement for superluminal 'probing' of all possible paths.

Fair point. Actually the photon is modelled as a quantized packet of the EM field, but that's not quite what you mean, i.e. what governs its evolution? Why does it go along the "shortest" distance between two points ( suitably defined in relativistic terms )? Now Feynman observed that a Principle of Least Action applies. Originally 'action' and its minimisation was a classical idea from Fermat and others.

Planck's constant has the dimensions of 'action'. Action can be defined as a quantity that varies from point to point along a path, and thus can be summed/integrated from the start of the path to the finish. So if you have a set of possible paths between two fixed end points, then you can calculate how this action integral varies depending on the path chosen. Find which path(s) has the least value for its action integral and that is the one(s) that occurs ( or is most probable )! Hey, the Price Is Right! :-)
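A minimal numerical check of that idea, for the simplest case of a free particle in one dimension: among paths with fixed endpoints, the straight (uniform-velocity) path has the smallest discrete action.

```python
import random

def action(xs, dt, m=1.0):
    """Discrete free-particle action: sum of (m/2) * v^2 * dt per segment."""
    return sum(0.5 * m * ((b - a) / dt) ** 2 * dt for a, b in zip(xs, xs[1:]))

n, dt = 50, 0.1
straight = [i / n for i in range(n + 1)]     # uniform motion from x=0 to x=1

random.seed(1)
wiggled = [x + (0.01 * random.uniform(-1.0, 1.0) if 0 < i < n else 0.0)
           for i, x in enumerate(straight)]  # same endpoints, jiggled interior

print(action(straight, dt), action(wiggled, dt))
```

Any interior jiggle with the endpoints pinned raises the action, since the mean of squared velocities exceeds the square of the mean velocity whenever the velocities are unequal.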

In classical ray tracing optics this devolves to light going along the path which takes the least time. Indeed one can deduce a lot of light behaviour just using that - like Snell's law, reflection laws and such. This is a deep property not fully understood.
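A small sketch of that least-time principle recovering Snell's law: minimise the optical path length over the choice of interface crossing point (illustrative indices and geometry, assumed for the example):

```python
import math

def crossing_point(n1, n2, src, dst, iters=200):
    """Find where a ray crosses the y = 0 interface by minimising the
    optical path length n1*L1 + n2*L2 (ternary search; travel time is
    a convex function of the crossing coordinate)."""
    lo, hi = min(src[0], dst[0]) - 1.0, max(src[0], dst[0]) + 1.0

    def opl(x):
        return (n1 * math.hypot(x - src[0], src[1]) +
                n2 * math.hypot(dst[0] - x, dst[1]))

    for _ in range(iters):
        m1, m2 = lo + (hi - lo) / 3, hi - (hi - lo) / 3
        if opl(m1) < opl(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

# Ray from air (n=1.0) above the interface to glass (n=1.5) below.
n1, n2 = 1.0, 1.5
src, dst = (0.0, 1.0), (1.0, -1.0)
x = crossing_point(n1, n2, src, dst)
sin1 = (x - src[0]) / math.hypot(x - src[0], src[1])
sin2 = (dst[0] - x) / math.hypot(dst[0] - x, dst[1])
print(n1 * sin1, n2 * sin2)   # Snell's law: these agree
```

Nothing about refraction angles is put in by hand; n1·sin θ1 = n2·sin θ2 simply falls out of the minimisation.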

As for neutrino "oscillations", there are subtle points here. It seems to be the case that a source of neutrinos close by will give a certain mix of detectable neutrino types, compared to positions further away where the mixture is different. By mix I mean the sampled counts of neutrino types expressed as proportions of the total detected.

One explanation of neutrino mixing is framed in QM terms as there being three underlying states of the neutrino wave function. These underlying states do not individually equate to the observed types, and themselves cannot be observed alone. However one certain mix of these states yields a 'pure' electron neutrino function so that all detections will be 100% electron neutrinos, and another mix will lead to all detections being 100% muon neutrinos etc. So the proportions of observed neutrinos with distance from the source is now explained as differentials in the propagation of those unobserved base states which when summed yield varying amounts of each electron/muon/tau neutrino count. Whew! :-)
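The two-flavour version of that mixing boils down to a one-line survival probability. A sketch with illustrative, solar-like parameter values (vacuum oscillation only, ignoring the matter effects that matter inside the sun):

```python
import math

def survival_probability(L_km, E_GeV, theta=0.59, dm2_eV2=7.5e-5):
    """Two-flavour nu_e survival probability in vacuum.

    theta and dm2 are illustrative solar-like mixing parameters; the
    1.267 factor absorbs the unit conversions for L in km, E in GeV.
    """
    return 1.0 - (math.sin(2 * theta) ** 2 *
                  math.sin(1.267 * dm2_eV2 * L_km / E_GeV) ** 2)

# At the source every detection is an electron neutrino...
print(survival_probability(0.0, 0.001))
# ...further away, the detected mix has shifted toward other flavours.
print(survival_probability(500.0, 0.001))
```

The detected proportions depend on the distance-to-energy ratio L/E, which is exactly the distance-dependent mix described above.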

I'll come back later to the other points....

Cheers, Mike.

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

Quote:

If you simulate orbits using the light retarded positions for gravitational attraction for the orbiting bodies in for example the solar system, the solar system soon flies apart. The effect is even more pronounced for simulating a galaxy. To make it all 'work' as we observe, you must assume instantaneous positions for all the bodies for the action of gravitational attraction. (There is more to the story ;-) )

Hmm ... my first line of reasoning was, considering that we observe most portions of the universe as they were millions to billions of years ago, then to use position values for where everything is presently (I guessed) means there would be perturbations observed from objects that don't appear to be there (yet). It's easy to become confused that way ... I'm not sure about the simulations you mentioned (or others) with regard to calculations and coding, but here's a good one (tho' it's only a little over a minute long): Galaxy Collision: Simulations vs Observations. It sure would be nice to know what was used, Newtonian mechanics or Einstein's general relativity. It's hard to believe that a simulation properly coded with the latter would fail to work.

I got thoroughly confused on my second line of reasoning, being if the force of gravity is instantaneous, how could you tell otherwise from observation? I think you'd have to test general relativity in as many different ways as possible ... here's a quote from Feynman's The Character of Physical Law (p.26):

Quote:

To finish about the theory of gravitation, I must say two more things. One is that Einstein had to modify the Laws of Gravitation in accordance with his principles of relativity. The first of the principles was that 'x' cannot occur instantaneously, while Newton's theory said that the force was instantaneous. He had to modify Newton's laws. They have very small effects, these modifications. One of them is that all masses fall, light has energy and energy is equivalent to mass. So light falls and it means that light going near the sun is deflected; it is. Also the force of gravitation is slightly modified in Einstein's theory, so that the law has changed very very slightly, and it is just the right amount to account for the slight discrepancy that was found in the movement of Mercury.

Hi, Mike, definitely looking forward to what more you have to say, especially anything about the uncertainty principle and e-/p+ collisions :)

Quote:

For example, when a photon is in one stable moment of time, we 'see' it as a photon. Whilst transitioning to the next stable moment, we then 'see' it as an electron-positron pair until they both coalesce into the same point in time and space as a photon once more... Or?

We could invoke the Feynman diagrams and include transitory pink elephants!

Fine on the abstractions that give very good statistical predictions. But what might the 'classical' physical mechanisms be? Is causality really a probability distribution where anything can be possible if rare enough?

There's a thing called 'coupling'. It's what determines the general trend of the diagrams, the probabilities especially. If you have a particle with a non-zero electric charge then it will interact with electromagnetic fields, specifically the photons. No charge implies no coupling. At each vertex ( junction of lines ) of a diagram a factor ( coupling constant ) will enter the probability calculation that the diagram represents.

For EM calculations this constant is somewhat less than one. More complicated diagrams ( with more vertices ) have higher powers of this constant multiplying into the result. Progressively higher powers of a number less than one become smaller, and rapidly so if the number is rather less than one. Compare how many powers of 0.95 it takes to get below 0.2 with how many powers of 0.1 it takes to get below one millionth. The point is that the total probability for some result is going to be derived from the sum of the diagrams involved, and this is going to depend on the degree or level of complexity of the diagrams considered.

So if we label diagrams with no vertices as zeroth degree, one vertex as first degree, two vertices as second degree ..... generally n vertices as n-th degree, then the coupling constant comes into play with 0, 1, 2 ..... n powers. As there is no limit to diagrammatic complexity, and the listing of all types is limited only by exhaustion, some fancy-pants math comes along to help: the summing of infinite series, or equivalently, finding the limit ( if any ) of a finite series sum as the number of terms rises without bound. Whether this converges ( gives a finite answer ) depends in part on that coupling constant. Whether in practice your calculation can spit out a useful, testable number to some precision depends on how many diagram degrees ( vertex count ) you include. For EM the convergence is rapid, as the 'fine structure constant' is ~ 1/137. For QCD ( strong force, quarks et al ) it's much larger; while still less than one, the series converges to a reasonable result only after considering the sum total of diagrams of very many degrees. A supercomputer job.
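Both points above — how quickly higher powers of the coupling are suppressed, and how many orders a calculation must keep — can be made concrete in a few lines. This is a toy series only, not a real QED/QCD calculation:

```python
def powers_to_reach(base, threshold):
    """Smallest n with base**n < threshold: how fast repeated factors of a
    coupling-like number suppress higher-order diagrams."""
    n, value = 0, 1.0
    while value >= threshold:
        value *= base
        n += 1
    return n

def terms_for_precision(coupling, eps=1e-6):
    """Terms of the toy series sum(coupling**n, n >= 1) needed before the
    next term drops below eps: a crude stand-in for how many orders of
    diagrams a perturbative calculation must keep."""
    return powers_to_reach(coupling, eps)

print(powers_to_reach(0.95, 0.2))        # weak suppression: many powers
print(powers_to_reach(0.1, 1e-6))        # strong suppression: few powers
print(terms_for_precision(1 / 137))      # QED-like coupling: converges fast
print(terms_for_precision(0.3))          # QCD-like coupling: many more orders
```

A 1/137-sized coupling is negligible after a handful of orders, while a 0.3-sized one needs roughly four times as many, and each extra order multiplies the number of diagrams to evaluate.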

Note that nearly all the diagrams refer to virtual behaviours, with these transient creations and destructions of particles never being detected. What is recorded is event counts after repeated trials, from which a probability distribution of 'final' events is deduced. So you spit out a photon from some source and never see it again until some later detection, but we model it along the way by the diagrammatic/mathematical process above.

I would conceptualise/visualise an electron, say, as being surrounded by a bubbling bushy 'beard' of virtual particles - with the simpler virtual behaviours close in with heavy highlighting and the more complex ones fading away with progressively lighter emphasis.

What prevents us from chucking this whole program of thinking and calculation out as silly/bizarre/counter-intuitive is its astounding success in prediction. There are no competing theories even vaguely near its experimental accuracy when compared head to head. Many would love to do so for string theory, but it has yet to produce a single number to be tested. One key issue for string theory is that the coupling constants used are often close to ( or even greater than ) one.

Cheers, Mike.

( edit ) The anti-intuitive 'horror' of QM bites us all eventually .... :-)

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

Quote:

I got thoroughly confused on my second line of reasoning, being if the force of gravity is instantaneous, how could you tell otherwise from observation? ...

The propagation of sound is slow enough to be rediscovered by schoolchildren's experiments in a large enough playground.

It took a while longer to discover that light propagates at a finite, measurable speed.

And then there is gravity.

For the sun-earth example, we see the sun where it was about 8.5 minutes ago, yet the line of the earth's acceleration goes through where the sun is now. At any moment, where we visually see the sun is at an angle away from the direction of acceleration towards the (instantaneous position of the) sun.
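The size of that angle is just the aberration ratio v/c for the earth's orbital motion. A quick sketch of the number:

```python
import math

# How far the sun's apparent (light-delayed) position lags its instantaneous
# position, as seen from the moving earth: roughly the aberration angle v/c.
V_EARTH = 29.78e3      # m/s, mean orbital speed of the earth
C = 299_792_458.0      # m/s, speed of light

angle_rad = V_EARTH / C
angle_arcsec = math.degrees(angle_rad) * 3600
print(f"{angle_arcsec:.1f} arcseconds")   # ~20 arcsec offset
```

About 20 arcseconds: tiny to the eye, but a force misaligned by even that much, applied every moment for millions of years, would not average away.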

If the action of gravity were not on a line directly between the sun and earth, there would be an off-centre coupling that would cause a gain in angular momentum... And we would all fly away...

Quote:

Hi, Mike, definitely looking forward to what more you have to say, especially anything about the uncertainty principle and e-/p+ collisions :)

Fine on the Feynman diagrams and the probability series.

There's still the puzzle of what might be 'in between'...

Quote:

... not sure about the simulations you mentioned (or others) with regard to calculations and coding, but here's a good one (tho' it's only a little over a minute long): Galaxy Collision: Simulations vs Observations. It sure would be nice to know what was used, Newtonian mechanics or Einstein's general relativity. It's hard to believe that a simulation properly coded with the latter would fail to work.

It would be very interesting to see what assumptions they have made and what was actually coded into the simulation...

I've yet to finish bashing out the trig for my reworking of the sun-earth system...

I would like to guess that the same effects acting there may well be applicable to the electron/photons and Young's Slits example...


Cheers,

Martin

See new freedom: Mageia Linux

Take a look for yourself: Linux Format

The Future is what We all make IT (GPLv3)
