One thought... Has time itself been found to have a quanta (minimum time step)?

Possibly. It's called the Planck time and is 5.391 24(27) x 10^-44 seconds. By some theories it is the smallest possible timestep. If memory serves, it also is the amount of time it takes light to cross a proton.

Following the link gives:

The Planck time is simply the time it takes a beam of light to travel a Planck length. As of 2006, the smallest unit of time that has been directly measured is on the attosecond (10^-18 s) time scale, or around 10^26 Planck times.
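As a sanity check on those numbers, the Planck time can be recomputed from the standard constants (CODATA values assumed here):

```python
import math

G = 6.67430e-11          # gravitational constant (m^3 kg^-1 s^-2)
hbar = 1.054571817e-34   # reduced Planck constant (J s)
c = 2.99792458e8         # speed of light (m/s)

t_planck = math.sqrt(hbar * G / c**5)   # Planck time
l_planck = math.sqrt(hbar * G / c**3)   # Planck length

print(t_planck)          # ~5.39e-44 s
print(l_planck / c)      # light-crossing time of one Planck length: the same number
print(1e-18 / t_planck)  # an attosecond expressed in Planck times: ~2e25
```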

... So what does that mean other than a mathematical contrivance?

Why should the Planck time suggest a "time-step" or granularity in the flow of time?

Can we test for aliasing effects?

Might quantum superposition be a manifestation of quantum time aliasing?...

Does that mean that we suffer quantum causality?...

A proton is of the order of 10^-15 m. Speed of light ~ 3 x 10^8 m/s.

time = distance / velocity = 10^-15 / ( 3 x 10^8 ) ~ 3 x 10^-24 secs
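A quick numerical check of that estimate (order-of-magnitude values only):

```python
d_proton = 1e-15   # m, order-of-magnitude proton size
c = 3e8            # m/s, speed of light

t_cross = d_proton / c
print(t_cross)     # ~3.3e-24 s, some twenty orders of magnitude above the Planck time
```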

There is no experimental evidence yet of quantization in time, distance or direction ( anisotropy ). Not even indirect suggestions. I forget the lower limits probed on these, but they are far above the Planck dimensions - which, as said, are defined in theory. It's only the Planck constant ( the 'h' in E = hν, or the ħ in p = ħk, where E = energy, ν = frequency, p = momentum, k = wave number/vector ) that has an experimental determination. Strictly speaking, even the other Planck variables don't imply granularity of these quantities per se, but that is the presumed lower limit of experimental definition/accuracy that spacetime imposes. Subtle point perhaps, but the calculations at that level still assume continuous variables, whereas talk of granularity implies discrete arithmetic.

The "Correspondence Principle" states that QM calculations should merge with classical predictions as h -> 0. Or, alternatively, the same convergence is stated as one proceeds to progressively higher quantum numbers, which is our human scale. So the energy difference between states of quantum number n = 1000000 and n = 1000001, say, is going to challenge any experimental setup to see as distinct. The energy levels of the single electron going about a single proton ( hydrogen ) go like n^-2, from -13.6 eV up toward 0, above which it escapes to the ionization continuum. Meaning you can get arbitrarily close to ionization by a sufficiently high value of n ( and it's inverse quadratic ).
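To put numbers on that, using E_n = -13.6 / n^2 eV for hydrogen from above, the spacing between adjacent levels collapses at large n (a minimal sketch):

```python
def E(n):
    """Hydrogen energy level in eV: E_n = -13.6 / n^2."""
    return -13.6 / n**2

print(E(2) - E(1))              # 10.2 eV between the first two levels: easy to resolve
print(E(1000001) - E(1000000))  # ~2.7e-17 eV: hopeless to see as distinct
print(E(1000000))               # ~-1.4e-11 eV: arbitrarily close to the ionization limit
```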

If there was really this ultra-low-level stepped-lattice type arrangement then a similar view would apply. So whatever prediction based upon granularity ought to merge with higher-scale theories as Planck distance -> 0, say. Or Planck time -> 0 or whatever you were looking at.

What is this aliasing idea? I'm actually Neo but I look like Morpheus? :-)

Cheers, Mike.

I have made this letter longer than usual because I lack the time to make it shorter. (Blaise Pascal)

[rhetorically]

Why does this thread always take at least two cups of coffee even though the words and numbers may be few?!...

[/rhetorically]

Quote:

A proton is of the order of 10^-15 m. Speed of light ~ 3 x 10^8 m/s.

time = distance / velocity = 10^-15 / ( 3 x 10^8 ) ~ 3 x 10^-24 secs

There is no experimental evidence yet of quantization in time, distance or direction ( anisotropy ). Not even indirect suggestions. I forget the lower limits probed on these, but they are far above the Planck dimensions...

Strictly speaking, even the other Planck variables don't imply granularity of these quantities per se, but that is the presumed lower limit of experimental definition/accuracy that spacetime imposes. Subtle point perhaps, but the calculations at that level still assume continuous variables, whereas talk of granularity implies discrete arithmetic.

The subtlety is whether we see discrete effects or not merely due to the limits of our observations and theories (or even as a consequence of the mathematical tools used), or whether our universe is physically continuous or discrete, and for which dimensions and scales.

Is QM nothing more than a 'nice' mathematical statistical abstraction that nicely (and numerically accurately) describes ensembles of results but says nothing of the underlying mechanisms?

Is there a die being rolled or a 'God-clock' that 'ticks', or is there a smooth continuous shift of state throughout time and space?

Or is the state of our universe a series of points of stability that loiter for a moment before coalescing into the next (discrete?) moment?

Can that be experimentally tested in the first place being as we are ourselves and our experiment a part of that state that is under test?

Quote:

The "Correspondence Principle" states that QM calculations should merge with classical predictions as h -> 0. ... difference between states of quantum number n = 1000000 and n = 1000001, say, is going to challenge any experimental setup to see as distinct...

If there was really this ultra-low-level stepped-lattice type arrangement then a similar view would apply. So whatever prediction based upon granularity ought to merge with higher-scale theories as Planck distance -> 0, say. Or Planck time -> 0 or whatever you were looking at.

Indeed so, which then raises the question of how to see some sort of "interference effect" with which to probe any such granularity whilst working from a much higher physical scale.

Perhaps we already see some of the effects with the wave-particle duality and interference effects (Young's slits) seen even when there should (on average) be only one 'particle' in the experiment at any one time. Are we seeing "jiggles" on a discrete lattice, a little like balls bouncing through a bagatelle?

(Note, not an analogy with the very different and larger scale Brownian motion.)

Aside:

Or do wave packets extend their influence far wider than imagined, out to many wavelengths, and the 'interference patterns' 'seen' on whatever target detector are merely an artifact of effects that independently determine the path taken for each photon individually? (No 'duality' needed physically. The 'duality' is all in our own heads only!)

Or...?

Quote:

What is this aliasing idea? I'm actually Neo but I look like Morpheus? :-)

Could be... There is always the Agent(s) Smith trick...

If we were ourselves part of a digital computer hosted simulation, would we be able to determine the simulation's discrete timestep? What if the hosting was on an analog computer?

The aliasing I'm describing is analogous to that seen when you digitally sample a continuous waveform at too low a sample rate. Go below the Nyquist (-Shannon) rate and you see various 'manifestations' rather than what is originally 'there'.
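That sampling analogy can be made concrete: a tone above half the sample rate produces exactly the same samples as a lower-frequency alias (the frequencies here are arbitrary illustrative choices):

```python
import numpy as np

fs = 10.0                 # sample rate (Hz): Nyquist limit is 5 Hz
t = np.arange(32) / fs    # 32 sample instants

f_true = 7.0              # tone above the Nyquist rate
f_alias = 3.0             # where it 'manifests' after sampling: |7 - 10| = 3

x_true = np.cos(2 * np.pi * f_true * t)
x_alias = np.cos(2 * np.pi * f_alias * t)

# Sample for sample, the 7 Hz tone is indistinguishable from a 3 Hz one.
print(np.allclose(x_true, x_alias))   # True
```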

Hence, might such effects in physical experiments be 'seen' as superposition and duality? Might 'entanglement' be a phase-shifted or phase-aligned aliasing effect 'seen' between synchronized states?...

Tempting to think so, but when performing Young's double-slit experiment, the interference pattern (from which is inferred the non-locality, superposition and/or duality) is obtained at the detectors only with a sampling rate of exactly zero at the slits. Arrg!

I was thinking the relative distance in the direction of propagation for anything moving at the speed of light is zero - there's the non-locality (how fast does a wavefunction propagate?) - and what about the 'pilot wave' (or Huygens' principle?) for the observed interference effects (from Bohm's construction of a hidden-variables theory)? I dunno ... :)

There's lots of reading there, and also on the refresher for Young's Slits. An interesting aspect is that you get hyperbolic fringes from using pin-holes instead.

Are we just not suffering a silly 'hangup' about waves, scale, and particles? A small enough lump of energy will 'look like' a hard edged particle until you look at a high enough magnification to then see a (Gaussian density distribution?) fuzzy cloud...

Perhaps electrons and photons always have an extended field around them that will couple with the two slits simultaneously for each electron/photon? You then get a refraction effect depending on the phase of oscillation to get through both slits simultaneously. Hence, a proportion of energy worms through each of both slits for each electron/photon (hence 'interfering' with itself via both slits).

So...

Must the electrons be 'coherent' for the Young's slit interference to be seen?

Will the slit interference occur regardless of incident beam width (for example, for very broad beams and widely separated slits)?

I wonder if there is a parallel with the way gravitational fields operate...? (And yes, I'm still musing on that one also!)

To 'observe' an energy packet negotiating the slits requires some sort of interaction with the process at the slits. Otherwise, you can't have any 'detection' for your observation.

Which leads onto the question:

Could the Young's Slits experiment be done under conditions whereby you can indirectly detect the processes at the slits? For example, use high speed 'particles' in a low speed medium and observe the pattern of Cherenkov radiation? Or even take advantage of the Smith-Purcell Effect?

Anyone got an optical bench to hand to have a play??!

Does the interference pattern change if the three metal plates forming the Young's Slits are charged to different electric potentials? (For electrons or photons?)

Use SQUIDs at the slits/plates?...

And now a really nasty question...

Can you (re)create a coherent laser beam by reflecting a Young's Slits interference pattern onto an identical second Young's Slits?

Or indeed by introducing a convex lens to focus back onto an identical second Young's Slits?

I think there is more to this story yet...

And then there are all the "near field" tricks whereby you can resolve to less than the wavelength of the light being used... SNOM ( scanning near-field optical microscopy ) anyone?

Despite the Young's Slits digression, there is still:

Quote:

[...]

Quote:

What is this aliasing idea? I'm actually Neo but I look like Morpheus? :-)

Could be... There is always the Agent(s) Smith trick...

If we were ourselves part of a digital computer hosted simulation, would we be able to determine the simulation's discrete timestep? What if the hosting was on an analog computer?

The aliasing I'm describing is analogous to that seen when you digitally sample a continuous waveform at too low a sample rate. Go below the Nyquist (-Shannon) rate and you see various 'manifestations' rather than what is originally 'there'.

Hence, might such effects in physical experiments be 'seen' as superposition and duality? Might 'entanglement' be a phase-shifted or phase-aligned aliasing effect 'seen' between synchronized states?...

I sort of thought it was meant in the sense of ( anti- ) aliasing for graphics display, where you try to fit a continuous curve to a discrete grid. I recall a long time ago doing assembler code/algorithms for fast curve drawing, and how the key nexus was deciding which direction to jump to the next pixel. If one continually wipes & re-draws as the endpoints are progressively translated across the 2D canvas, then the curve gets this curious 'wiggly' quality as it wafts across the hidden 'boundaries' in the algorithm. So rounding issues arise: for instance, if exactly midway, do you round up or down, etc ..... glitches in the Matrix eh? :-)
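That 'which pixel next' decision is the heart of midpoint/Bresenham-style line drawing; a minimal sketch of the integer error-term version:

```python
def bresenham(x0, y0, x1, y1):
    """Integer midpoint line drawing: at each step an error term decides
    whether to step in y as well as x (the 'which pixel next' choice)."""
    points = []
    dx, dy = abs(x1 - x0), abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx - dy
    while True:
        points.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 > -dy:      # midway comparison: step horizontally
            err -= dy
            x0 += sx
        if e2 < dx:       # midway comparison: step vertically
            err += dx
            y0 += sy
    return points

print(bresenham(0, 0, 5, 2))   # the staircase of pixels approximating the line
```

The exactly-midway rounding choice lives in whether those comparisons are strict or not, which is precisely where the 'wiggly' redraw artefacts come from.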

For the wave/particle duality the hangup isn't particularly about the margins of the 'entity'. It's that certain circumstances can really confuse one about extents. You can have the distal screen of a Young's double-slit arrangement replaced by a photo-multiplier ( PM ), say, with some narrow 'throat'. You expect, a priori, that when it clicks off ( slow/low intensity, ~ one photon in the gadget at a time ) then that instance, which we interpret as a single photon reception, more or less confines our thinking to the photon at detection having dimensions of the order of that throat. So you waltz the PM tube up and down the target plane and accumulate counts, all the time only getting single or no clicks. All clicks the same. Now you can compare counts with one or other slit covered - the separation between them now many times the PM 'width' - yielding significant differences in the pattern. So one assumes that the photon prior to detection seems to have dimensions of the order of the slit separation. To really blow you over, some parts of the pattern experience an increase in flux/counts when one of the slits is closed! What The What? :-)

You can, as alluded to, jazz up the circumstances by putting in 'path detectors' of some sort at each slit and try to map the movements. These generally won't catch all passers-by, and interestingly one can then divide the statistics into two groups. The first are those where a target-plane PM is activated without anything at a slit - this subgroup will show a typical 'wave' interference pattern. The second is where a target-plane event coincides ( very nearly ) with one of the slit events. Overall these will exhibit a smeared pattern without interference effects. If you then divide this second group into coincidences with one slit vs the other, they show identical but displaced patterns which, when added, give that overall smearing. That is, discrete independent counts - as opposed to the first ( interfering ) case, for which you clearly need some 'phasing'-related cross term, with possible subtraction, to get the counts to relate.

If you try to 'soften' the path detectors' bumping effects, then you can, and even return to an interference type pattern. But you can only do that in such a way, say by increasing some wavelength, with the effect that you lose sufficient resolution to determine unequivocally which slit was passed through.

So the upshot is that you can't get path information in those circumstances where interference occurs. You only ever receive discrete lumps! The duality is not wave or particle - it's really neither ..... :-) [ We associate paths with particles, and interference with waves. ]

There's no requirement for coherence of either photons or electrons except that they enter with pretty close energies/momenta/wavelengths so that interference effects are not overlapped/blurred ( 'monochromatic' ). There is no requirement for any phase relationship between successive or concurrent particles in the slit setup. That's the point - each particle seems to interfere with itself. This has been formalised as some sort of 'amplitude', or 'matter wave' etc, which takes all the paths and recombines with phase information at the point of detection ( the Feynman path/history integral* ).

Note all those photons passing through, some with target/slit coincidence and some without, were identically prepared. Some perchance got slit/path detection, some didn't. While travelling from the source/entry to the slit plane there was no difference that could be known/used to decide which group any given particle would be in. There is, with proper alignment, no difference in the expectation of which slit/path it would be detected at ( if at all ).

Cheers, Mike.

( edit ) * - the integration is over all possible/hypothetical spacetime paths, envisaged as a 'simultaneous' transit of some undetectable complex ( number field ) wave function. You add the contributions to get a complex number, the absolute value of which ( with a normalised denominator over the whole set of outcomes ) gives the probability of a particular outcome of interest. That addition effectively computes/compares the phase delays in the problem and hence results in enhanced or diminished values ( interference ). The absolute value of a complex number can be computed by multiplying it ( using complex number arithmetic rules ) by its complex conjugate ( same real part, flip sign of imaginary part ). So that mathematically embodies the problem. If you path detect at some point ( near some slit ) then you, by definition, confine/reduce your path/history set to combine/integrate with. The total now excludes components with ( radically ) different phases, from the other slit, that would have appreciably subtracted. So no 'destructive interference' statistics, and ~ classical behaviour/predictions result.
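A toy version of that footnote's arithmetic, for just two paths (the geometry numbers are made up purely for illustration):

```python
import numpy as np

k = 2 * np.pi / 500e-9     # wave number for 500 nm light
d1, d2 = 1.0, 1.0 + 50e-9  # path lengths via each slit to one detector point (m)

# Both slits open: add the (unit) complex amplitudes, then take |sum|^2.
both_open = abs(np.exp(1j * k * d1) + np.exp(1j * k * d2)) ** 2

# Path detection at a slit excludes the other history from the sum, so the
# probabilities add instead of the amplitudes - the phase cross term vanishes:
path_known = abs(np.exp(1j * k * d1)) ** 2 + abs(np.exp(1j * k * d2)) ** 2

print(both_open)    # 2 + 2*cos(k*(d2 - d1)) ~ 3.62: interference cross term present
print(path_known)   # 2.0: no interference
```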

I have made this letter longer than usual because I lack the time to make it shorter. (Blaise Pascal)

Take neutrons from a given source, say a reactor. Various energies/momenta/wavelengths. Have them head towards a lattice of some ( non-absorbing ) material that has a path/channel through it with a suitably different lattice spacing to the surrounding volume. If you make that pathway long enough then short-wavelength ( as per de Broglie's formula ) neutrons will not emerge from the other end of that path. The long wavelength ones will.

It is as if the neutrons only 'see' gaps/slits/arrays with spacing of their own wavelength or larger. They will ignore any grating gap smaller than their own wavelength; so for a given lattice/step distance, neutrons with a wavelength shorter than that will be diffracted/reflected etc, and those with a longer wavelength won't. And yet from the distal end you can only ever detect single/whole neutrons!

Classically, energetic neutrons ought to be more-or-less as successful in transiting that channel as the sluggish ones ...... either way they'd bounce/diffuse through like balls in a pinball machine bouncing off the bumpers/pegs. However it actually acts like a low-pass ( frequency ) filter: more energy gives less penetration! :-)
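The de Broglie relation λ = h/p behind that behaviour, with standard constants (the two example energies are just illustrative):

```python
import math

h = 6.62607015e-34        # Planck constant (J s)
m_n = 1.67492749804e-27   # neutron mass (kg)
eV = 1.602176634e-19      # joules per eV

def neutron_wavelength(E_eV):
    """De Broglie wavelength (m) of a non-relativistic neutron of kinetic energy E_eV."""
    p = math.sqrt(2 * m_n * E_eV * eV)   # momentum from E = p^2 / 2m
    return h / p

print(neutron_wavelength(0.025))  # thermal neutron: ~1.8e-10 m, about a lattice spacing
print(neutron_wavelength(1.0))    # faster neutron: ~2.9e-11 m - shorter wavelength,
                                  # so this is the one scattered out of the channel
```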

Cheers, Mike.

I have made this letter longer than usual because I lack the time to make it shorter. (Blaise Pascal)

There's still a few red-hot chillies in there to chew through yet!...

Stepping back to the slits: For the photons example of self-interference, that suggests that the requirement is not for coherent light but just for monochromatic light. Is that the case?

At what point does the interference fail as the two slits are placed further apart?

Quote:

Take neutrons from a given source, say a reactor. Various energies/momenta/wavelengths. Have them head towards a lattice of some ( non-absorbing ) material that has a path/channel through it with a suitably different lattice spacing to the surrounding volume. If you make that pathway long enough then short-wavelength ( as per de Broglie's formula ) neutrons will not emerge from the other end of that path. The long wavelength ones will...

I would expect a similar effect with the Young's slits as you make the material for the slits thicker (greater depth for the slits).

An interesting question is:

Do you get a constant for "separation + depth" for the frequency cut off?...

## RE: RE: One thought...

)

Following the link gives:

The Planck time is simply the time it takes a beam of light to travel a Planck length. As of 2006, the smallest unit of time that has been directly measured is on the attosecond (10^âˆ’18 s) time scale, or around 10^26 Planck times.

... So what does that mean other than a mathematical contrivance?

Why should the Planck time suggest a "time-step" or granularity in the flow of time?

Can we test for aliasing effects?

Might quantum superposition be a manifestation of quantum time aliasing?...

Does that mean that we suffer quantum causality?...

That's certainly good pause-for-thought!

Regards,

Martin

See new freedom: Mageia Linux

Take a look for yourself: Linux Format

The Future is what We all make IT (GPLv3)

## A proton is of the order of

)

A proton is of the order of 10^[-15]m. Speed of light ~ 3 x 10^8 m/s.

time = distance / velocity = 10^[-15] / 3 x 10^8 ~ 3 x 10^[-23] secs

There is no experimental evidence yet of quantization in time, distance or direction ( anisotropy ). Not even indirect suggestions. I forget the lower limits probed on these but they are far and above the Planck dimensions - which as said are defined in theory. It's only the Planck constant ( the 'h' in E = hv or p = hk, where E = energy, v = frequency, p = momentum, k = wave number/vector ) that has an experimental determination. Strictly speaking even the other Planck variables don't imply granularity per se of these quantities, but that is the presumed lower limit of experimental definition/accuracy that spacetime imposes. Subtle point perhaps, but the calculations at that level still assume continuous variables, whereas talk of granularity implies discrete arithmetic.

The "Correspondence Principle" states that QM calculations should merge with classical predictions as h -> 0. Or alternatively the same convergence stated as one proceeds to progressively higher quantum numbers, which is our human scale. So the energy difference between states of quantum number n = 1000000 to n = 1000001, say, are going to challenge any experimental setup to see as distinct. The energy levels of the single electron going about a single proton ( hydrogen ) go like n^[-2], from -13.6 eV up toward 0 above which it escapes to the ionization continuum. Meaning you can get arbitrarily close to ionization by a sufficiently high value of n ( and it's inverse quadratic ).

If there was really this ultra low level stepped lattice type arrangement then a similiar view would apply. So whatever prediction based upon granularity ought merge with higher scale theories as Planck distance -> 0, say. Or Planck time -> 0 or whatever you were looking at.

What is this aliasing idea? I'm actually Neo but I look like Morpheus? :-)

Cheers, Mike.

I have made this letter longer than usual because I lack the time to make it shorter. Blaise Pascal

## [rhetorically] Why does

)

[rhetorically]

Why does this thread always take at least two cups of coffee even thought the words and numbers may be few?!...

[/rhetorically]

The subtlety is whether we see discrete effects or not due to the limits of our observations and theories (or even whether as a consequence of the mathematical tools used), or whether our universe physically is continuous or discrete for whatever dimensions and scales.

Is QM nothing more than a 'nice' mathematical statistical abstraction that nicely (and numerically accurately) describes ensembles of results but says nothing of the underlaying mechanisms?

Is there a dice being rolled or a 'God-clock' that 'ticks', or is there a smooth continuous shift of state throughout time and space?

Or is the state of our universe a series of points of stability that loiter for a moment before coalescing into the next (discrete?) moment?

Can that be experimentally tested in the first place being as we are ourselves and our experiment a part of that state that is under test?

Indeed so, which comes then to how to see some sort of "interference effect" to then hope to probe any such granularity whilst working from a much higher physical scale.

Perhaps we already see some of the effects with the wave-particle duality and interference effects (Young's slits) seen even when there should only (on average) be only one 'particle' in the experiment at any one time. Are we seeing "jiggles" on a discrete lattice, a little like balls bouncing through a bagatelle?

(Note, not an analogy with the very different and larger scale Brownian motion.)

Aside:

Or do wave packets extend their influence far wider than imagined, out to many wavelengths, and the 'interference patterns' 'seen' on whatever target detector are merely an artifact of effects that independently determine the path taken for each photon individually? (No 'duality' needed physically. The 'duality' is all in our own heads only!)

Or...?

Could be... There is always the Agent(s) Smith trick...

If we were ourselves part of a digital computer hosted simulation, would we be able to determine the simulation's discrete timestep? What if the hosting was on an analog computer?

The aliasing I'm describing is analogous to that seen when you digitally sample a continuous waveform at too low a sample rate. Go below the Nyquist (-Shannon) rate and you see various 'manifestations' rather than what is originally 'there'.

Hence, might such effects in physical experiments be 'seen' as superposition and duality? Might 'entanglement' be a phase-shifted or phase-aligned aliasing effect 'seen' between synchronized states?...

Regards,

Martin

See new freedom: Mageia Linux

Take a look for yourself: Linux Format

The Future is what We all make IT (GPLv3)

## RE: Hence, might such

)

Tempting to think so, but when performing Young's double-slit experiment, the interference pattern (from which is inferred the non-locality, superposition and/or duality) is obtained at the detectors only with a sampling rate of exactly zero at the slits. Arrg!

I was thinking the relative distance in the direction of propagation for anything moving at the speed of light is zero â€“ there's the non-locality (how fast does a wavefunction propagate?), and what about the 'pilot wave' (or Huygens principle?) for the observed interference effects? (from Bohm's construction of a hidden variables theory) ? I dunno ... :)

## RE: ... when performing

)

There's lot's of reading there and also on the refresher for Young's Slits. An interesting aspect is that you get hyperbolic fringes from using pin-holes instead.

Are we just not suffering a silly 'hangup' about waves, scale, and particles? A small enough lump of energy will 'look like' a hard edged particle until you look at a high enough magnification to then see a (Gaussian density distribution?) fuzzy cloud...

Perhaps electrons and photons always have an extended field around them that will couple with the two slits simultaneously for each electron/photon? You then get a refraction effect depending on the phase of oscillation to get through both slits simultaneously. Hence, a proportion of energy worms through each of both slits for each electron/photon (hence 'interfering' with itself via both slits).

So...

Must the electrons be 'coherent' for the Young's slit interference to be seen?

Will the slit interference occur regardless of incident beam width (for example, for very broad beams and widely separated slits)?

I wonder if there is a parallel with the way gravitational fields operate...? (And yes, I'm still musing on that one also!)

Cheers,

Martin

See new freedom: Mageia Linux

Take a look for yourself: Linux Format

The Future is what We all make IT (GPLv3)

## RE: ... but when performing

)

To 'observe' an energy packet negotiating the slits requires some sort of interaction with the process at the slits. Otherwise, you can't have any 'detection' for your observation.

Which leads onto the question:

Could the Young's Slits experiment be done under conditions whereby you can indirectly detect the processes at the slits? For example, use high speed 'particles' in a low speed medium and observe the pattern of Cherenkov radiation? Or even take advantage of the Smith-Purcell Effect?

Anyone got an optical bench to hand to have a play??!

Does the interference pattern change if the three metal plates forming the Young's Slits are charged to different electric potentials? (For electrons or photons?)

Use SQUIDs at the slits/plates?...

And now a really nasty question...

Can you (re)create a coherent laser beam by reflecting a Young's Slits interference pattern onto an identical second Young's Slits?

Or indeed by introducing a convex lens to focus back onto an identical second Young's Slits?

I think there is more to this story yet...

And then there is all the "near field" tricks whereby you can resolve to less than the wavelength of the light being used... SNOM anyone?

Regards,

Martin

SNOM: Scanning Near-Field Optical Microscope

Take a look for yourself: Linux Format

The Future is what We all make IT (GPLv3)

## Despite the Young's Slits

)

Despite the Young's Slits digression, there is still:

Regards,

Martin

Take a look for yourself: Linux Format

The Future is what We all make IT (GPLv3)

## Ah yes, 'tis a deep area this

)

Ah yes, 'tis a deep area this quantum pit! :-)

I sort of thought it was meant in the sense of ( anti- ) aliasing for graphics display, where you try to fit a continuous curve to a discrete grid. I recall a long time ago doing assembler code/algorithms for fast curve drawing, and how the key nexus was how to decide which direction to jump to the next pixel. If one continually wipes & re-draws as the endpoints are progressively translated across the 2D canvas, then the curve gets this curious 'wiggly' quality as it wafts across the hidden 'boundaries' in the algorithm. So rounding issues are declared for instance, if exactly midway then do you round up or down, etc ..... glitches in the Matrix eh? :-)

For the wave/particle duality the hangup isn't particularly about the margins of the 'entity'. It's that certain circumstances can really confuse about extents. You can have the distal screen of a Young's double slit arrangement replaced by a photo-multiplier ( PM ), say, with some narrow 'throat'. You expect, a priori, that when it clicks off ( slow/low intensity, ~ one photon in the gadget at a time ) then that instance which we interpret as a single photon reception more or less confines our thinking as the photon at detection having dimensions of the order of that throat. So you waltz the PM tube up and down the target plane and accumulate counts, all the time only getting single or no clicks. All clicks the same. Now you can compare counts with one or other slit covered - the separation between them now many times the PM 'width' - yielding significant differences in the pattern. So one assumes that the photon prior to detection seems to have dimensions of the order of the slit separation. To really blow you over, some parts of the pattern experience an increase in flux/counts when one of the slits is closed! What The What? :-)

You can, as alluded to, jazz up the circumstances by putting in 'path detectors' of some sort at each slit and try to map the movements. These generally won't catch all passer's by, and interestingly one can then divide the statistics into two groups. The first are those where a target plane PM is activated without anything at a slit - this subgroup will show a typical 'wave' interference pattern. The second is where a target plane event coincides ( very nearly ) with one of the slit events. Overall these will exhibit a smeared pattern without interference effects. If you then divide this second group into those coincidences with one slit vs the other they show identical, but displaced patterns which when added give that overall smearing. That is discrete independent counts, as opposed to the first ( interfering ) case which you clearly need some 'phasing' related cross term, with possible subtraction, to get the counts to relate.

If you try to 'soften' the path detectors' bumping effects, then you can, and even return to an interference type pattern. But you can only do that in such a way, say by increasing some wavelength, with the effect that you lose sufficient resolution to determine unequivocally which slit was passed through.

So the upshot is that you can't get path information in those circumstances where interference occurs. You only ever receive discrete lumps! The duality is not wave or particle - it's really neither ..... :-) [ We associate paths with particles, and interference with waves. ]

There's no requirement for coherence of either photons or electrons except that they enter with pretty close energies/momenta/wavelengths so that interference effects are not overlapped/blurred ( 'monochromatic' ). There is no requirement for any phase relationship between successive or concurrent particles in the slit setup. That's the point - each particle seems to interfere with itself. This has been formalised as some sort of 'amplitude', or 'matter wave' etc, which takes all the paths and recombines with phase information at the point of detection ( the Feynman path/history integral* ).

Note all those photons passing through, some with target/slit coincidence and some without, were identically prepared. Some perchance got slit/path detection, some didn't. While travelling from the source/entry to the slit plane there was no difference that could be known/used to decide which group any given particle would end up in. There is, with proper alignment, no difference in the expectation of which slit/path it would be detected at ( if at all ).

Cheers, Mike.

( edit ) * - the integration is over all possible/hypothetical spacetime paths, envisaged as a 'simultaneous' transit of some undetectable complex ( number field ) wave function. You add the contributions to get a complex number, the squared absolute value of which ( with a normalised denominator over the whole set of outcomes ) gives the probability of a particular outcome of interest. That addition effectively computes/compares the phase delays in the problem and hence results in enhanced or diminished values ( interference ). The squared absolute value of a complex number can be computed by multiplying it ( using complex number arithmetic rules ) by its complex conjugate ( same real part, flip sign of imaginary part ). So that mathematically embodies the problem. If you path detect at some point ( near some slit ) then you, by definition, confine/reduce your path/history set to combine/integrate with. The total now excludes components with ( radically ) different phases, from the other slit, that would have appreciably subtracted. So no 'destructive interference' statistics, and ~ classical behaviour/predictions result.
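
The bookkeeping in that paragraph condenses to a few lines. A toy two-path version ( the phases here are arbitrary inputs, not derived from any actual action ):

```python
import cmath
import math

def detection_probability(phase1, phase2, which_path_known=False):
    # One contribution per history, equal weights, phase from each path:
    a1 = cmath.exp(1j * phase1) / math.sqrt(2)   # path via slit 1
    a2 = cmath.exp(1j * phase2) / math.sqrt(2)   # path via slit 2
    if which_path_known:
        # Detection at a slit restricts the history set: probabilities add.
        return abs(a1) ** 2 + abs(a2) ** 2
    # Otherwise add the complex amplitudes first, THEN square the modulus.
    return abs(a1 + a2) ** 2

print(detection_probability(0.0, math.pi))        # opposite phases: ~ 0 (destructive)
print(detection_probability(0.0, math.pi, True))  # which-path known: no cancellation
```

The values are relative intensities rather than normalised probabilities; the point is that the cross term vanishes the moment the history set is pruned.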

I have made this letter longer than usual because I lack the time to make it shorter. Blaise Pascal

## Just to extend the discussion


Just to extend the discussion mildly .....

Take neutrons from a given source, say a reactor. Various energies/momenta/wavelengths. Have them head towards a lattice of some ( non-absorbing ) material that has a path/channel through it with a suitably different lattice spacing to the surrounding volume. If you make that pathway long enough then short-wavelength ( as per de Broglie's formula ) neutrons will not emerge from the other end of that path. The long wavelength ones will.

It is as if the neutrons only 'see' gaps/slits/arrays with spacing of the order of their wavelength or larger. They will ignore any grating gap smaller than their own wavelength, so for a given lattice/step distance: neutrons with a wavelength shorter than that will be diffracted/reflected etc, and those with a longer wavelength won't be. And yet from the distal end you can only ever detect single/whole neutrons!

Classically, energetic neutrons ought to be more-or-less as successful in transiting that channel as the sluggish ones ...... either way they'd bounce/diffuse through like balls in a pinball machine bouncing off the bumpers/pegs. However it actually acts like a low-pass ( frequency ) filter: more energy gives less penetration! :-)
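
The cutoff can be put in numbers via de Broglie plus Bragg's law: for wavelengths beyond 2d there is no angle satisfying Bragg's condition, so those neutrons can't be scattered out of the channel. A sketch with an assumed lattice spacing of 2 angstrom ( polycrystalline beryllium filters work this way, with a commonly quoted cutoff near 4 angstrom ):

```python
import math

H = 6.62607015e-34       # Planck constant, J s
M_N = 1.67492749804e-27  # neutron mass, kg
EV = 1.602176634e-19     # joules per electron-volt

def de_broglie_wavelength(energy_ev):
    """lambda = h / p = h / sqrt(2 m E), non-relativistic neutron."""
    p = math.sqrt(2 * M_N * energy_ev * EV)
    return H / p

def transmitted(energy_ev, lattice_spacing_m):
    # Bragg's law lambda = 2 d sin(theta) has no solution for lambda > 2d,
    # so such (slow, long-wavelength) neutrons sail through un-scattered.
    return de_broglie_wavelength(energy_ev) > 2 * lattice_spacing_m

d = 2.0e-10  # assumed lattice spacing: 2 angstrom (illustrative)
for e_mev in (1, 5, 25):
    lam = de_broglie_wavelength(e_mev * 1e-3)
    print(f"{e_mev:>3} meV -> {lam * 1e10:.2f} angstrom, "
          f"transmitted: {transmitted(e_mev * 1e-3, d)}")
```

A thermal ( ~25 meV ) neutron has a wavelength of about 1.8 angstrom and gets Bragg-scattered out; a cold 1 meV neutron at ~9 angstrom slips past: the promised low-pass filter.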

Cheers, Mike.


## RE: Just to extend the


There are still a few red-hot chillies in there to chew through yet!...

Stepping back to the slits: the photon example of self-interference suggests that the requirement is not for coherent light but just for monochromatic light. Is that the case?

At what point does the interference fail as the two slits are placed further apart?

I would expect a similar effect with the Young's slits as you make the material for the slits thicker (greater depth for the slits).

An interesting question is:

Do you get a constant for "separation + depth" for the frequency cut off?...

Regards,

Martin

Take a look for yourself: Linux Format

The Future is what We all make IT (GPLv3)