Hmm... would there necessarily have to be more than one? Not that there couldn't be many... I'm wondering how much (energy) would need to be in a single point before stability as a fundamental particle is reached, and what it is about the nature of that stability that a proton should have more mass (energy) than an electron... or doesn't that arise from the nature of integer (or half integer) spin? I recall reading that a particle composed of 5 quarks was theorized, and has subsequently been observed... So can enough energy be in one place at the same time to form an event horizon? It can knot... :) Or is it not proper to think of a 'loop' as some kind of event horizon?

The following I've lifted and adapted from the book Black Holes by Dr. Jean-Pierre Luminet, Chapter Nine 'The Far Horizon', page 125.

Quote:

Table 3: The gravitational parameter ( Rg/R ) of ordinary bodies

Object        Mass               Size R        Schwarzschild radius Rg   Rg/R
Atom          10e-26 kg          10e-8 cm      10e-51 cm                 10e-43
Human         100 kg             1 m           10e-23 cm                 10e-25
Mountain      10e+12 kg          1 km          10e-13 cm                 10e-18
Earth         10e+25 kg          10e+4 km      1 cm                      10e-9
Sun           10e+30 kg = 1 Mo   10e+6 km      1 km                      10e-6
White dwarf   1 Mo               10e+4 km      1 km                      10e-4
Neutron star  1 Mo               10 km         1 km                      10e-1
Galaxy        10e+11 Mo          10e+5 l.y.    10e-2 l.y.                10e-7
Universe      10e+23 Mo          10e+10 l.y.   10e+10 l.y.               1?
(if closed)

Note: The gravitational parameter is the ratio between the Schwarzschild radius ( which depends only on the mass ) and its real size. In other words, it measures the 'compactness' of a body; the closer its parameter is to one, the closer a body is to the black hole state. The numerical values in the table are given to the nearest power of ten. The parameters for the Universe require careful consideration: see Chapter 19.

Here 'l.y.' is light year, Mo is the mass of the Sun, and 'e' indicates an exponent with the sign following. It's a good ready reckoner for a rough estimate of black hole size - the radius of the event horizon, if you like - and how far a given object is from that state. It's based on the Schwarzschild solution in General Relativity, using Rg = 2GM/c^2 ( M is the central mass, G is Newton's gravitational constant, and c is the speed of light ). Thus a human would have to either acquire 10e+25 times more mass ( at the same size ) or shrink to 10e-25 of its size ( with the same mass ), or some combination of such changes, in order to form an event horizon.
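As a quick check on the table's ready reckoner, here is a minimal sketch of Rg = 2GM/c^2 ( constants rounded; the function name is my own ):

```python
# Schwarzschild radius Rg = 2GM/c^2 and the "gravitational parameter" Rg/R
# from the table above. SI units throughout; sizes are rough orders of
# magnitude, as in Luminet's table.
G = 6.674e-11   # Newton's gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def schwarzschild_radius(mass_kg):
    """Radius of the event horizon for a given mass, in metres."""
    return 2 * G * mass_kg / c**2

# A 100 kg human, roughly 1 m in size:
Rg_human = schwarzschild_radius(100)   # ~1.5e-25 m
ratio = Rg_human / 1.0                 # Rg/R ~ 10e-25, matching the table

# The Sun, ~2e30 kg: Rg comes out near 3 km, the table's "1 km" to the
# nearest power of ten.
Rg_sun = schwarzschild_radius(2e30)
```
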
For your question, this assumes that any of this theory actually applies at the scale of interest. General relativity is classical; the Planck length, at about 10e-33 cm, is quoted as the smallest scale above which space-time geometry can be considered smooth. So how does a pack of quarks, or anything else, behave at or below that scale? Where do we stop looking and accept some effective theory above a certain size? Fascinating....... :-)
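For what it's worth, the Planck length figure quoted above can be recovered from the fundamental constants; a small sketch ( rounded constants, variable names my own ):

```python
# Planck length: l_p = sqrt(hbar * G / c^3), the scale below which smooth
# space-time geometry is assumed to break down.
import math

G    = 6.674e-11    # m^3 kg^-1 s^-2
c    = 2.998e8      # m/s
hbar = 1.055e-34    # reduced Planck constant, J s

l_planck = math.sqrt(hbar * G / c**3)   # ~1.6e-35 m, i.e. ~1.6e-33 cm
```
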

(edit) Darn, the table didn't come out well... :-(

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

Quote:

(edit) Darn, the table didn't come out well... :-(

[pre]Table 3: The gravitational parameter ( Rg/R ) of ordinary bodies

Object        Mass               Size R        Schwarzschild radius Rg   Rg/R
Atom          10e-26 kg          10e-8 cm      10e-51 cm                 10e-43
Human         100 kg             1 m           10e-23 cm                 10e-25
Mountain      10e+12 kg          1 km          10e-13 cm                 10e-18
Earth         10e+25 kg          10e+4 km      1 cm                      10e-9
Sun           10e+30 kg = 1 Mo   10e+6 km      1 km                      10e-6
White dwarf   1 Mo               10e+4 km      1 km                      10e-4
Neutron star  1 Mo               10 km         1 km                      10e-1
Galaxy        10e+11 Mo          10e+5 l.y.    10e-2 l.y.                10e-7
Universe      10e+23 Mo          10e+10 l.y.   10e+10 l.y.               1?
(if closed)
[/pre]

Is this any better?

Beautiful and thank you! :-)
Hey, what's your reference for BBCode? What's the 'pre' that you used? I thought of a jpeg image, 'cos I can scan stuff, but you need a constant, public internet URL for others to access it. ( I don't have my own web page to date, though my ISP will give me room for one. )


Just the "Use BBCode" link that appears beside the posting form on any of the BOINC message boards.

Quote:

What's the 'pre' that you used?

"Pre" is short for "preformatted", I believe; the HTML tag is used mainly to preserve the 'white space' characters (word-spaces and carriage-returns) that are normally ignored by browsers, e.g. to show the indents and line-breaks in samples of code. Here, though, I was taking advantage of the fact that "pre" text usually displays in a monospaced font (e.g. Courier), so a given number of characters (including spaces) always comes out with the same line length, which is what you need to make tabular columns line up.

Quote:

I thought of a jpeg image, 'cos I can scan stuff, ...

JPEGs are a poor choice for rendering text, anyway; the GIF format is much better for 'crisp' images that use a limited number of flat colours. The JPEG (Joint Photographic Experts Group) format was designed for compressing continuous-tone photographic images, and it works pretty well on those, especially at low to moderate compression. But its characteristic artifacts can be quite obvious where the content is very high in contrast, as in text, line diagrams, and most logotypes.

Quote:

So how does a pack of quarks, or anything else, behave at or below that scale?

I'm guessing like bits of discombobulated spacetime, with slightly differing asymmetries in n dimensions... So what's the volume of a photon? Because, would the Casimir effect still apply at these scales? (Happy to see the zeta function in physics again...)

Quote:

Where do we stop looking and accept some effective theory above a certain size?

I think where the abstract nature of maths resolves with the abstract nature of, well, nature... :)

Well, I've been reading about photons, phonons, and excitons. I think I should have asked about radiation pressure on quarks, which would be like loops or knots, charged and hence conducting, like ideal metal plates... but the physical size (volume?) of a photon necessarily relates to Planck's constant, and a size larger than quarks, right? As though a smaller particle has to move with a minimum quantized amount of energy or it's as if nothing happened? Is a quark smaller than a photon?

Anyway, when I also saw the mention of mathematical 'regulators' used to solve the problem with infinity regarding calculation of the vacuum energy (summing energies from all possible oscillators at all points in space) in different physical situations, I thought that these 'regulators' will arise from things like geometries of manifolds (if I'm using the terms correctly) at the most fundamental (or abstract?) scales... does this make sense?

Quote:

Well, I've been reading about photons, phonons, and excitons. I think I should have asked about radiation pressure on quarks, which would be like loops or knots, charged and hence conducting, like ideal metal plates... but the physical size (volume?) of a photon necessarily relates to Planck's constant, and a size larger than quarks, right? As though a smaller particle has to move with a minimum quantized amount of energy or it's as if nothing happened? Is a quark smaller than a photon?

Hmmmm ...... well I don't think photons have a measurable or calculable size - by fiat, really. They mediate the EM interactions, or couple if you like, between charged particles. Being bosons, you can jam as many together as you like and they won't mind. Anyhow, at a given energy ( to which frequency is proportional ) and momentum ( to which wavelength is inversely proportional ), linked by E = pc, you can only 'probe' lengths down to roughly a wavelength.
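A sketch of that energy-wavelength link, E = pc = hc/lambda ( rounded constants, names my own ):

```python
# Probe scale of a photon: lambda = h*c/E. Higher energy means shorter
# wavelength, hence the ability to resolve smaller structures.
h = 6.626e-34   # Planck's constant, J s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electronvolt

def photon_wavelength(energy_joules):
    """Wavelength of a photon with the given energy."""
    return h * c / energy_joules

# A 2 eV photon (visible light) has a wavelength around 620 nm:
lam_visible = photon_wavelength(2 * eV)

# To probe proton-sized distances (~1e-15 m) you need GeV-scale photons:
lam_gev = photon_wavelength(1e9 * eV)   # ~1.2e-15 m
```
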

Quote:

Anyway, when I also saw the mention of mathematical 'regulators' used to solve the problem with infinity regarding calculation of the vacuum energy (summing energies from all possible oscillators at all points in space) in different physical situations, I thought that these 'regulators' will arise from things like geometries of manifolds (if I'm using the terms correctly) at the most fundamental (or abstract?) scales... does this make sense?

I haven't really much of a clue here about 'regulators'. :-( But I understand that the vacuum energy is essentially the summation of contributions from virtual particles - by definition we don't detect these directly. I think the model is that a particle has a 'bare' charge for some force ( say EM ) that we don't see unless we are really close, and this is surrounded by a cloud of virtual force carriers ( say photons ). At a large distance you would see, say, the inverse square law of attraction between two ( charged ) particles with some constant of proportionality. However, if you move closer in, you penetrate the virtual particle 'cloud' and feel relatively more of the 'bare' force charge and relatively less of the cloud. Although the force still increases as you approach through this cloud, it increases faster than an extrapolation of the behaviour at a distance would suggest. The 'bare' charge becomes 'unmasked'. Pretty well all force modes ( EM, weak, strong, gravity ) are hypothesized to operate in this manner, with this type of model. Each force has its own carrier ( photon, W and Z vector bosons, gluons, gravitons ), and if a particle has the relevant charge ( electric, weak force charge, strong force charge, mass ) for whichever force, then this model applies.
So to get close and 'feel' out the virtual cloud requires progressively higher energies - higher frequencies and shorter wavelengths for the carriers we probe with. This is the origin of the idea that at high enough energies the relative strengths of the various forces approach each other, as their respective virtual clouds are penetrated. So if I was really, really, really ....... really, really close to an electron, then the strengths of all the forces it is subject to ( for which it possesses a 'charge' ) would be much the same! Imagine that! Gravity not being 10 to the minus 40 something times as pissweak as electromagnetism! So very, very, very, very ..... very, very early in the universe, when all was really ( to some power ) hot and energetic, the forces were pretty much equivalent. As the Big Bang cooled, that force dependence on distance came into play as particles got out of each other's virtual carrier clouds. At certain temperatures on the way down, the 'symmetry' between the forces breaks - meaning the force relationships with distance stop being equivalent. A long time later ( now ) we have this 'hierarchy problem' to explain: why are the known forces so radically different in strength?
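To put a rough number on this 'unmasking', here is a sketch of the standard one-loop QED running coupling. It keeps only the electron loop, so it understates the measured effect ( the observed value at Q ~ 91 GeV is about 1/128 ); constants rounded, names my own:

```python
# One-loop QED running of the fine-structure constant: the effective
# coupling grows with probe energy Q as the virtual cloud is penetrated.
import math

alpha0 = 1 / 137.036    # fine-structure constant at low energy
m_e    = 0.000511       # electron mass, GeV

def alpha_eff(Q_GeV):
    """Effective coupling at momentum transfer Q (valid for Q >> m_e)."""
    return alpha0 / (1 - (alpha0 / (3 * math.pi)) * math.log(Q_GeV**2 / m_e**2))

# At the Z-boson mass (91.2 GeV) the electron loop alone gives ~1/134,
# already noticeably stronger than the long-distance 1/137:
a_z = alpha_eff(91.2)
```
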
Now I suppose one can have virtual particles about, popping in and out of existence in pairs, courtesy of the Uncertainty Principle. Basically, if a pair of a given energy doesn't exist for too long ( less time for more energy ), then you'd never be able to measure them. Thanks Werner! The Casimir effect 'works' because in the space between the conducting plates ( bigger is better ), near the centre, a given separation excludes a number of the energy modes of virtual particles: the wavelengths won't fit. But those modes are not excluded outside the plates, and hence either plate feels an asymmetry in the pressure of those particles - the plates attract. It's not a big effect, but it has been measured.
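For the record, the textbook result for ideal parallel plates is an attractive pressure P = pi^2 * hbar * c / ( 240 * d^4 ), which also shows the steep growth as the gap closes; a small sketch ( rounded constants, names my own ):

```python
# Casimir pressure between ideal conducting plates separated by d.
import math

hbar = 1.055e-34   # reduced Planck constant, J s
c    = 2.998e8     # speed of light, m/s

def casimir_pressure(d_metres):
    """Attractive pressure (Pa) between ideal plates a distance d apart."""
    return math.pi**2 * hbar * c / (240 * d_metres**4)

# At a 1 micron gap the pressure is tiny, about 1.3 mPa:
p1 = casimir_pressure(1e-6)

# Halving the gap multiplies the pressure by 2^4 = 16:
p2 = casimir_pressure(0.5e-6)
```
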
Furthermore, if you're still reading: supersymmetry, if present, neatly handles some 'vacuum' infinities, as it hypothesizes 'superpartners' to our known particles. For each known fermion there is a corresponding boson; for each known boson there is a corresponding fermion. The short answer is that if both fermionic and bosonic states are available to contribute to the vacuum energy ( a.k.a. virtual particles ), then contributions from the fermionic components are cancelled by the bosonic ones. Well..... not quite cancelled, as we want some non-zero result. The trouble is that calculations suggest the cancellation of two humungous sums to a value which is utterly vanishing by comparison with either. Sort of like subtracting Rupert Murdoch's net worth from Bill Gates's and winding up with mine! Should such an adjustment be plausible? Who knows?

Aside: Recall that fermions, when combining quantum mechanical amplitudes, do so with a negative sign. Bosons do so with a positive sign. This is why fermion amplitudes ( and thus probabilities ) tend to zero when trying to 'meld' final states, while boson amplitudes ( and thus probabilities ) tend to non-zero when trying to 'meld' final states. Hence fermions don't meld, leading to the Pauli exclusion rule and the stability of matter. Bosons certainly do, leading to condensates, lasers etc.

A bit longwinded, as usual, but I hope I've helped. :-)


But the force of attraction between the metal plates (Casimir effect) is inversely proportional to the 4th power of the distance between them, and I read that if the plates are allowed to touch then they can't be pulled apart without destroying them. Maybe a bad question, but could the strong force be some aspect of the EM force at smaller (more fundamental/abstract) scales?

Okay, so light exerts a pressure, it requires no (zero) specific volume of space, and because of superposition an infinite number of photons can exist at a single point in space... When we measure vacuum fluctuations, might this just be the summation of all the light that happens to be there? What happens to photons when the wavelength becomes too long to be absorbed by anything? Is there such a point reached in the evolution of a photon? Can it be that such photons - having almost zero energy, existing in whatever amount the Big Bang produced, and increasing in number as photons are continually absorbed at higher energies and re-emitted at lower energies - still sum with each other, in uncountably large (nearly infinite) quantities, to produce not only quantum vacuum fluctuations but also the force currently referred to as Dark Energy (i.e., photons with wavelengths longer than galactic clusters)? I've heard there was something called 'Tired Light' and that it probably wasn't a good hypothesis; I need to check...

Quote:

But the force of attraction between the metal plates (Casimir effect) is inversely proportional to the 4th power of the distance between them, and I read that if the plates are allowed to touch then they can't be pulled apart without destroying them.

Hadn't heard that, but I guess it implies 'infinite' force at zero distance ( when touching ).

Quote:

Maybe a bad question, but could the strong force be some aspect of the EM force at smaller (more fundamental/abstract) scales?

Top question, actually. :-) The 'unified field' theory, in whatever clothing, has been pursued for some time. The idea is that each force, that we label as separate, is an aspect of that field below a given energy ( hence above a given distance ).

Quote:

Okay, so light exerts a pressure, it requires no (zero) specific volume of space, and because of superposition, an infinite number of photons can exist in a single point in space... When we measure vacuum fluctuations, might this just be the summation of all the light that happens to be there?

Ahhh, but we don't measure the vacuum fluctuations directly, only the difference in vacuum states on either side of a Casimir plate. So where is the 'baseline' or 'zero' state? Anywhere you like - as long as the differences predict our measurements.

Quote:

What happens to the photons when the wavelength becomes too long to be absorbed by anything?

Like the Energizer Bunny, they keep going.

Quote:

Is there such a point reached in the evolution of a photon?

I suppose so, hence the microwave background and beyond to lower frequencies.
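A sketch of that, using Wien's displacement law for the ~2.7 K microwave background ( rounded constants, names my own ):

```python
# Wien's displacement law: the peak wavelength of blackbody radiation is
# lambda_peak = b / T. For the cosmic microwave background at ~2.7 K the
# peak sits around a millimetre - microwaves, as the name says - and it
# drifts to ever lower frequencies as the universe expands and cools.
b = 2.898e-3          # Wien's displacement constant, m K

T_cmb = 2.725         # present-day CMB temperature, K
lam_peak = b / T_cmb  # ~1.1e-3 m
```
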

Quote:

Can it be that such photons - having almost zero energy, existing in whatever amount the Big Bang produced, and increasing in number as photons are continually absorbed at higher energies and re-emitted at lower energies - still sum with each other, in uncountably large (nearly infinite) quantities, to produce not only quantum vacuum fluctuations but also the force currently referred to as Dark Energy (i.e., photons with wavelengths longer than galactic clusters)? I've heard there was something called 'Tired Light' and that it probably wasn't a good hypothesis; I need to check...

Yeah, but if they're not coupling with anything much, then what's the mechanism of influence? If it's simply energy density then you can curve space with it, but it's been doing that since earlier times anyway - now divided into a lot more but smaller packets - since energy is conserved. I personally reckon the centre of Dark Energy is at the Australian Tax Office, and the interior of our politicians' skulls has a lot of vacuum energy, but opinions differ.... :-)

More generally, there is an idea, originally called the "Correspondence Principle" I think, which is important here. Any theory has its limits of applicability, meaning there will be some domain of certain variables ( like energy or distance or mass ) for which it is assumed to apply. The principle dictates that any two theories with different but partially overlapping domains should agree within the intersection of those domains. Fair enough! So Newtonian physics is the low speed, low mass, everyday dimensions version of Special Relativity, General Relativity, and Quantum Mechanics respectively. A 'classical' theory thus becomes the restricted ( and generally easier ) approximation to our modern refinements. What is subtler is whether a newer theory that defines behaviour in an extended domain ( say really small distances ) not only predicts phenomena currently known in some overlapping and better known domain ( say macroscopic sizes ), but also avoids predicting phenomena which definitely don't occur! I understand that particle physics is plagued with theory variants that successfully explain some behaviour of experimental interest, but introduce stuff that has never been seen in experiments thus far ( and would have been seen if the stated effect was present ). One doesn't need an embarrassment of riches here! The challenge is to explain small scale phenomena adequately and remain consistent with established results as well. This is implicitly done with just about any measurement anyway. 'Atoms' and 'photons' and the rest are ideas we believe to be 'true' by a chain of reasoning and effects trailing from small scale up to large. This is because our immediate senses need augmenting, but therein lies the challenge of interpreting the 'meaning' of results. All results are ultimately macroscopic. Cheers.. :-)

(edit) What is doubly fascinating for me is the link between the very small and the very large. Particle physics and cosmology are quite intertwined!

(edit) It occurs to me that 50 - 60 Hz photons are being produced every day by our electricity grids. I'd seriously doubt they are modelled using Quantum Electrodynamics, though. I wouldn't hire any electrician that did. There are smarter ways of analysing household electrics than QED, even though QED is more 'exact'!
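A back-of-envelope sketch of why ( rounded constants, names my own ): a single grid-frequency photon carries an absurdly small energy compared with ordinary thermal jostling, so the classical field description is overwhelmingly adequate.

```python
# Energy of one 50 Hz photon (E = h*f) versus the thermal energy scale kT
# at room temperature.
h  = 6.626e-34    # Planck's constant, J s
kB = 1.381e-23    # Boltzmann's constant, J/K

E_photon  = h * 50     # ~3.3e-32 J per 50 Hz photon
E_thermal = kB * 300   # ~4.1e-21 J at room temperature

# Thermal energy outweighs a single grid photon by ~11 orders of magnitude:
ratio = E_thermal / E_photon
```
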

