The real joke of "Why did the chicken cross the road?" isn't the specific retorts. It's the assumption that reasoning/intent applies at all. Suppose a chicken has neither a 'road' concept nor that of 'sides'. A road-shaped pattern in the visual field doesn't trigger neurology that has no need for it *.

Cheers, Mike.

( edit ) * - until recently that is. :-) :-)

Interesting... Is it possible to understand and view the universe rationally without "concept"? I think the chicken does it quite well, and they get along amazingly well together. :-)

There are some who can live without wild things and some who cannot. - Aldo Leopold

I remember playing the English Suites by JSB. Then I played an Olivetti Lettera 22 typewriter when I translated 12 books in order to pay for my mortgage and support my children. Now I only play a keyboard. Sob...
Tullio

Now here's an interesting relevant bit o' jommetry I hadn't fully realised the import of before. The vector cross product makes sense in 3 dimensions only. For four or more, one can't uniquely assign a result by generalising the 3-D version. The cross product construct is involved in the Lorentz force law with electromagnetism say.

In 3-D the right ( or left ) hand rule lets you have a single direction which is perpendicular to the other two vector arguments. In 4-D there is an infinitude of directions which are orthogonal to any two given ( non-parallel ) vectors. So ..... if there are more independent spatial directions than are evident to us in this Universe we'd need some extra rules applying to explain why cross-product-dependent behaviours occur as seen by us. Or : what general rule applies in higher dimensions that specialises to the cross product in 3-D?
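That dimension-counting argument is easy to check directly. Here's a small numpy sketch ( illustrative only, with arbitrarily chosen example vectors ): in 3-D the perpendicular directions to two independent vectors form a line, in 4-D a plane.

```python
import numpy as np

# 3-D: the set of directions perpendicular to two independent vectors is
# one-dimensional, so the cross product can pick out a single direction.
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
print(np.cross(a, b))                          # [0. 0. 1.]

# 4-D: the same two vectors, padded into four dimensions, leave a whole
# PLANE of perpendicular directions -- the null space of [a4; b4] has
# dimension 4 - 2 = 2, so no unique 'cross product' vector exists.
a4 = np.array([1.0, 0.0, 0.0, 0.0])
b4 = np.array([0.0, 1.0, 0.0, 0.0])
M = np.vstack([a4, b4])
print(M.shape[1] - np.linalg.matrix_rank(M))   # 2
```

The same rank count says that in n dimensions two independent vectors leave an ( n - 2 )-dimensional orthogonal complement, which is a single direction only when n = 3.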

You see GR doesn't tell us what happens with other forces. Indeed one can use the difference between the world lines of a test particle with an electric charge compared to one without, in order to deduce that a non-gravitational force is acting at all ( preventing 'free fall' ).

Any/all of this may be solved already. Just musing ..... :-)

Cheers, Mike.

( edit ) Well, there's a trivial sense in which the above must hold. Extra dimensions by definition give one 'more room to play with'. I'm sorta getting at why, when GR can apply to very many dimensions ( at least I understand that to be the case ), the non-gravity forces are as they are. One for the 'Theory Of Everything' I guess ....

( edit ) Or I'm a chicken who is totally missing the relevant road and its sides. :-)

Interestingly 'concept' is a concept in and of itself. It's a way of 'placing brackets', like the curly ones used for set definition :

{1, 2, 3, 4, 5 ..... }

The objects inside have a separate 'existence' but the act of bracketing them, thus grouping, is a concept. But I'll stop there. I feel madness lies along that path - all men are liars, barbers shaving themselves etc. Thank you Bertrand Russell.

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

From what I remember, in a Hilbert space you have an inner product and an outer product. The first in a 3 space is the scalar product and the second is the vector product, which you call the cross product. A Hilbert space is a Banach space with an inner product. A Banach space is a linear normed space. All this from memory, so pardon any error.
Tullio
Edit: the linear vector space must also be "complete" to be a Banach space?

Regarding the question about antimatter, there's some interesting news from several of Fermilab's ongoing experiments, namely MINOS, MiniBooNE, and D0 - -

At D0 it looks like they've identified a CP violation (charge-parity) with neutral B mesons – there appears to be an asymmetry between the way the B and anti-B mesons decay that deviates from the Standard Model prediction by 3.2 standard deviations.

At MINOS they measured the 'square of the difference between mass eigenstates' between muon neutrinos and tau neutrinos and came up with 2.35 x 10^-3 eV^2. A particle and its antiparticle should both have the same mass, right? When they measured the same difference between muon and tau anti-neutrinos they got 3.35 x 10^-3 eV^2. The statistical confidence is '2 sigma' – for the particle physicists, 3 or 4 sigma is pretty close to certainty, and in this case would mean some serious head-scratching. The scientists had to fire 7 x 10^20 protons at a target to gather the data for the muon/tau neutrino measurement, so it will probably be some time in early 2012 before recording enough data for 3 sigma confidence (assuming the effect is real).

At MiniBooNE the researchers have already weighed-in at 3 sigma confidence with their measurements of a similar mass difference for lower-energy muon neutrinos as they oscillate into electron neutrinos (compared to the muon and electron anti-neutrinos).

Wouldn't surprise me if spacetime turns out to be quantized and left or right handed ...

How do neutrinos get locked inside atoms in the first place, or are they created during the decay?

Quote:

At D0 it looks like they've identified a CP violation (charge-parity) with neutral B mesons – there appears to be an asymmetry between the way the B and anti-B mesons decay that deviates from the Standard Model prediction by 3.2 standard deviations.

That's just "significant" by the usual measures. With a number of assumptions regarding the behaviour of large numbers, 3 sigma 'means' that it is more than ~ 99.7 % likely that the effect of interest is not due to sampling variation ( a 'lucky streak' ). Or, if you like, if we repeated the experiment say, 1000 times, we could attribute the ( same ) results from 3 of them ( 1000 - 997 ) to random luck in sampling. The other 997 ( of 1000 ) we could not blame on random variance. I don't really know why traditionally 3 sigma is the 'magic' line to equate to 'significance' ( I mean it's a fine choice for sure, but I just don't know the history of that ). Not all state it that way though; they just quote the sigma value and let others make up their own mind as to what level they will accept. 1 sigma is ~ 68.3 % and 2 sigma is ~ 95.4 % in normally distributed ( Gaussian ) statistics. 4 sigma is out at ~ 99.994 % and each extra sigma from there on adds roughly another two 9's.

The biggest killer of this analysis is a systematic bias in measurement ie. an effect ( in your measurement system ) skewing the results in some direction. This can occur with radar guns, say - so to account for this ( or more likely to head off challenges in court ) in my state one is only sent a violation notice if the measured speed exceeds the allowed limit by 3 %.

Quote:

At MINOS they measured the 'square of the difference between mass eigenstates' between muon neutrinos and tau neutrinos and came up with 2.35 x 10^-3 eV^2. A particle and its antiparticle should both have the same mass, right? When they measured the same difference between muon and tau anti-neutrinos they got 3.35 x 10^-3 eV^2. The statistical confidence is '2 sigma' – for the particle physicists, 3 or 4 sigma is pretty close to certainty, and in this case would mean some serious head-scratching. The scientists had to fire 7 x 10^20 protons at a target to gather the data for the muon/tau neutrino measurement, so it will probably be some time in early 2012 before recording enough data for 3 sigma confidence (assuming the effect is real).

The main current model is that a mass eigenstate is an un-observable quantum state that when mixed with others gives an actual detectable neutrino. The idea is that as these eigenstates propagate they wax and wane so that the detectable mix ( which is what determines observation as an electron, muon or tau neutrino ) varies as you go along - from the Sun ( source ) to the Earth ( target ) for instance. Even worse, 'cos of quantum, a given eigenstate mix at some particular position only gives probabilities for observing a given detectable type. To deduce these probabilities/fractions one has to collect sufficiently large numbers of data points. Thus some eigenstates will be more likely to give electron neutrinos say ( if they interact at all ! ) rather than either of the other two. Mathematically this can be conceptualised in a vector space containing eigenvectors which have angles between them that specify the exact mixing character. Nicola Cabibbo was the first to see this sort of mixing connection, for quarks ( ie. where would physics be without Italians, eh Tullio ? ); the analogous neutrino mixing angles sit in what is usually called the Pontecorvo-Maki-Nakagawa-Sakata ( PMNS ) matrix. Well it's three angles if you believe there are only three neutrino types, which is another story again .... :-)

Anyway the consequence, via this scheme, is that it is OK to have a difference b/w muon neutrino and tau neutrino HOWEVER that ought to be the same difference as b/w muon anti-neutrino and tau anti-neutrino. Why would the eigenstates not be invariant here ?! So yeah, if this firms up to 3+ sigma level - the scientific analog/version of the criminal/legal proof standard 'beyond reasonable doubt' - then the feline will well and truly be among the avians!! :-)

Quote:

At MiniBooNe the researchers have already weighed-in at 3 sigma confidence with their measurements of a similar mass difference for lower-energy muon neutrinos as they oscillate into electron neutrinos (compared to the muon and electron anti-neutrinos).

So ditto but more so as their methodology is rather different. Looking 'real' then .... :-)

Quote:

Wouldn't surprise me if spacetime turns out to be quantized and left or right handed ...

It'll be subtleties like the above that will define that, if at all.

Quote:

How do neutrinos get locked inside atoms in the first place, or are they created during the decay?

Well, I think/remember a neutron as being "equal" to a proton plus an electron plus an anti-neutrino ( strictly an electron anti-neutrino, to balance lepton number ). But that's a 'black box' answer, a rule of thumb, as we only know what we measure. Who knows what they get up to when we are not looking ? :-)

Cheers, Mike.

( edit ) So the percentages aren't a statement about % 'right' or 'wrong' but about the likelihood of a result being due to a random sampling, being taken from a large possible population ( here infinitely large ). Of course if we could measure/know every interaction in the universe for all time then we'd have no need for statistics : you'd just state what happened with that complete knowledge. Failing that we make some reasonable assumptions - basically that the universe is not perversely structured simply in order to fool us. :-)

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

Thanks Mike, also in the name of Nicola Cabibbo who was unjustly denied a Nobel Prize last year. I have personally known two Italian physicists who deserved a Nobel Prize and never got it, experimentalist Giuseppe "Beppo" Occhialini (who was one of the very few Italian professors who refused to take a loyalty oath to Mussolini and exiled himself in Brazil where he earned his money by acting as a mountain guide) and theorist Tullio Regge, of Regge pole fame. But to earn a Nobel prize you must speak English.
Tullio

One other thought on the matter/anti-matter issue : might the quoted asymmetry be equivalently stated as the weak force having a different distance dependence according to matter/antimatter - because it is the vector bosons ( W+, W- and Z0 ) mediating this behaviour. This could be 'seen' in the micro as differing masses and decay modes? But in the 'macro' perhaps this gives rise to an apparent 'force' that is evident as 'cosmic expansion'? Hoyle and others had a similar argument in the late 1950's referring to an ever so slight difference in the magnitude of the proton vs. electron charge. Here about a 10^(-18) difference would be sufficient to explain the then measured expansion rate ( Hubble factor ). The idea fell over though, as direct measurements discounted the charge asymmetry to much lower levels.

The point is that dark energy/matter arguments pre-suppose that our knowledge of forces is complete to all scales. That is : we assume our best about these forces and when observation doesn't fully agree we are tempted to 'invent' new things. But it might be an inexact understanding or improper extension. The conditions in the early universe are well out of current experimental range; we can only examine relic data and are thus limited by the frailties of that approach. Maybe it's not only gravity that needs a fresh look at large scales ...

Cheers, Mike.

( edit ) I think it is the cosmologist Edward 'Rocky' Kolb who refers to dark matter/energy as 'epicycles', thus alluding to the tacking on of more circles to a theory already full of them. What was needed, per Kepler, was an ellipse. He's a wag for sure, so have a glance at this hilarious but shameless machine-gunning of political correctness from the late 1990's. :-) [ By that I mean : in Australia he would have been successfully prosecuted for racial/religious/ethnic/minority 'vilification' .... ]

( edit ) Addendum. How could anti-matter be involved in cosmic expansion when all we 'see' is matter? The key would be the extraordinarily low cross-section ( chance ) of interaction with neutrinos, of any type. You could have a universe brimming with anti-neutrinos and hardly feel it directly.

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

Here's an aspect of Minkowski geometry that deserves some mention. In Special or General Relativity you can actually have a metric of type :
[pre]
+1 0 0 0
0 1 0 0
0 0 1 0
0 0 0 1[/pre] provided that you accept an imaginary time co-ordinate. That is instead of :

time = t

you have :

time = it

where t is a real number in both instances and i is the square root of minus one. Yes, complex numbers. This approach seems to be either praised or deprecated depending upon who uses it eg. Hawking is quite keen on imaginary time, but not MTW say.

Quote:

A significant problem with the ( historically based ) words 'imaginary' and 'complex' is that they impart a sense of dread, or expectation of difficulty, when first encountered. As Penrose in 'The Road To Reality' repeatedly outlines, the real world at small scales is superbly and efficiently modelled by using these mathematical entities. With GR we are taking the limit of 'nearby' points, that is all spacetime geometry obeys Lorentz in the small. And they really aren't as hard to understand or use as they are often made out to be.

So the spacetime 'distance' differential still winds up to be :

ds^2 = - dt^2 + dx^2 + dy^2 + dz^2

because i^2 = -1. Now apart from being a spot of clever algebra, where has this got us? We've converted :

(A) A 'space' with four real number components ( t, x, y, z ) and a Minkowski metric.

to

(B) A 'space' with one imaginary and three real number components ( it, x, y, z ) with a Euclidean/Pythagorean metric.

Both have the same differential .... so the classification of world lines into time-like ( normal sub light speed ), space-like ( never seen or dis-allowed tachyonic ) and null ( light-like, at the border between the first two ) is unchanged. So subsequent theory - mechanics, interactions etc - doesn't change. Are we so desperate to stay with a classical metric that we go for a strange time definition?

This has annoyed me for several years. Here's my rough take on things, for what it's worth. I'm shooting the breeze on the night shift 'cos the traffic is slow ( and wet ). See, I've even got time to stuff about with BBCode, especial thanks to the new E@H web interface !! :-)

I've listened to Hawking in his resolution of singularities by the use of imaginary time. Generally speaking one can use complex extensions of real number functions to avoid poles of functions - where some are expressed as the inverse of some polynomial, and said polynomial has zeroes at certain points so that the inverse is 'infinite' or more properly simply 'undefined'. The poles are still there in the Argand plane but one can 'drive around them' eg. with contour integration. Integration is a key mathematical tool as it adds up lots of little changes ( like the progression of a particle's proper time ) over some history, world-line or whatever. The twins in the Twin's Paradox get 'integrated' over different spacetime paths for instance, to find out how their ages differ when they meet again.

Quote:

So if you want to integrate a function from a negative real ordinate to a positive real ordinate, but only along the real axis when there is a pole at zero ( say 1/x ), then you can't legitimately do that. But if you extend 1/x to being complex ( 1/z ) in the right way ( 'analytic' ) you can do the integration ( with some reasonable cautions ) and avoid the pole at the origin. If done correctly the integral will have a value ( total ) that is independent of the precise path along which integration is performed. Also Penrose gives a beautiful example of how the real roots of some cubic equations can be derived 'easily' ( or more so than otherwise ) by allowing complex numbers as intermediaries ( conjugates are taken to come back to the real line ).

So where are we going here? More clever algebra? Think of what it means to have an 'imaginary' ordinate. Take the Argand plane, reals along the x-axis and imaginaries up the y-axis. Grab any point on the plane at all, except the origin, and multiply it by i. It will be rotated around the origin by 90 degrees ( anti-clockwise, but that is simply by convention ). Do that twice and you reflect through the origin ie. z becomes -z. Do this reflection twice and you are back where you started; indeed all these rotations form a group with a 2 PI modulus. Sound even vaguely familiar?

Yup, spin. Specifically fermions and bosons, and the fact that by exchanging indistinguishable particles you can either subtract from quantum sums ( leading to exclusion of Fermions from identical states, due to 'opposite' phase ) or add within those sums ( leading to clumping or condensation of bosons, due to 'same' phase ). You could transition between fermionic and bosonic behaviour purely by pushing the 'multiply the time ordinate by i' button twice. Spin is a time rate of change, or at least can be linked to measurable macroscopic quantities like angular momentum which demonstrate that ( magnetic resonance imaging is a good example of this ). Quantum mechanical phase is never directly measurable, only differences thereof. I can use either the (A) or the (B) scheme above to represent reality, leaving all results known to date unchanged ( as the spacetime differential is invariant to the choice ).

So where are all these 'super partners' that we'd love to have around, in order to stop our integrals hairing off to infinity, instead of something sensible, when we want to sum over virtual particle interactions? I reckon they are always there but we'll never get to meet them directly. They are only one push of the i button away. They don't appear in our 'measurable' metric, by having a real time co-ordinate in our Euclidean metric ( or an imaginary one in the Minkowski. Choose either view ). But maybe they ought appear in the 'intermediate' calculations, even if only to be 'nulled' by conjugation prior to the final result ( recall that the complex conjugate of i is -i ).

Maybe the super-partners aren't distinct particles at all, just in a different part of a 'super phase' cycle. Instead of phase having a real value like the classic one dimensional back and forth of a pendulum swing, a super-phase would actually be complex and follow a unit circle ( centered on the origin ) in the Argand plane. We only 'see' the super phase at its projection onto the imaginary axis ( or maybe the real ? hmmmm ) .... and call that component 'phase' by the usual meaning.

Silly me. Churning bandwidth and musing. :-)

Cheers, Mike.

( edit ) I wouldn't think the above is especially original. No doubt someone has already tried, and probably failed, along this track ..... maybe it abrogates causality or somesuch.

( edit ) Aside : Say I stick with a Minkowski metric but convert t to it - just for a single particle ( cheeky ). ( This is not either A or B above ). Then I convert a time-like ( detectable, sub light speed ) world line into a space-like ( tachyonic, undetectable ) one. For us the particle 'disappears'. Now take a tachyonic particle and 'rotate it in' by going from it to t. It 'appears'! Why? Well, it was always 'there' in spacetime but not in our light cone ( past or future ). Maybe this is how you 'create' and 'destroy' these guys. You don't have to talk of converting mass to/from energy ( though it looks that way in a given light cone ) you are just flipping them in/out of view. Mass/energy is still conserved. Remember 'mass' for us is a way of saying 'it should stay in our light cone'. :-)

( edit ) To be more precise you can swap b/w normal and tachyonic particle 'type' by going from a real velocity to a pure imaginary one. If you accept that the spatial co-ordinates don't change character, then you must attribute that change as being due to going over to imaginary time.

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

Yes, I think Tullio's right about the outer product – I found a very nice tutorial here.

## Regarding the question about

)

Regarding the question about the about the antimatter, there's some interesting news from several of Fermilab's ongoing experiments, namely MINOS, MinniBooNe, and D0 - -

At D0 it looks like they've identified a CP violation (charge-parity) with neutral B mesons â€“ there appears to be an asymmetry between the way the B and anti-B mesons decay that deviates from the Standard Model prediction by 3.2 standard deviations.

At MINOS they measured the 'square of the difference between mass eigenstates' between muon neutrinos and tau neutrinos and came up with 2.35 x 10^-3 eV^2. A particle and its antiparticle should both have the same mass, right? When they measured the same difference between muon and tau anti-neutrinos they got 3.35 x 10^-3 eV^2. The statistical confidence is '2 sigma' â€“ for the particle physicists, 3 or 4 sigma is pretty close to certainty, and in this case would mean some serious head-scratching. The scientists had to fire 7 x 10^20 protons at a target to gather the data for the muon/tau neutrino measurement, so it will probably be some time in early 2012 before recording enough data for 3 sigma confidence (assuming the effect is real).

At MiniBooNe the researchers have already weighed-in at 3 sigma confidence with their measurements of a similar mass difference for lower-energy muon neutrinos as they oscillate into electron neutrinos (compared to the muon and electron anti-neutrinos).

Wouldn't surprise me if spacetime turns out to be quantized and left or right handed ...

How do neutrinos get locked inside atoms in the first place, or are they created during the decay?

## RE: At D0 it looks like

)

That's just "significant" by the usual measures. With a number of assumptions regarding behaviour of large numbers then 3 sigma 'means' that it is more than ~ 99.7 % likely that the effect of interest is not due to sampling variation ( a 'lucky streak' ). Or, if you like, if we repeated the experiment say, 1000 times we could attribute the ( same ) results from 3 ( 1000 - 997 ) of them due to random luck in sampling. The other 997 ( of 1000 ) we could not blame random variance upon. I don't really know why traditionally 3 sigma is the 'magic' line to equate to 'significance' ( I mean it's a fine choice for sure, but I just don't know the history of that ). Not all state it that way though, they just quote the sigma value and let others make up their own mind as to what level they will accept. 1 sigma is ~ 68.2 % and 2 sigma is ~ 95.4 % in normally distributed ( Gaussian ) statistics. 4 sigma is out at 99.99 % and each extra sigma from there on adds about another two 9's.

The biggest killer of this analysis is a systematic bias in measurement ie. an effect ( in your measurement system ) skewing the results in some direction. This can occur with radar guns, say - so to account for this ( or more likely to defray challenges in court ) in my state one is only sent a violation notice if the measured speed exceeds 3% above the allowed.

The main current model is that a mass eigenstate is an un-observable quantum state that when mixed with others gives an actual detectable neutrino. The idea is that as these eigenstates propagate they wax and wane so that the detectable mix ( which is what determines observation as an electron, muon or tau neutrino ) varies as you go along - from the Sun ( source ) to the Earth ( target ) for instance. Even worse, 'cos of quantum, a given eigenstate mix at some particular position only gives probabilities for observing a given detectable type. To deduce these probabilities/fractions one has to collect sufficiently large numbers of data points. Thus some eigenstates will be more likely to give electron neutrinos say ( if they interact at all ! ) rather than either of the other two. Mathematically this can be conceptualised in a vector space containing eigenvectors which have angles between them that specify the exact mixing character. The first was named after Nicola Cabibbo who first saw this connection ( ie. where would physics be without Italians, eh Tullio ? ) but generically all three are usually called the Cabibbo Angles. Well it's three if you believe there are only three neutrino types, which is another story again .... :-)

Anyway the consequence, via this scheme, is that it is OK to have a difference b/w muon neutrino and tau neutrino HOWEVER that ought be the same difference as b/w muon anti-neutrino and tau anti-neutrino. Why would the eigenstates not be invariant here ?! So yeah, if this firms up to 3+ sigma level - the scientific analog/version of the criminal/legal proof standard 'beyond reasonable doubt' - then the feline will well and truly be among the avians!! :-)

So ditto but more so as their methodology is rather different. Looking 'real' then .... :-)

It'll be subtleties like the above that will define that, if at all.

Well, I think/remember a neutron as being "equal" to a proton plus an electron plus a neutrino. But that's a 'black box' answer, a rule of thumb, as we only know what we measure. Who knows what they get up to when we are not looking ? :-)

Cheers, Mike.

( edit ) So the percentages aren't a statement about % 'right' or 'wrong' but about the likelihood of a result being due to a random sampling, being taken from a large possible population ( here infinitely large ). Of course if we could measure/know every interaction in the universe for all time then we'd have no need for statistics : you'd just state what happened with that complete knowledge. Failing that we make some reasonable assumptions - basically that the universe is not perversely structured simply in order to fool us. :-)

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

## Thanks Mike, also in the name

)

Thanks Mike, also in the name of Nicola Cabibbo who was unjustly denied a Nobel Prize last year. I have personally known two Italian physicists who deserved a Nobel Prize and never got it, experimentalist Giuseppe "Beppo" Occhalini (who was one of the very few Italian professors who refused to take a loyalty oath to Mussolini and exiled himself in Brazil where he earned his money by acting as a mountain guide) and theorist Tullio Regge, of Regge pole fame. But to earn a Nobel prize you must speak English.

Tullio

## One other thought on the

)

One other thought on the matter/anti-matter issue : might the quoted asymmetry be equivalently stated as the weak force having a different distance dependence according to matter/antimatter - because it is the vector bosons ( Z0, Z+ and Z- ) mediating this behaviour. This could be 'seen' in the micro as differing masses and decay modes? But in the 'macro' perhaps this gives rise to an apparent 'force' that is evident as 'cosmic expansion'? Hoyle and others had a similiar argument in the late 1950's referring to an ever so slight difference in the magnitude of the proton vs. electron charge. Here about a 10^(-18) difference would be sufficient to explain the then measured expansion rate ( Hubble factor ). The idea fell over though, as direct measurements discounted the charge asymmetry to much lower levels.

The point is that dark energy/matter arguments pre-suppose that our knowledge of forces is complete to all scales. That is : we assume our best about these forces and when observation doesn't fully agree we are tempted to 'invent' new things. But it might be an inexact understanding or improper extension. The conditions in the early universe are well out of current experimental range : we can only examine relic data and are thus limited by the frailties of that approach. Maybe it's not only gravity that needs a fresh look at large scales ...

Cheers, Mike.

( edit ) I think it is the cosmologist Edward 'Rocky' Kolb who refers to dark matter/energy as 'epicycles', thus alluding to the tacking on of more circles to a theory already full of them. What was needed, per Kepler, was an ellipse. He's a wag for sure, so have a glance at this hilarious but shameless machine-gunning of political correctness from the late 1990's. :-) [ By that I mean : in Australia he would have been successfully prosecuted for racial/religious/ethnic/minority 'vilification' .... ]

( edit ) Addendum. How could anti-matter be involved in cosmic expansion when all we 'see' is matter? The key would be the extraordinarily low cross-section ( chance ) of interaction with neutrinos, of any type. You could have a universe brimming with anti-neutrinos and hardly feel it directly.


## Here's an aspect of Minkowski


Here's an aspect of Minkowski geometry that deserves some mention. In Special or General Relativity you can actually have a metric of type :

[pre]
+1 0 0 0
 0 1 0 0
 0 0 1 0
 0 0 0 1[/pre]

provided that you accept an imaginary time co-ordinate. That is instead of :

time = t

you have :

time = i t

where t is a real number in both instances and i is the square root of minus one. Yes, complex numbers. This approach seems to be either praised or deprecated depending upon who uses it eg. Hawking is quite keen on imaginary time, but not MTW say.

So the spacetime 'distance' differential still winds up to be :

ds^2 = - dt^2 + dx^2 + dy^2 + dz^2

because i^2 = -1. Now apart from being a spot of clever algebra, where has this got us? We've converted :

(A) A 'space' with four real number components ( t, x, y, z ) and a Minkowski metric.

to

(B) A 'space' with one imaginary and three real number components ( it, x, y, z ) with a Euclidean/Pythagorean metric.

Both have the same differential .... so the classification of world lines into time-like ( normal sub light speed ), space-like ( never seen or dis-allowed tachyonic ) and null ( light-like, at the border between the first two ) is unchanged. So subsequent theory - mechanics, interactions etc - doesn't change. Are we so desperate to stay with a classical metric that we go for a strange time definition?
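A quick numerical sanity check of that equivalence ( my own sketch, with arbitrary displacement values ) : scheme (A) uses real coordinates with the Minkowski metric, scheme (B) feeds an imaginary time ordinate into a plain Pythagorean sum, and the interval comes out identical.

```python
# Arbitrary coordinate displacements along some world line segment.
dt, dx, dy, dz = 2.0, 1.0, 0.5, 0.25

# (A) real coordinates (t,x,y,z), Minkowski metric diag(-1,+1,+1,+1):
ds2_minkowski = -dt**2 + dx**2 + dy**2 + dz**2

# (B) imaginary time ordinate (i*t,x,y,z), Euclidean metric diag(+1,+1,+1,+1):
it = 1j * dt
ds2_euclidean = it**2 + dx**2 + dy**2 + dz**2   # plain Pythagorean sum

print(ds2_minkowski)        # -2.6875
print(ds2_euclidean.real)   # -2.6875, imaginary part identically zero
```

Negative either way, so this particular displacement is time-like regardless of which bookkeeping you choose.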

This has annoyed me for several years. Here's my rough take on things, for what it's worth. I'm shooting the breeze on the night shift 'cos the traffic is slow ( and wet ). See, I've even got time to stuff about with BBCode, especial thanks to the new E@H web interface !! :-)

I've listened to Hawking in his resolution of singularities by the use of imaginary time. Generally speaking one can use complex extensions of real number functions to avoid poles of functions - where some are expressed as the inverse of some polynomial, and said polynomial has zeroes at certain points so that the inverse is 'infinite' or more properly simply 'undefined'. The poles are still there in the Argand plane but one can 'drive around them' eg. with contour integration. Integration is a key mathematical tool as it adds up lots of little changes ( like the progression of a particle's proper time ) over some history, world-line or whatever. The twins in the Twins' Paradox get 'integrated' over different spacetime paths for instance, to find out how their ages differ when they meet again.
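A minimal numerical sketch of 'driving around' a pole ( my example, not Hawking's ) : integrate 1/z around the unit circle in the Argand plane. The pole at the origin is never touched, and the sum of all the little steps recovers the residue theorem's 2*pi*i.

```python
import cmath

# Integrate f(z) = 1/z along the unit circle, approximated by N small
# straight steps dz = i*z*d(theta). The pole at z = 0 stays inside the
# contour - we drive around it, never through it.
N = 10_000
total = 0.0 + 0.0j
for k in range(N):
    theta = 2 * cmath.pi * k / N
    z  = cmath.exp(1j * theta)           # point on the contour
    dz = 1j * z * (2 * cmath.pi / N)     # step along the circle
    total += (1 / z) * dz

print(total)   # ~ 2*pi*i = 6.2831...j
```

The little changes add up to something finite and well-defined even though the function itself blows up at a point nearby - which is the whole charm of the complex detour.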

So where are we going here? More clever algebra? Think of what it means to have an 'imaginary' ordinate. Take the Argand plane, reals along the x-axis and imaginaries up the y-axis. Grab any point on the plane at all, except the origin, and multiply it by i. It will be rotated around the origin by 90 degrees ( anti-clockwise, but that is simply by convention ). Do that twice and you reflect through the origin ie. z becomes -z. Do this reflection twice and you are back where you started, where indeed all these rotations are a group with a 2PI modulus. Sound even vaguely familiar?
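In code the cycle is plain to see ( a trivial sketch of mine ) : one push of the 'multiply by i' button rotates, two reflect through the origin, four bring you home.

```python
rot = lambda w: 1j * w        # one push of the 'multiply by i' button

z = 3 + 4j
print(rot(z))                 # (-4+3j)  rotated 90 degrees anticlockwise
print(rot(rot(z)))            # (-3-4j)  i*i = -1 : reflection through the origin
print(rot(rot(rot(rot(z)))))  # (3+4j)   four pushes and we're back where we started
```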

Yup, spin. Specifically fermions and bosons, and the fact that by exchanging indistinguishable particles you can either subtract from quantum sums ( leading to exclusion of fermions from identical states, due to 'opposite' phase ) or add within those sums ( leading to clumping or condensation of bosons, due to 'same' phase ). You could transition between fermionic and bosonic behaviour purely by pushing the 'multiply the time ordinate by i' button twice. Spin is a time rate of change, or at least can be linked to measurable macroscopic quantities like angular momentum which demonstrate that ( magnetic resonance imaging is a good example of this ). Quantum mechanical phase is never directly measurable, only differences thereof. I can use either the (A) or the (B) scheme above to represent reality, leaving all results known to date unchanged ( as the spacetime differential is invariant to the choice ).
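That sign-on-exchange business can be sketched in a toy calculation ( mine, with hypothetical one-particle amplitudes phi_a and phi_b standing in for real wavefunctions ) : a bosonic two-particle amplitude adds the exchanged term, a fermionic one subtracts it, and two fermions crammed into the same state cancel exactly - exclusion.

```python
def two_particle(phi_a, phi_b, x1, x2, sign):
    """Two-particle amplitude: direct term plus/minus the exchanged term.
    sign = +1 for bosons (symmetric), -1 for fermions (antisymmetric)."""
    return phi_a(x1) * phi_b(x2) + sign * phi_a(x2) * phi_b(x1)

phi_a = lambda x: x       # hypothetical single-particle amplitudes,
phi_b = lambda x: x**2    # chosen purely for illustration

print(two_particle(phi_a, phi_b, 1.0, 2.0, +1))   # 6.0  bosonic: 1*4 + 2*1
print(two_particle(phi_a, phi_b, 1.0, 2.0, -1))   # 2.0  fermionic: 1*4 - 2*1
# Put two fermions in the *same* state and the amplitude cancels exactly:
print(two_particle(phi_a, phi_a, 1.0, 2.0, -1))   # 0.0  Pauli exclusion
```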

So where are all these 'super partners' that we'd love to have around, in order to stop our integrals haring off to infinity, instead of something sensible, when we want to sum over virtual particle interactions? I reckon they are always there but we'll never get to meet them directly. They are only one push of the i button away. They don't appear in our 'measurable' metric, by having a real time co-ordinate in our Euclidean metric ( or an imaginary one in the Minkowski. Choose either view ). But maybe they ought to appear in the 'intermediate' calculations, even if only to be 'nulled' by conjugation prior to the final result ( recall that the complex conjugate of i is -i ).

Maybe the super-partners aren't distinct particles at all, just in a different part of a 'super phase' cycle. Instead of phase having a real value like the classic one dimensional back and forth of a pendulum swing, a super-phase would actually be complex and follow a unit circle ( centered on the origin ) in the Argand plane. We only 'see' the super phase at its projection onto the imaginary axis ( or maybe the real ? hmmmm ) .... and call that component 'phase' by the usual meaning.

Silly me. Churning bandwidth and musing. :-)

Cheers, Mike.

( edit ) I wouldn't think the above is especially original. No doubt someone has already tried, and probably failed, along this track ..... maybe it abrogates causality or somesuch.

( edit ) Aside : Say I stick with a Minkowski metric but convert t to it - just for a single particle ( cheeky ). ( This is not either A or B above ). Then I convert a time-like ( detectable, sub light speed ) world line into a space-like ( tachyonic, undetectable ) one. For us the particle 'disappears'. Now take a tachyonic particle and 'rotate it in' by going from it to t. It 'appears'! Why? Well, it was always 'there' in spacetime but not in our light cone ( past or future ). Maybe this is how you 'create' and 'destroy' these guys. You don't have to talk of converting mass to/from energy ( though it looks that way in a given light cone ) you are just flipping them in/out of view. Mass/energy is still conserved. Remember 'mass' for us is a way of saying 'it should stay in our light cone'. :-)
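Numerically the 'rotation out of view' is just a sign flip in the interval ( my sketch, with arbitrary numbers and the -+++ convention used above ) : substitute t -> it in a time-like displacement and it becomes space-like.

```python
# A 1+1 dimensional displacement, -+++ signature convention.
dt, dx = 2.0, 1.0

ds2_before = -dt**2 + dx**2          # time-like: ds^2 < 0, sub light speed
ds2_after  = -(1j * dt)**2 + dx**2   # substitute t -> i*t ... sign flips

print(ds2_before)       # -3.0  time-like, inside our light cone
print(ds2_after.real)   #  5.0  space-like, tachyonic, 'rotated out of view'
```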

( edit ) To be more precise you can swap between normal and tachyonic particle 'type' by going from a real velocity to a pure imaginary one. If you accept that the spatial co-ordinates don't change character, then you must attribute that change as being due to going over to imaginary time.
