Paul Laske described the pulsar timing arrays that have been in operation for 10+ years now. The idea is to use the ( extremely subtle ) shifts in signal arrival times to deduce the local passage of a very long wavelength gravitational wave. The expected natural frequency of such things is about a single full wave cycle per decade or so ! :-0
Wow ! So that is well away from the frequency range we have ever discussed here at E@H.
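( A wee back-of-envelope sketch of my own, illustrative numbers only, converting that period to a frequency : )

[code]
# Rough conversion of a 'one cycle per decade' gravitational wave to a frequency.
# Illustrative numbers only.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

period_s = 10 * SECONDS_PER_YEAR            # ~ one full wave cycle per decade
frequency_hz = 1.0 / period_s

print(f"period    ~ {period_s:.2e} s")
print(f"frequency ~ {frequency_hz:.2e} Hz")   # ~ 3e-9 Hz ie. a few nanohertz
# Compare with the roughly tens-to-hundreds of Hz band that ground based
# interferometer searches ( the usual E@H territory ) chase.
[/code]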
Colin MacLaurin spoke of a ( IMHO ) quirky construct : the relativistic analysis of a hypothetical cable connecting two galaxies. How would that behave given that the universe is expanding ? This may seem a dumb question on the face of it, but it tests how one can relate GR/SR concepts back to simpler classical terms. What I mean here is that if one is going to use GR to claim that gravity is not a force but a spacetime deformation, then one ought to be able to run that backwards : take an esoteric 4D geometric evaluation and recover a good old account of who pushed or pulled upon whom.
Last and certainly not least I will mention the ideas of a Russian GR theory team via Oleg Tsupko. Take the scenario as demonstrated in 1919 : the deflection of light rays as they graze past the Sun, which happens every day of course but is hard to demonstrate unless the Moon masks the glare ( ie. a total solar eclipse ). The model may then proceed to greater complexity given that virtually all ( main sequence ) stars spend most of their multi-billion year lifetimes sitting in the 'steam' that they, as a 'nuclear kettle', produce. This steam is plasma, the corona being a good example. Our Sun's corona extends over several photosphere diameters and is quite hot, about several million Kelvin ( compare with ~ 6000 K for the visible surface, the photosphere ). I got lost in detail mid-talk but I rallied at the end ( mid-afternoon brain fade ) to get the gist. Their models are eminently testable if anyone cares to examine data sets obtained to date ( gravitational lensing of distant galaxies ) and/or produce a study with their particular models ( a few variants ) in mind ie. do they work or not ?
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
OK. Day One Summary.

- there is a heck of a crew of very talented people in this area. Their degree of fascination ( a layman might impolitely say obsession ) with the topics is Epic Legend IMHO.
- I was pleasantly stunned and/or flummoxed by the sheer intellectual density of the ideas on offer.
- after a couple of quick/quiet queries I established that most participants did not fully ( sometimes not even vaguely ) understand the others. For some strange reason I expected that the participants would have a very high level of common 'au-fait'-ness. Silly expectation really. Numpty Mike on that one. I should refer to my own profession, where I have literally listened to two or more doctors converse, knowing full well that there was massive cognitive dissonance on display.
Nuff said. See if my neurons survive tomorrow. Like a fading dream I've thrown all this at you while I seem to hold some coherence ( dare I say balance ) on what I've seen and heard.
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
Alas I was afflicted with a gastro thingy ( in all fairness probably not due to Monash Uni student cafeteria food, but the temporal association exists ) from mid-Thursday onward and so I missed the banquet and the Friday sessions. So for Thursday morning at least I can report :
Bob Wald's talk on black holes and branes ( higher dimensional analogues of holes ). I followed the initial thermodynamic stuff. Basically as Boltzmann indicated : almost always, entropy increases in isolated systems. Thermodynamic temperature can be defined as the inverse of the rate of change of entropy with energy. Now I had a brief deja-vu moment as his initial logic was eerily similar to that of Max Planck ( deriving quantum behaviour from black-body radiation data circa 1900 ). In any case he linked the dynamic stability of black holes ( ie. will they persist as such, under the preconditions he specifies ) to their thermodynamic behaviour. This is not an unusual conclusion to draw really : if a bomb explodes then it is both dynamically and thermodynamically unstable ! Ditto for implosion.
Here's a thought. If you let a blob of water radiate ( nett ) heat away then what happens to the temperature ? It cools of course, in the sense that the average kinetic energy of the molecules reduces. Blow on hot soup to cool it etc. If you put energy in, eg. microwave your coffee, it will heat up. This is known as positive heat capacity, in that heat input and temperature track together.
If a star radiates energy away what happens ? Think carefully now. It will shrink ( eventually, think long time scale ) and the centre will get hotter. So energy loss gives a temperature rise. This is phrased as a negative heat capacity. No conservation law is breached : the star is drawing on gravitational ( potential ) energy as it contracts, with fusion merely forestalling that contraction, and neither mechanism is available to the water in your soup/coffee.
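( A toy virial-theorem sketch of my own of that sign flip, with made-up round numbers, not anything from the talk : for a self-gravitating ideal gas in virial equilibrium 2K + U = 0, so the total energy E = K + U = -K, and the temperature tracks K. Radiate energy away, E gets more negative, so K and hence T go up. )

[code]
# Toy virial-theorem sketch of negative heat capacity. Illustrative numbers only.
# Virial equilibrium for a self-gravitating ideal gas: 2K + U = 0, so E = K + U = -K.
k_B = 1.380649e-23      # J/K
N   = 1e57              # very rough particle count for a Sun-like star ( assumed )

def temperature(E_total):
    K = -E_total                        # kinetic energy from the virial theorem
    return 2.0 * K / (3.0 * N * k_B)    # monatomic ideal gas temperature

E0 = -3.0e41                            # some bound ( negative ) total energy in joules
E1 = E0 - 1.0e40                        # after radiating energy away : E is MORE negative

print(f"T before radiating : {temperature(E0):.2e} K")
print(f"T after  radiating : {temperature(E1):.2e} K   ( hotter, despite losing energy )")
[/code]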
This is very much the key issue. For stars gravity always attracts, and the tapping of fusion energies only forestalls the inevitable collapse ( of some version ). What Mr Wald has done is generalise known black hole physics in this regard to more than the currently known 3 + 1 dimensions. Not a stunning deduction, but I guess there would be concern if it were found otherwise.
In a more general scenario of Mr Wald's ( and others' ), eternal inflation pretty well makes nonsense of what we might mean by 'system', 'system boundary' and 'almost always' in thermodynamic discussion. Recall that thermodynamics arose from the study of steam engines etc. Can we throw/propel that logic from Victorian engineering to super-spatial entities ? ;-)
More later.
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
Dr Bell's topic of The Particle Physics of Dark Matter, and Beyond :
Quite interesting. Some years ago Andrei Sakharov came up with a set of general conditions that must be satisfied ( at some time in the past ) in order for there to be an excess of matter vs anti-matter in the universe. Currently we measure or produce anti-matter only very, very occasionally within our visible horizon and so an explanation for that was sought. In short the Sakharov conditions dictate ( temporary ) breaches of symmetries and an interval of time when the universe was not in equilibrium.
Dr Bell's hypothesis is that the Sakharov conditions might have applied to the production of an as yet unknown class of particles in the category known as Weakly Interacting Massive Particles ( WIMPs ). Weakly interacting here means that they interact via the weak nuclear force. Massive means two things really : (a) they have a rest mass at all, and (b) they are rather heavy in comparison with 'ordinary' matter. Some particle(s) about five times the mass of a proton/neutron would fit the slot nicely ( an estimate based on known cosmological parameters ). The factor of five comes in as the dark to normal matter ratio is approx 25% to 5% ....
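( A wee sketch of my own of where that factor of five comes from. The density parameters below are assumed, round, roughly present-day values, and the 'comparable number densities' step is a big IF, but it is the spirit of the estimate : )

[code]
# Rough illustration of the 'factor of five' WIMP mass guess. Values assumed/approximate.
Omega_dark    = 0.26     # dark matter fraction of the critical density ( approx )
Omega_baryon  = 0.05     # ordinary ( baryonic ) matter fraction ( approx )
m_nucleon_GeV = 0.94     # proton/neutron rest mass in GeV/c^2

ratio = Omega_dark / Omega_baryon
print(f"dark : normal mass density ratio ~ {ratio:.1f}")

# IF the dark and normal particles wound up with comparable number densities
# ( the asymmetry-style argument ), the dark particle mass would be about :
print(f"suggested WIMP mass ~ {ratio * m_nucleon_GeV:.1f} GeV/c^2")
[/code]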
Think of it this way. Suppose I have a machine that creates pies and anti-pies ( I like steak and kidney, but cream & custard is fine too ). If a pie and an anti-pie meet they annihilate and you get a taste explosion. What is more is that we can make (anti-)pies that can only be eaten when held in the left hand, and likewise (anti-)pies eaten with the right. Normally if I make a right handed pie then also a left handed anti-pie comes out too ie. pie-ness and handed-ness are conserved.
However each pie or anti-pie can also 'decay' into two or more other smaller/lesser pies of different varieties. So a single cream&custard pie might become a cream pie and a custard pie. Likewise an anti-steak&kidney pie may decay to an anti-steak and an anti-kidney pie.
So imagine if my pie/anti-pie machine malfunctioned. For a while it makes more anti-pies than pies. It shouldn't but it does, and by doing so breaks one of the Rules Of Pies that says the number of pies minus the number of anti-pies should be zero. Furthermore we find that the rate of pies decaying into smaller pies is slower, say, than the rate of anti-pies decaying into smaller anti-pies. Throw in that the weak force is asymmetric in terms of mirror symmetry anyway, and one could have, say, more left handed steak pies than right handed anti-steak pies. So far, though, if we turned the pie machine off things would in time settle out again if left alone.
Now we throw in the idea of ( thermodynamic ) equilibrium. Suppose we look at a cream&custard pie breaking up into a cream pie and a custard pie. Could that operation go the other way, that is if I throw a cream pie at a custard pie then might a combined cream&custard pie re-form ? Yes it could. But there is a special problem here. When I break a combined pie into two separate ones there will be a point in space and time ( an event vertex ) for that. That vertex could be anywhere/when. However if I start with a custard pie and a cream pie, wanting them to collide and form a composite, then it must be 'arranged' that they meet at all. It is not enough to have each small pie in the same universe, they must be sufficiently nearby in order to react at all.
The weak nuclear force is very local. Very early on in the universe things were close together and thus going back and forth across reactions was no trouble because of proximity. The universe expanded but without the range of the weak nuclear force increasing, and so in relative terms an opportunity arose for reactions to easily go one way - composite (anti-)pies breaking down - but rather less opportunity for the reverse reaction - small pies forming bigger ones. So this is one way of stating the 'out of equilibrium' part of the Sakharov conditions. If you like : there are vastly more ways for two particles to be separated in 3D space than there are ways for them to be in the same place. [ That's pretty well the idea of increasing entropy in a nutshell BTW ].
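( A minimal counting sketch of my own of that 'vastly more ways to be apart' remark : put two distinguishable pies on a lattice of N cells and watch how rare 'both in the same cell' becomes as N, our toy universe, grows. )

[code]
# Count configurations of two distinguishable particles in an N-cell toy 'universe'.
# The fraction of configurations with both in the same cell shrinks as N grows,
# which is the combinatorial heart of the 'out of equilibrium' argument above.
for N in (10, 1_000, 1_000_000):
    total    = N * N     # each particle independently in any of N cells
    together = N         # both in the same cell
    print(f"N = {N:>9} : fraction together = {together / total:.1e}")
[/code]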
[ side note : matter and anti-matter [strike]particles[/strike] pies only annihilate if of exactly the same type. So an anti-cream pie does not produce a taste explosion with a steak pie. Anti-cream & cream annihilate. Steak and anti-steak annihilate. ]
Back to our universe at the present day. What might we see if any of the above is relevant ? The first has been mentioned : dark matter WIMPs of around the same order of magnitude in mass as the known nucleons. Assuming a WIMP halo distribution around our galaxy, then as we speak we ought to be drifting through said cloud to some degree. Can we detect a WIMP 'wind' in the sense of a rhythmic variation in certain reaction types on a diurnal and annual basis ? We should hit more WIMPs if travelling against the wind vs going along with it. Preliminary results from a number of experiments sensitive to this aspect have indicated YES, in that while sufficient ( five sigma ) significance has not yet been reached there is an increasing significance the longer these studies have run. One slide showed around three-ish sigma. Watch that space ! :-)
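( A schematic, of my own, of the sort of rhythmic signal being hunted. The cosine form is the standard annual-modulation parameterisation ; the rate numbers are invented purely for illustration : )

[code]
import math

# Schematic annual modulation of a WIMP scattering rate. Parameters invented.
# R(t) = R0 + Rm * cos( 2*pi*(t - t_peak) / T_year )
R0, Rm = 1.00, 0.05      # mean rate and modulation amplitude ( arbitrary units )
T_year = 365.25          # days
t_peak = 152.5           # ~ early June, when Earth's orbital velocity adds most to the Sun's

for day in (0, 91, 152, 244, 335):
    rate = R0 + Rm * math.cos(2 * math.pi * (day - t_peak) / T_year)
    print(f"day {day:>3} : expected relative rate = {rate:.3f}")
[/code]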
Cheers, Mike.
( edit ) 'Sigma' significance in this context is a measure of how likely it is that random signal fluctuations might fool you into thinking a pattern is present. The higher the sigma the less likely the fooling. One is never certain really, but it is reasonable to produce a statistical measure which we may interpret as 'reliability of true signal'. For example some of our pulsar detections have come in ( after careful analysis of error ) at some 40 sigma. The sigma scale is not linear. For gravitational wave detection claims the LIGO preferred level is some 20 sigma : an incredibly strict/harsh standard of physical truth.
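( To show how savagely non-linear that sigma scale is, here is a quick sketch of my own of the one-sided chance that pure noise fluctuates that high : )

[code]
import math

# One-sided tail probability of a Gaussian : the chance that pure noise alone
# wanders at least 'n sigma' high. Note how brutally fast it falls off.
def false_alarm_probability(n_sigma):
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

for n in (1, 3, 5, 10, 20):
    print(f"{n:>2} sigma : p ~ {false_alarm_probability(n):.2e}")
[/code]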
( edit ) Side issue. The force interaction strengths, which are crucial in terms of relative reaction rates when out of equilibrium, also depend on energy scale. Or if you like they are temperature dependent. So the 'fine structure constant' ( an historical name ) sets the overall magnitude of an electromagnetic interaction ie. b/w any charged particle and a photon. The everyday value is around 1/137 but there is a trend - as disclosed over decades of particle collisions - for this to increase as interaction energies ( = temperature ) go up.

You may have heard that the various non-gravitational forces 'merge' or 'unify' at high temperature. What this means is that if we extrapolate known low energy behaviour ( the best we can produce here on Earth ) to enormously higher energy, then some optimistic curve fitting predicts that the interaction strengths for said forces become on par if not exactly equal ( much debate ). So a matter particle at such extra-ordinary energy will be rather indifferent, as it were, to whether a given event is mediated by photons ( EM ), vector bosons ( weak nuclear force ) or gluons ( strong nuclear force ). That is, 'weak' and 'strong' for instance are our low energy labels. A typical phrase used to describe the force strengths diverging from one another when the temperature goes down is 'symmetry breaking'. That is a way of saying they were ( much ) the same and now they are ( rather ) different. As a semantic point, 'symmetry' is possibly an over-used word, I think.
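( A toy sketch of my own of that trend : the one-loop QED running of the fine structure constant with ONLY the electron loop included. The real running also gets contributions from muons, taus and quarks, which is why the measured value near the Z mass is more like 1/128, but the direction of the effect is the point here. )

[code]
import math

# Toy one-loop QED running of the fine structure constant ( electron loop only ).
alpha_0 = 1 / 137.036    # everyday low-energy value
m_e_GeV = 0.000511       # electron mass

def alpha(Q_GeV):
    # Valid as a rough approximation for Q well above the electron mass.
    return alpha_0 / (1 - (2 * alpha_0 / (3 * math.pi)) * math.log(Q_GeV / m_e_GeV))

for Q in (0.01, 1.0, 91.0, 1000.0):      # GeV
    print(f"Q = {Q:>7.2f} GeV : 1/alpha ~ {1 / alpha(Q):6.1f}")
# 1/alpha creeps DOWN as the energy goes up ie. the interaction strength grows.
[/code]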
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
A Tour of Quantum Entanglement

I had forgotten to comment upon the topic "Interpreting the Mathisson-Papapetrou equations" by a Mr Norton, the description of which is somewhat dense. It amounts to an attempt to explain the basis of quantum mechanics with a more classical approach, at least for the electron. It is a variant on a model that Louis de Broglie used nearly 90 years ago. "Zitterbewegung" is German for "trembling motion". "Pilot wave" is a term of similar vintage used to describe a field guiding the movement of particles. These days a particle is the field, or a lump thereof, and so particle and field are not separate ie. quantum field theory. Specifically the thrust of Mr Norton's presentation/arguments is to establish quantum mechanics as emerging from a more 'hidden' underlying particle theme. Hmmmmm ..... one might say. Now as presented his ideas are not complete enough, as readily admitted by the author ( I think he said "not obviously wrong" while smiling ), so we'll leave the specifics of the talk be. Don't be harsh on the presenter here. QM does have its problems and so whatever might be found to resolve them is going to come from an off-beat direction. All the fairly obvious approaches have been tried.
It does revisit a general idea though : might all this QM stuff, nonsense in a classical sense with its anti-intuitive screwy logic and probabilistic character, really be an as yet-to-be-deduced-but-sensible classical scheme ? Recall that most of the original architects of quantum mechanics were frequently unhappy with their production. Not because it didn't work. It worked famously well and has since moved from strength to strength. But because it didn't gel well with their 'natural' intuition.
Now whatever underlying particle model that could be created has to cope with a very important scenario : quantum entanglement. There is now considerable experimental demonstration of this, pioneered by Alain Aspect ( French physicist ) working from analysis by John Bell ( Irish physicist ). This is a very special behaviour that currently only exists in QM with no classical analog. As it takes some preamble to set up the discussion, please bear with me. Plus the usual dragon caveat applies. :-))
How do you define a particle ? A local entity is the typical answer with well established specific 'intrinsic' features like ( rest ) mass, spin, charge, magnetic moment, other force quantum numbers etc ( say 'quarkiness' ) depending. This particle model is the basis for much success where, say, forces are viewed as particle exchanges. Each interaction vertex is a place and time, thus amenable to relativistic treatment. Any and all conservation laws, quantum transitions etc must be resolved in a per-vertex fashion. We build up a higher level description by creating a web of cause-to-effect nodes/vertices ( quantum electrodynamics and beyond ). Orderly. Explicable. Particularly : the only influences that transmit cause-to-effect are the travel of particles b/w spacetime events. Notice the not-random adjacency of the words 'particle' and 'particular' here. :-)
Probability comes in as a quantitative descriptor derived from complex numbers known as amplitudes. Indeed John von Neumann ( Hungarian born pure genius mathematician ) demonstrated early in his career the formal correspondence b/w the complex number ( Argand ) plane and QM. Which is why if you are doing QM in any more than superficial detail, you have to understand complex numbers thoroughly. You might think that 'quantitative' is an odd way to talk of probability but it is indeed quite appropriate. For a large number of identical/likewise prepared situations the aggregate numbers produced follow the simple rule :
Quote:
expected amount = probability * repetitions
... for whatever physical quantity is being measured. As per classical probability theory. The quantum rider is : often a measurement can only be one of a discrete set, somewhat like one energy level from a ladder of choices that has gaps between the rungs. A discrete spectrum with some/most values forbidden. So for non-pure initial states - a mix of several pure spectral states - the weighted mean ( called the 'expectation' ) is not going to be any specific spectral amount. Over very many trials the expectation is going to usually lie between one quantum level and the next eg.
weighted mean of [(five lots of 3.0) with (four lots of 2.5)] = [15.0 + 10]/(five lots plus four lots) = 25/9 ~ 2.78
.. which is neither 3.0 nor 2.5. There is no single measurement with the value 2.78, each is either 2.5 or 3.0 in this example. So here lies the rub. You can expect whatever QM calculates but a single measurement will ( almost ) never reflect that. Expectation, averages, probabilities as calculated or recorded numbers only approach each other as the count of repetitions goes to infinity as a limit. The reason why classical looks 'classical' is that for systems with lots of things ( repeated in time, space or instance ) that limiting process has effectively occurred. Thus we never sensed the quantum probability aspect until, historically, we could sense small numbers of things and deduce its per-case uncertainties.
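( The same arithmetic as a wee sketch of my own, to emphasise that the running average creeps toward 2.78 while every individual reading stays on one of the allowed rungs : )

[code]
import random

# The allowed 'spectral' values and their weights from the worked example above.
levels = [3.0, 2.5]
probs  = [5 / 9, 4 / 9]

expectation = sum(p * v for p, v in zip(probs, levels))
print(f"expectation = {expectation:.4f}")     # ~ 2.7778, never itself an allowed reading

# Simulate many single measurements : each one is 2.5 or 3.0, never 2.78,
# yet the running average of them drifts toward the expectation.
random.seed(1)
for n in (10, 1_000, 100_000):
    samples = random.choices(levels, weights=probs, k=n)
    print(f"n = {n:>7} : mean of single readings = {sum(samples) / n:.4f}")
[/code]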
Which brings us of course to Mr Heisenberg and his famous principle, which has two related aspects :
- for a single small entity under measurement you have to upset it and hence change the future course of events because you 'looked'. The harder you look the more you change the future of that particle. You can determine a particle's energy, say, to any degree of accuracy you prefer but limited by your exhaustion in waiting for the answer. Or you can localise a particle to some place arbitrarily exactly but thereafter have no idea where it went.
- for groups of particles ie. repetitions of instance, the group statistics follow the inverse character too. So if I get a narrower variance ( the square of the standard deviation ) b/w measurements of one type there will be a corresponding wider variance of what is called the conjugate variable.
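( A sketch of my own of that inverse character, using the textbook Gaussian wave packet, for which the two spreads multiply out to exactly the Heisenberg floor : )

[code]
# For a Gaussian wave packet of width sigma the standard textbook spreads are
#   delta_x = sigma   and   delta_p = hbar / (2 * sigma),
# so squeezing one inflates the other, with the product pinned at hbar/2.
hbar = 1.054571817e-34   # J s

for sigma in (1e-9, 1e-10, 1e-11):        # position spreads in metres
    delta_x = sigma
    delta_p = hbar / (2 * sigma)
    print(f"delta_x = {delta_x:.0e} m , delta_p = {delta_p:.2e} kg m/s , "
          f"product = {delta_x * delta_p:.2e} ( = hbar/2 )")
[/code]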
These distinctions between expectation and probability, between individual examples and group trends, are crucial for the entanglement discussion. Thus endeth the preamble.
Albert Einstein, Boris Podolsky and Nathan Rosen (EPR) theoretically analysed quantum scenarios with a view to finding flaw. Which they apparently did, but it was later realised they found something very deep and true about QM. In software terms you could say what was first thought of as a 'bug' was actually a 'feature'. :-)
Take an interaction vertex, for discussion we will use an electron colliding with a positron, but there is generality represented here. You get two photons of exactly the same energy firing off in exactly opposite directions ( at least if considered in what is called the centre-of-mass reference frame ). Each photon has a property called spin that has a 'polarity' behaviour. If I put a 'polarity analyser' in its way I can establish whether it is 'up' or 'down'. The technical detail is not important here, the fact that I get one of only two possible answers from measurement is important.
But to be necessarily excruciating my polarity analyser has a third result : nothing is recorded. If you like non-measurement is a possible measurement outcome. Think carefully now. How do I distinguish as an experimenter b/w these two processes, say during some five second interval :
- a photon went by but my machine did not respond.
- no photon went by.
... as the machine does not respond in either case. So the answer by logical positivism is : you can't, do not and won't. :-)
Now for our two fleeing photons ( like bank robbers splitting up and departing the crime scene ) then to conserve spin one must have the opposite of the other's. If one is 'up' the other is 'down', where the words 'up' and 'down' however refer to a specific axis of measurement ( perpendicular to the photon path ) determined by measuring devices. Observation reveals the opposite spin sense for each in a pair.
Big deal you say. No mystery at all. At the annihilation vertex a decision was made as to who would be 'up' and who would be 'down'. And randomly so, in that over many measurements ( NB repetition ) along any given line passing through the vertex roughly equal numbers come out 'up' as 'down', provided that you watch for long enough. So one could think that our robbers tossed a coin at the bank to decide who would be what spin. We just found out later. This explanation is called a 'hidden variable' formulation : some yet-to-be-specified mechanism locally determined the later, separated measurements.
Now EPR said that could not be the case as per the quantum formalism of the time. The production of those two photons is a special case in the QM maths : only when one photon is measured is the spin determined and thus only then can the other's be too. The mathematical expressions cannot be factored to represent each photon individually. There is an expression that necessarily has variables for both photons locked in it. This hit the theorists like a train. Bohr was frantic. Pauli was highly annoyed. Dirac was befuddled. But no fault in the EPR logic could be found. IF one accepted QM's formalism THEN this entanglement must happen also.
For those that measure these photons ( produced by this annihilation route ) it means that the choice of polarimeter axis at one end determines the possible response of the other polarimeter to the other photon ! What is even more amazing is that one can set up the scenario to choose the polarimeter measurement axes after the photons have left the vertex ( after the robbers left the bank ). Delicate timing indicates - to a high degree of confidence - that any transit of the effect of axis choice at one end must propagate ( if such a word is the right one to use, see later discussion ) many times faster than light speed. I've seen one group quote 20 times light speed as the minimum rate per their degree of timing. All these behaviours are deduced as occurring long after all the shouting dies down, as it were. The 'late' choice ( mid photon flight ) of polarimeter axis can even be determined stochastically : by some other apparently unrelated method like the temperature fluctuations of a cup of tea. Only in retrospect can one say 'the speed of the effect is at least such-and-such'.
Experimentally one cannot say that a given specific photon pair is 'cheating' at the time, so to speak, via entanglement ( the issue is partly the ambiguity of polarimeter non-response as above ). So no faster-than-light signal method is available here. The analysis is always post-hoc and only group statistics divulge the entanglement aspect. A fairly simply derived inequality ( Bell's ) gives the limit on group behaviours on the assumption of hidden variables ( those invisible gears yet to be discovered ). In one setup, for the hidden variable picture to survive the experimental result, 1/4 would have to be greater than 1/3 - an obvious problem.
Suck it up. Entanglement happens. Particles under QM behave non-locally. Two photons may behave as one ( ie. without intervening spacetime placing a communication speed limit between them ).
Digest this if you have nought else about. More after Christmas.
Cheers, Mike.
( edit ) Zitterbewegung as described by Mr Norton brings to my mind the word "whirlygig". The electron ( a point-like entity with dimension zero ) follows a very tight constant radius helix at a given momentum. The axis of the helix is what we would label as 'the path of the electron' and its momentary centre of orbit is 'the position of the electron'. This readily yields a 'phase' aspect needed in QM ie. at what part of the orbital arc is the electron ? This opens up the possibility of phase differences & contrasts etc needed for group behaviours like diffraction and interference.
Also if you sit at one polarisation analyser looking back toward the annihilation vertex and on towards the other analyser, then a rotation angle can be defined showing how the 'up' axes of the polarimeters compare. This mutual angle can vary smoothly from zero right around to a full circle. There are hidden variable models aplenty that agree with experiment for aligned ( zero degrees ), crossed ( 90 degrees ) and anti-parallel ( 180 degrees ) mutual angles. The entanglement proof comes from multiple measurements at intermediate angles. To be correct a certain number must be both :
- no greater than 0.25, and also
- no less than about 0.33 ( ie. the 1/3 mentioned earlier )
If you like this is the 'paradoxical' EPR. Which is the experimental signal that hidden variables can't cut it. I reiterate : QM does not force any photon to respond in a specific way to the measurement. Probability still rules. What QM implies is a special relationship between the two photon measurements as described, such that in effect their spin senses are instantly determined despite any amount of separation. For all we know, two such photons could travel light years from a matter/anti-matter event and still bear this character .....
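( A sketch of my own of why the intermediate angles are where hidden variables fall over. I use the spin-1/2 singlet version for simplicity - the photon pair case is analogous - and compare the quantum correlation with one simple textbook 'coin tossed at the bank' hidden variable model. They agree at 0, 90 and 180 degrees and part company everywhere in between : )

[code]
import math

# Quantum singlet correlation vs a simple local hidden-variable model in which
# each pair carries a random, pre-agreed spin axis fixed at the source.
def E_quantum(theta):
    return -math.cos(theta)

def E_hidden_variable(theta):
    # Classic textbook result for the 'random pre-set axis' model : linear in angle.
    return -1 + 2 * theta / math.pi

for deg in (0, 30, 45, 60, 90, 135, 180):
    t = math.radians(deg)
    print(f"{deg:>3} deg : QM = {E_quantum(t):+.3f}   hidden variable = {E_hidden_variable(t):+.3f}")
[/code]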
Maybe this is how Santa can do so much on a single night ? :-0
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
OK, quick replay : we have two particles emerging from a common interaction/event vertex, and by satisfying some conservation law they must have some correlated properties. It was spin angular momentum in the last post but could have been momentum or energy for example. Knowledge of the value for one particle allows deduction of the other. However the choice of measurement mode for one particle restricts measurement options for the other : regardless of their separation and light travel time b/w the two instruments. As far as we can measure to date this is effectively instantaneous, or more precisely no finite speed for this 'entanglement' effect has yet been pinned down, only lower limits. Even more, we can only deduce that aspect by very many measurements and thus it becomes a statistical comment on replicated cases. The rubber meets the road here via absolute inequalities that cannot and do not allow overlap, requiring sufficiently good technique ( basically best isolation of the system under study and high quality detectors ) to lay that bare eg. 1/4 is not greater than 1/3. Because it is not a comment about individual particles, this rules out any faster-than-light signalling system. Both cruel and clever for Nature to be that way. See Catch-22 ..... :-)
Now clever punters might note that while we have phrased the situation as entangled particles, might it be equivalent to say entangled measuring equipment? This idea is born from the fact that if not for the equipment then we would have no understanding at all. What is thus being super-luminally 'transmitted' as it were, is the allowable measurable margin(s) of some quantity. As mentioned this is only for conjugate quantities that would otherwise be related by a Heisenberg inequality eg.
delta(x) * delta(px) >= hbar / 2

.... here x and px are the position and momentum along the same line axis, the delta's indicating the population spread ( standard deviation ) over very many examples. The upshot here is that if, say, a momentum measuring device is so good as to really narrow down the value over repeated instances of measurement THEN the position measuring device is going to be inversely poorer in its spread for those same repeated instances*. So I could have also written :

delta(x) >= hbar / ( 2 * delta(px) )

delta(px) >= hbar / ( 2 * delta(x) )

All forms are both equivalent and allowable as none of the terms are ever zero, so specifically they can be the denominator. Can it be the case then that the entanglement 'conspiracy' is a function of all players : the particles, the gadgets and for that matter the humans running the tests ?
The last is the easiest to rule out. Just automate the apparatus. Go fishing for a weekend or three and read the summary on return. I guess it would be an awfully long, long throw to connect experimental results to hobbies. It might be true of course. But how far do you want your paranoia to go ? I say paranoia because if one reads the literature or discussion on this - amongst the relevant practitioners - the topic is somewhat anthropomorphic. So you might read statements which imbue intent from non-organic things. This is not literally meant. What is being done is design of tests to exclude criticism of outcomes which might be apparently 'subconsciously manipulated'. I know that sounds really odd when I put it like that but I suppose that is the level of difficulty of the topic. Nuff said ...
That leaves the particle/gadget duo then. This is a more natural thought. What is a measuring device, in this context, but a way of magnifying quantum scale events to human scale evidence ? That can be seen objectively as a sequence of event multiplications. Suppose a photon hits a semiconductor crystal in a photodetector. That promotes one or a few electrons to a higher energy state in the lattice. Because of the way the lattice was prepared ( manufacture with controlled impurity fractions, bias voltages, high & low resistance pathways etc ) an avalanche is triggered. You did not directly notice the very last snowflake that fell on the mountain side, but by heck you can see tonnes of snow coming your way ! That sort of thing.
So the purest way to consider an entanglement scenario is the original particle production vertex plus two other distinct spacetime separated events that we deem as measurement instances. In the parlance of special relativity some of those events have a 'spacelike interval' b/w them. A 'timelike interval' is where one event lies in the light cone of the other and so we allow a paradigm labelled as cause and effect. Hence a spacelike interval means that cause-and-effect does not apply. The word acausal meaning 'without cause' is also used.
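( A minimal sketch of my own of that classification : compute the invariant interval b/w two events and label it. )

[code]
# Classify the separation of two events ( dt in seconds, dx in metres, 1D space for brevity ).
# s^2 = (c*dt)^2 - dx^2 : positive => timelike, negative => spacelike, zero => lightlike.
c = 299_792_458.0    # m/s

def classify(dt, dx):
    s2 = (c * dt) ** 2 - dx ** 2
    if s2 > 0:
        return "timelike  ( one event inside the other's light cone : cause/effect allowed )"
    if s2 < 0:
        return "spacelike ( acausal : no signal can connect them )"
    return "lightlike ( connected only at exactly light speed )"

print(classify(dt=1.0, dx=1.0e8))    # light easily covers 1e8 m in one second
print(classify(dt=1.0, dx=4.0e8))    # too far for light in one second
print(classify(dt=1.0, dx=c))        # sitting exactly on the light cone
[/code]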
I have an example in mind to share more with you. I need to make a few diagrams though.
Cheers, Mike.
* I explain here to such excruciation out of necessity. You may not have met the full horror of Heisenberg's thinking/analysis beyond glib summary.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
Alas no simpler example suffices. That is the way of it. Hence the conundrum.
Try this visualisation : a flat piece of paper, the surface of which represents a physical 2D space in which we will discuss the matter. Its linear units are of length. In the centre is the vertex that produces an entangled photon pair, and the photons hence move away, each along a line from that point. Conservation laws dictate that those trajectories must be in opposite directions. Let the ( otherwise identical ) photons, for example, be detected some distance ( not necessarily equal ) from the vertex that generated them.

At the vertex make a linear fold in the paper, perpendicular to the common line upon which the photons have travelled. We will fold the space upon itself and in doing so the trajectories will come to lie next to each other along their entire length. If the paper is translucent/transparent then after folding you can clearly see that ( sorry about the strobe from the fluorescent tube light box in the original photos ).

If we started a clock at the instant of particle production then each photon, along each of its lines, would match the other distance for distance from that production vertex. In other words, in the folded format there would be no intervening 'distance' between them as they move. So an event ( eg. what we denote as 'detection' ) that determines whatever feature we deem as entangled ( spin, polarity, whatever ) is resolved in a singular fashion ( here enacted by stabbing the folded paper with a needle ).
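( The origami rendered as a wee sketch of my own : the fold identifies each point x with -x across the crease at the vertex, and the two photons then sit on top of one another at every tick of the clock. )

[code]
# The 'fold' identifies each point x on the paper with -x ( crease at the vertex, x = 0 ).
# Photon A flies toward +x, photon B toward -x : after folding they coincide at all times.
def folded(x):
    return abs(x)    # identify x with -x

c = 1.0              # work in units where light speed = 1
for t in (0.0, 1.0, 2.5, 7.0):
    x_A, x_B = +c * t, -c * t
    print(f"t = {t:4.1f} : A at {x_A:+5.1f} , B at {x_B:+5.1f} , "
          f"folded separation = {abs(folded(x_A) - folded(x_B)):.1f}")
[/code]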
Hmmmmmm. So the cheat here is that I have got around the 'spooky action at a distance' issue by a geometric trick.
Now this is just a spot of origami. Not a theory. Not a breakthrough. Nothing to generalise without contradiction. Nothing to get excited about. Basically a visual trick to remember/highlight the key sense of the problem :
IF you want to keep cause & effect as a reliable descriptor of reality when we model it ( with light speed the maximum rate of transfer of that ) THEN entanglement implies that we must invoke other dimension(s).
In this case the other dimension - so we are now 3D - is the one I folded the paper within. Well you could perhaps squirrel out of the concept by saying that you are identifying ( decreeing to be equal ) the corresponding points on the paper, but that is logically equivalent to the fold as described. Said extra 'dimensions' need not be either length or time or even have the character of any currently known physical unit type. It just has to serve to give the appearance in our reality of the entanglement behaviour.
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
This explanation of entanglement requires that there be at least one extra dimension. Oh dear! I perceive a slippery slope leading to strings.
Aarrghh ! Yes we don't want to do that ! We'll wind up mud wrestling at the bottom. :-)
Fortunately said extra dimension(s) don't have to be 'physical' like we are used to. Dimension can be another way of saying 'an extra degree of freedom'. A good example is quantum mechanical phase, which is just a number without known physical units. Or if you like it is measured in its own unique units, but with the cyclic property of modulo arithmetic. So to visualise geometrically, phase is a dimension closed in upon itself, a wee circle*. You can't measure any QM phase value directly but phase differences may be disclosed by suitable apparatus.
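( A small sketch of my own illustrating both points : in a two-path interference the absolute phase drops out of anything measurable, only the difference survives, and adding a full turn of 2*pi changes nothing. )

[code]
import cmath, math

# Two-path interference : intensity |e^{i*phi1} + e^{i*phi2}|^2 = 2 + 2*cos(phi1 - phi2).
# Only the phase DIFFERENCE shows up, and it is periodic ( modulo 2*pi ).
def intensity(phi1, phi2):
    return abs(cmath.exp(1j * phi1) + cmath.exp(1j * phi2)) ** 2

print(intensity(0.3, 0.3 + math.pi))                  # difference of pi -> 0 ( dark fringe )
print(intensity(5.0, 5.0 + math.pi))                  # same difference, shifted absolute phase -> still 0
print(intensity(0.3, 0.3 + math.pi + 2 * math.pi))    # add a full turn -> unchanged
print(intensity(1.0, 1.0))                            # zero difference -> 4 ( bright fringe )
[/code]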
The interesting stuff comes from when different dimension types are related. Speed is a good example. It is distance divided by time. For light that is a constant regardless and so that fixed ratio determines much of the characteristics of our experience of this universe. A speed value is input into kinetic energy and thus enters a set of calculable quantities that as a group ( in some physical scenario ) preserve their total amount ie. conservation of energy.
{ Of course you can link distance, time and phase all in the one. This is the bargain basement as it were, because all of the QM relations blossom from this once you include SR with mass/energy equivalence. }
It has been shown that Beryllium ions show correlation/entanglement. This is significant as one has negligible chance of non-detection by instruments here ie. a Beryllium ion will plow into and be stopped by a lattice no problem. Not like a photon polarimeter say where a non-trivial fraction will sail right through without a blip. So that really narrows down the statistics and hence the gap b/w measurement and prediction via Bell's Inequality ( that assumes locality or hidden variables ) is wide. I think it was the NIST people in Boulder, Colorado that did that in the last decade.
In explaining this I think we are deep down stuffed. We are within the thing we try to describe and can't pop out for an 'external' view.
Cheers, Mike.
* A neat twist ( pardon the pun ) on this type of dimension attribution to quantum properties is spin. A spin one-half particle has to be rotated twice to return to its original phase. If you take neutrons and handle them via a magnetic field then this can be demonstrated by the resulting interference of rotated vs. unrotated particles { to be exact, the 'spin' and 'rotate' words here don't have the easier classical meaning }. Anyway a Fermion's phase loop has more the characteristics of a Mobius strip than a simple circle.
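( A sketch of my own of that 'twice around to get home' property, using the standard 2x2 rotation operator for a spin one-half state about the z axis : )

[code]
import cmath, math

# Spin-1/2 rotation about z by angle theta : R(theta) = diag( e^{-i*theta/2} , e^{+i*theta/2} ).
# One full 360 degree turn multiplies the spinor by -1 ; only 720 degrees restores it exactly.
def rotate_spinor(spinor, theta):
    a, b = spinor
    return (cmath.exp(-1j * theta / 2) * a, cmath.exp(+1j * theta / 2) * b)

up = (1 + 0j, 0 + 0j)     # 'spin up' state
for turns in (1, 2):
    a, b = rotate_spinor(up, turns * 2 * math.pi)
    print(f"{turns} full turn(s) : amplitude on 'up' = {a.real:+.3f}{a.imag:+.3f}j")
# 1 turn  -> -1 : same physical state but opposite phase, visible via interference.
# 2 turns -> +1 : truly back to the start. Hence the Mobius strip flavour above.
[/code]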
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
Paul Laske described the
)
Paul Laske described the pulsar timing arrays in operation for ~ 10+ years now. The idea is to use the ( extremely subtle ) shift in signal arrival times to deduce the local passage of a very long wavelength gravitational wave. The expected natural frequency of such things is about a full but single wave cycle per decade or so ! :-0
Wow ! So that is well away from the frequency range we have ever discussed here at E@H.
Colin MacLaurin spoke of a ( IMHO ) quirky construct : the relativistic analysis of a hypothetical cable connecting two galaxies. How would that behave given that the universe is expanding ? This may seem a dumb question on the face of it, but it tests how one can relate GR/SR concepts into simpler classical terms. What I mean here is that if one is going to use GR to claim that gravity is not a force but a spacetime deformation, then one ought be able to run that backwards to transition from an esoteric 4D geometric evaluation to yield a good old who pushed/pulled upon who.
Last and certainly not least I will mention the ideas of a Russian GR theory team via Oleg Tsupko. Take the scenario as demonstrated in 1919 : the deflection of light rays as they graze past the Sun, which happens every day of course but is hard to show in the absence of the Moon masking the glare. The model may then proceed to greater complexity given that virtually all ( main sequence ) stars spent most of their multi-billion year lifetimes sitting in the 'steam' that they as a 'nuclear kettle' produce. This steam is plasma, the corona being a good example. Our Sun's corona extends over several photosphere diameters and is quite hot, about several million Kelvin ( compare with ~ 6000K for the visible spectrum surface ). I got lost in detail mid-talk but I rallied at the end ( mid-afternoon brain fade ) to get the gist. Their models are eminently testable if anyone cares to examine either data sets obtained to date ( gravitational lensing of distant galaxies ) and/or produce a study with their particular models ( a few variants ) in mind ie. do they work or not ?
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
OK. Day One Summary. -
)
OK. Day One Summary.
- there are a heck of a crew of very talented people in this area. Their degree of fascination ( a laymen might impolitely say obsession ) with the topics is Epic Legend IMHO.
- I was pleasantly stunned and/or flummoxed by the sheer intellectual density of the ideas on offer.
- after a couple of quick/quiet queries I established that most participants did not fully ( sometimes not even vaguely ) understand the others. For some strange reason I expected that the participants would have a very high level of common 'au-fait'-ness. Silly expectation really. Numpty Mike on that one. I should refer to my own profession where I have literally listened to two or more doctors relate, knowing full well that there a massive cognitive dissonance on display.
Nuff said. See if my neurons survive tomorrow. Like a fading dream I've thrown all this at you while I seem to hold some coherence ( dare I say balance ) on what I've seen and heard.
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
Alas I was afflicted with a
)
Alas I was afflicted with a gastro thingy ( in all fairness probably not due to Monash Uni student cafeteria food, but the temporal association exists ) from mid-Thursday onward and so I missed the banquet and the Friday sessions. So for Thursday morning at least I can report :
Bob Wald's talk on black holes and branes ( higher dimensional analogues of holes ). I followed the initial thermodynamic stuff. Basically as Boltzmann indicated : almost always entropy increases in isolated systems. Thermodynamic temperature can be defined as the inverse of the rate of change of entropy with energy. Now I had a brief deja-vu moment as his initial logic was eerily familiar to that of Max Planck ( deriving quantum behaviour from black-body radiation data circa 1900 ). In any case he linked the dynamic stability of black holes ie. will they persist as such, in the preconditions he specifies, to thermal behaviour. This is not an unusual conclusion to draw really : if a bomb explodes then it is both dynamically and thermodynamically unstable ! Ditto for implosion.
Here's a thought. If you let a blob of water radiate ( nett ) heat away then what happens to the temperature ? It cools of course in the sense of the average kinetic energy of the molecules reduces. Blow on hot soup to cool it etc. If you put energy in eg. microwave your coffee, it will heat up. This is known as positive heat capacity in that heat input and temperature track together.
If a star radiates energy away what happens ? Think carefully now. It will shrink ( eventually, think long time scale ) and the centre will get hotter. So energy loss gives a temperature rise. This is phrased as a negative heat capacity. No conservation law is breached because the Sun is utilising nuclear ( potential ) energies via fusion to do this, which is a mechanism not available to the water in your soup/coffee.
This is very much the key issue. For stars gravity always attracts and the tapping of fusion energies only forestalls the inevitable collapse ( of some version ). What Mr Wald has done is generalise known black hole physics in this regard to more than the currently known 3 + 1 dimensions. Not an stunning deduction but I guess there would be concern if found otherwise.
In a more general scenario of Mr Wald's ( and others ) eternal inflation pretty well makes nonsense of what we might mean by 'system', 'system boundary' and 'almost always' in thermodynamic discussion. Recall that thermodynamics arose from the study of steam engines etc. Can we throw/propel that logic from Victorian engineering to super-spatial entities ? ;-)
More later.
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
Dr Bell's topic of The
)
Dr Bell's topic of The Particle Physics of Dark Matter, and Beyond :
Quite interesting. Some years ago Andrei Sakharov came up with a set of general conditions that must be satisfied ( at some time in the past ) in order for there to be an excess of matter vs anti-matter in the universe. Currently we measure or produce anti-matter only very very occasionally within our visible horizon and so an explanation for that was sought. In short the Sakharov conditions dictate ( temporary ) breaches of symmetries and an interval of time when when the universe was not in equilibrium.
Dr Bell's hypothesis is that the Sakharov conditions might have applied to the production of an as yet unknown class of particles in the category known as Weakly Interacting Massive Particles ( WIMPs ). Weakly interacting here means that they interact via the weak nuclear force. Massive means two things really (a) they have a rest mass at all and (b) they are rather heavy in comparison with 'ordinary' matter. Some particle(s) about five times the mass of a proton/neutron would fit the slot nicely ( an estimate based on known cosmological parameters ). The factor of five comes in as the dark to normal matter ratio is approx 25% to 5% ....
Think of it this way. Suppose I have a machine that creates pies and anti-pies ( I like steak and kidney, but cream & custard is fine too ). If a pie and an anti-pie meet they annihilate and you get a taste explosion. What is more is that we can make (anti-)pies that can only be eaten when held in the left hand, and likewise (anti-)pies eaten with the right. Normally if I make a right handed pie then also a left handed anti-pie comes out too ie. pie-ness and handed-ness are conserved.
However each pie or anti-pie can also 'decay' into two or more other smaller/lesser pies of different varieties. So a single cream&custard pie might become a cream pie and a custard pie. Likewise an anti-steak&kidney pie may decay to an anti-steak and an anti-kidney pie.
So imagine if my pie/anti-pie machine malfunctioned. For a while it makes more anti-pies than pies. It shouldn't but it does, and by doing so breaks one of the Rules Of Pies that says the number of pies minus the number of anti-pies should be zero. Furthermore we find that the rate of pies decaying into smaller pies is slower say than the rate of anti-pies decaying into smaller anti-pies. Throw in that the weak force is asymmetric in terms of mirror symmetry anyway then one could have, say, more left handed steak pies than right handed anti-steak pies. So far what would happen when we turn the pie machine off is that things would in time settle out if left alone.
Now we throw in the idea of ( thermodynamic ) equilibrium. Suppose we look at a cream&custard pie breaking up into a cream pie and a custard pie. Could that operation go the other way, that is if I throw a cream pie at a custard pie then might a combined cream&custard pie re-form ? Yes it could. But there is a special problem here. When I break a combined pie into two separate ones there will be a point in space and time ( an event vertex ) for that. That vertex could be anywhere/when. However if I start with a custard pie and a cream pie, wanting them to collide and form a composite, then it must be 'arranged' that they meet at all. It is not enough to have each small pie in the same universe, they must be sufficiently nearby in order to react at all.
The weak nuclear force is very local. Very early on in the universe things were close together and thus going back and forth across reactions was no trouble because of proximity. The universe expanded but without the range of the weak nuclear force increasing and so in relative terms an opportunity arose for reactions to easily go one way - composite (anti-)pies breaking down - but rather less opportunity for the reverse reaction - small pies forming bigger ones. So this is one way of stating the 'out of equilibrium' part of the Sakharov conditions. If you like : there are vastly more ways of two particles to be separated in 3D space than there are ways for them to be in the same place. [ That's pretty well the idea of increasing entropy in a nutshell BTW ].
[ side note : matter and anti-matter [strike]particles[/strike] pies only annihilate if of exactly the same type. So an anti-cream pie does not produce a taste explosion with a steak pie. Anti-cream & cream annihilate. Steak and anti-steak annihilate. ]
Back to our universe at present day. What might we see if any of the above is relevant ? The first has been mentioned : dark matter WIMPs of around the same order of mass magnitude of normal known nucleons. Assuming a WIMP halo distribution around our galaxy then as we speak we ought be drifting through said cloud to some degree. Can we detect a WIMP 'wind' in the sense of a rhythmic variation in certain reaction types on a diurnal and annual basis ? We should hit more WIMPS if travelling against the wind vs going along with it. Preliminary results from a number of experiments sensitive to this aspect have indicated YES, in that while sufficient ( five sigma ) significance has not yet been reached there is an increasing significance the longer these studies have run. One slide showed around three-ish sigma. Watch that space ! :-)
Cheers, Mike.
( edit ) 'Sigma' significance in this context is a measure of how likely it is that random signal fluctuations might fool you into thinking it has a pattern. The higher the sigma the less likely the fooling. One is never certain really but it is reasonable to produce a statistical measure which we may interpret as 'reliability of true signal'. For example some of our pulsar detections have come in ( after careful analysis of error ) at some 40 sigma. The sigma scale is not linear. For gravitational wave detection claims the LIGO preferred level is some 20 sigma : an incredibly strict/harsh standard of physical truth.
( edit ) Side issue. The force interaction strengths which are crucial in terms of relative reaction rates when out of equilibrium also depend on energy scale. Or if you like are temperature dependent. So the 'fine structure constant' (an historical name ) sets the overall magnitude of an electromagnetic interaction ie. b/w any charged particle and a photon. The everyday value is around 1/137 but there is a trend - as disclosed over decades of particle collisions - for this to increase as interaction energies ( = temperature ) goes up. You may have heard that the various non-gravitational forces 'merge' or 'unify' at high temperature. What this means that if we extrapolate known low energy behaviour ( the best we can produce here on Earth ) to enormously higher energy then some optimistic curve fitting predicts that the interaction strength for said forces become on par if not exactly equal ( much debate ). So a matter particle at such extra-ordinary energy will be rather indifferent, as it were, to whether a given event is mediated by photons ( EM ), vector bosons ( weak nuclear force ) or gluons ( strong nuclear force ). That is 'weak' and 'strong' for instance are our low energy labels. A typical phrase used to describe the force strengths diverging from one another when the temperature goes down is 'symmetry breaking'. That is a way of saying they were ( much ) the same and now they are ( rather ) different. As a semantic point it is possible that symmetry is possibly an over-used word I think.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
A Tour of Quantum
)
A Tour of Quantum Entanglement
I had forgotten to comment upon the topic "Interpreting the Mathisson-Papapetrou equations" by a Mr Norton, the description of which is somewhat dense. It amounts to an attempt to explain the basis of quantum mechanics with a more classical approach, at least for the electron. It is a variant on a model that Louis de Broglie used nearly 90 years ago. "Zitterbewegung" is a German word for "trembling motion". "Pilot wave" is a term of similar vintage used to describe a field guiding the movement of particles. These days a particle is the field or a lump thereof and so particle and field are not separate ie. quantum field theory. Specifically the thrust of Mr Norton's presentation/arguments is to establish quantum mechanics as emerging from a more 'hidden' underlying particle theme. Hmmmmm ..... one might say. Now as presented his ideas are not complete enough, as readily admitted by the author ( I think he said "not obviously wrong" while smiling ), so we'll leave the specifics of the talk be. Don't be harsh on the presenter here. QM does have its problems and so whatever might be found to resolve them is going to come from an off-beat direction. All the fairly obvious approaches have been tried.
It does revisit a general idea though : might all this QM stuff, nonsense in a classical sense with its anti-intuitive screwy logic and probabilistic character, really be an as yet-to-be-deduced-but-sensible classical scheme ? Recall that most of the original architects of quantum mechanics were frequently unhappy with their production. Not because it didn't work. It worked famously well and has since moved from strength to strength. But because it didn't gel well with their 'natural' intuition.
Now whatever underlying particle model that could be created has to cope with a very important scenario : quantum entanglement. There is now considerable experimental demonstration of this pioneered by Alain Aspect ( French physicist ) working from analysis by John Bell ( Irish physicist ). This is a very special behaviour that currently only exists in QM with no classical analog. As it takes some discussion to prepare the discussion then please bear with me. Plus the usual dragon caveat applies. :-))
How do you define a particle ? A local entity is the typical answer with well established specific 'intrinsic' features like ( rest ) mass, spin, charge, magnetic moment, other force quantum numbers etc ( say 'quarkiness' ) depending. This particle model is the basis for much success where, say, forces are viewed as particle exchanges. Each interaction vertex is a place and time, thus amenable to relativistic treatment. Any and all conservation laws, quantum transitions etc must be resolved in a per-vertex fashion. We build up a higher level description by creating a web of cause-to-effect nodes/vertices ( quantum electrodynamics and beyond ). Orderly. Explicable. Particularly : the only influences that transmit cause-to-effect are the travel of particles b/w spacetime events. Notice the not-random adjacency of the words 'particle' and 'particular' here. :-)
Probability comes in as a quantitative descriptor derived from complex numbers known as amplitudes. Indeed John von Neumann ( Hungarian born pure genius mathematician ) demonstrated early in his career the formal correspondence b/w the complex number ( Argand ) plane and QM. Which is why if you are doing QM in any more than superficial detail, you have to understand complex numbers thoroughly. You might think that 'quantitative' is an odd way to talk of probability but it is indeed quite appropriate. For a large number of identical/likewise prepared situations the aggregate numbers produced follow the simple rule :
... for whatever physical quantity is being measured. As per classical probability theory. The quantum rider is : often a measurement can only be one of a discrete set, somewhat like one energy level from a ladder of choices that has gaps between the rungs. A discrete spectrum with some/most values forbidden. So for non-pure initial states - a mix of several pure spectral states - the weighted mean ( called the 'expectation' ) is not going to be any specific spectral amount. Over very many trials the expectation is going to usually lie between one quantum level and the next eg.
weighted mean of [(five lots of 3.0) with (four lots of 2.5)] = [15.0 + 10]/(five lots plus four lots) = 25/9 ~ 2.78
.. which is neither 3.0 nor 2.5. There is no single measurement with the value 2.78, each is either 2.5 or 3.0 in this example. So here lies the rub. You can expect whatever QM calculates but a single measurement will ( almost ) never reflect that. Expectation, averages, probabilities as calculated or recorded numbers only approach each other as the count of repetitions goes to infinity as a limit. The reason why classical looks 'classical' is that for systems with lots of things ( repeated in time, space or instance ) that limiting process has effectively occurred. Thus we never sensed the quantum probability aspect until, historically, we could sense small numbers of things and deduce its per-case uncertainties.
Which brings us of course to Mr Heisenberg and his famous principle, which has two related aspects :
- for a single small entity under measurement you have to upset it and hence change the future course of events because you 'looked'. The harder you look the more you change the future of that particle. You can determine a particle's energy, say, to any degree of accuracy you prefer but limited by your exhaustion in waiting for the answer. Or you can localise a particle to some place arbitrarily exactly but thereafter have no idea where it went.
- for groups of particles ie. repetitions of instance, the group statistics follow the inverse character too. So if I get a narrower variance ( the square of the standard deviation ) across measurements of one type, there will be a correspondingly wider variance in what is called the conjugate variable.
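For the numerically inclined, here is a small sketch ( Python with numpy, natural units where hbar = 1, a plain Gaussian wave packet chosen purely for illustration ) showing that squeezing the position spread fattens the momentum spread, with the product pinned near hbar/2 :

import numpy as np

HBAR = 1.0   # natural units

def spreads(sigma, N=4096, L=200.0):
    """Position and momentum standard deviations of a Gaussian wave packet."""
    x = np.linspace(-L/2, L/2, N, endpoint=False)
    dx = x[1] - x[0]
    psi = np.exp(-x**2 / (4 * sigma**2))            # Gaussian wavefunction
    psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)     # normalise

    prob_x = np.abs(psi)**2
    delta_x = np.sqrt(np.sum(prob_x * x**2) * dx)   # <x> = 0 by symmetry

    # Momentum-space picture via FFT ; p = hbar * k
    psi_p = np.fft.fftshift(np.fft.fft(psi))
    p = HBAR * 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
    dp = p[1] - p[0]
    prob_p = np.abs(psi_p)**2
    prob_p /= np.sum(prob_p) * dp                   # normalise
    delta_p = np.sqrt(np.sum(prob_p * p**2) * dp)   # <p> = 0 by symmetry
    return delta_x, delta_p

for sigma in (0.5, 1.0, 2.0):
    sx, sp = spreads(sigma)
    print(f"sigma = {sigma} :  delta_x = {sx:.3f}  delta_px = {sp:.3f}  product = {sx*sp:.3f}  ( hbar/2 = 0.5 )")

Narrow the packet ( small sigma ) and delta_px balloons ; widen it and delta_px shrinks, but the product never dips below 0.5.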
These distinctions between expectation and probability, between individual examples and group trends, are crucial for the entanglement discussion. Thus endeth the preamble.
Albert Einstein, Boris Podolsky and Nathan Rosen ( EPR ) theoretically analysed quantum scenarios with a view to finding a flaw. Which they apparently did, but it was later realised they had found something very deep and true about QM. In software terms you could say what was first thought of as a 'bug' was actually a 'feature'. :-)
Take an interaction vertex ; for discussion we will use an electron colliding with a positron, but there is generality represented here. You get two photons of exactly the same energy firing off in exactly opposite directions ( at least when considered in what is called the centre-of-mass reference frame ). Each photon has a property called spin that has a 'polarity' behaviour. If I put a 'polarity analyser' in its way I can establish whether it is 'up' or 'down'. The technical detail is not important here ; the fact that I get one of only two possible answers from measurement is important.
But to be necessarily excruciating : my polarity analyser has a third result - nothing is recorded. If you like, non-measurement is a possible measurement outcome. Think carefully now. How do I distinguish as an experimenter b/w these two processes, say during some five second interval :
- a photon went by but my machine did not respond.
- no photon went by.
... as the machine does not respond in either case. So the answer by logical positivism is : you can't, do not and won't. :-)
Now for our two fleeing photons ( like bank robbers splitting up and departing the crime scene ), to conserve spin one must have the opposite of the other's. If one is 'up' the other is 'down', where the words 'up' and 'down' refer to a specific axis of measurement ( perpendicular to the photon path ) determined by the measuring devices. Observation reveals the opposite spin sense for each in a pair.
Big deal you say. No mystery at all. At the annihilation vertex a decision was made as to who would be 'up' and who would be 'down'. And randomly so, in that over many measurements ( NB repetition ) along any given line passing through the vertex, roughly equal numbers come out 'up' as 'down' provided that you watch for long enough. So one could think that our robbers tossed a coin at the bank to decide who would be what spin. We just found out later. This explanation is called a 'hidden variable' formulation : some yet-to-be-specified mechanism locally determined the later, separated measurements.
Now EPR said that could not be the case as per the quantum formalism of the time. The production of those two photons is a special case in the QM maths : only when one photon is measured is the spin determined, and thus only then can the other's be too. The mathematical expressions cannot be factored to represent each photon individually. There is an expression that necessarily has variables for both photons locked in it. This hit the theorists like a train. Bohr was frantic. Pauli was highly annoyed. Dirac was befuddled. But no fault in the EPR logic could be found. IF one accepts QM's formalism THEN this entanglement must also happen.
For those that measure these photons ( produced by this annihilation route ) it means that the choice of polarimeter axis at one end determines the possible response of the other polarimeter to the other photon ! What is even more amazing is that one can set up the scenario to choose the polarimeter measurement axes after the photons have left the vertex ( after the robbers left the bank ). Delicate timing indicates - to a high degree of confidence - that any transit of the effect of axis choice at one end must propagate ( if such a word is the right one to use, see later discussion ) many times faster than light speed. I've seen one group quote 20 times light speed as the minimum rate per their degree of timing. All these behaviours are deduced as occurring long after all the shouting dies down, as it were. The 'late' choice ( mid photon flight ) of polarimeter axis can even be determined stochastically : by some other apparently unrelated method like the temperature fluctuations of a cup of tea. Only in retrospect can one say 'the speed of the effect is ...'.
Experimentally one cannot say that a given specific photon pair is 'cheating' at the time, so to speak, via entanglement ( the issue is partly the ambiguity of polarimeter non-response as above ). So no faster-than-light signalling method is available here. The analysis is always post-hoc and only group statistics divulge the entanglement aspect. A fairly simply derived ( Bell's ) inequality gives the limit on group behaviours on the assumption of hidden variables ( those invisible gears yet to be discovered ). In one setup the experimental result versus the Bell prediction means that 1/4 must be greater than 1/3 - an obvious problem.
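To put a number on 'obvious problem', here is the textbook CHSH flavour of the same thing ( a minimal sketch ; the singlet-state correlation -cos(a - b) and the standard analyser angles are textbook choices, not the particular 1/4-vs-1/3 setup I alluded to above ) :

import numpy as np

def E(a, b):
    """Quantum correlation for a spin-singlet pair measured along analyser
    directions at angles a and b ( radians )."""
    return -np.cos(a - b)

# The standard CHSH angle choices
a1, a2 = 0.0, np.pi / 2
b1, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(f"quantum CHSH combination |S| = {abs(S):.3f}")   # ~ 2.828 = 2 * sqrt(2)
print("any local hidden variable account : |S| <= 2")

Same flavour of contradiction : the hidden-gears camp is capped at 2, the quantum prediction ( and experiment ) sails past it.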
Suck it up. Entanglement happens. Particles under QM behave non-locally. Two photons may behave as one ( ie. without intervening spacetime placing a communication speed limit between them ).
Digest this if you have nought else about. More after Christmas.
Cheers, Mike.
( edit ) Zitterbewegung as described by Mr Norton brings to my mind the word "whirlygig". The electron ( a point-like entity with dimension zero ) follows a very tight constant-radius helix at a given momentum. The axis of the helix is what we would label as 'the path of the electron' and its momentary centre of orbit is 'the position of the electron'. This readily yields a 'phase' aspect needed in QM ie. at what part of the orbital arc is the electron ? This opens up the possibility of phase differences & contrasts etc needed for group behaviours like diffraction and interference.
Also if you sit at one polarisation analyser looking back toward the annihilation vertex and on towards the other analyser, then a rotation angle can be defined showing how the 'up' axes of the polarimeters compare. This mutual angle can vary smoothly from zero right around to a full circle. There are hidden variable models aplenty that agree with experiment for aligned ( zero degrees ), crossed ( 90 degrees ) and anti-parallel ( 180 degrees ) mutual angles. The entanglement proof comes from multiple measurements at intermediate angles. To be correct a certain number must be both :
- no greater than 0.25, and also
- no less than 0.35
If you like this is the 'paradoxical' EPR, which is the experimental signal that hidden variables can't cut it. I reiterate : QM does not force any photon to respond in a specific way to the measurement. Probability still rules. What QM implies is a special relationship between the two photon measurements as described, such that in effect their spin senses are instantly determined despite any amount of separation. For all we know, two such photons could travel light years from a matter/anti-matter event and still bear this character .....
Maybe this is how Santa can do so much on a single night ? :-0
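Back to those intermediate angles for a moment. Here is a deliberately naive toy simulation ( my own throwaway hidden-variable model, not any published one, and not the source of the 0.25 / 0.35 figures just above ) : each pair carries a random hidden direction fixed 'at the bank', and each detector just reports the sign of the projection of that direction onto its own axis. It matches QM at 0, 90 and 180 degrees yet misses in between :

import numpy as np

rng = np.random.default_rng(0)
N = 200_000

def lhv_correlation(theta):
    """Naive local hidden variable model : hidden angle lam set at the source."""
    lam = rng.uniform(0.0, 2.0 * np.pi, N)
    A = np.sign(np.cos(0.0 - lam))       # detector 1 axis at 0
    B = -np.sign(np.cos(theta - lam))    # detector 2 axis at theta, anti-correlated
    return np.mean(A * B)

def qm_correlation(theta):
    return -np.cos(theta)                # singlet-state prediction

for deg in (0, 45, 90, 135, 180):
    th = np.radians(deg)
    print(f"{deg:3d} deg :  hidden variable E = {lhv_correlation(th):+.3f}   quantum E = {qm_correlation(th):+.3f}")

Agreement at the 'easy' angles, a clear gap at 45 and 135 degrees - which is exactly where the experimenters go hunting.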
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
OK, quick replay : we have two particles emerging from a common interaction/event vertex, and by satisfying some conservation law they must have some correlated properties. It was spin angular momentum in the last post but it could have been momentum or energy for example. Knowledge of the value for one particle allows deduction of the other's. However the choice of measurement mode for one particle restricts the measurement options for the other : regardless of their separation and the light travel time b/w the two instruments. As far as we can measure to date this is effectively instantaneous, or more precisely no upper limit has yet been placed on the speed of this 'entanglement' effect. Even more, we can only deduce that aspect from very many measurements, and thus it becomes a statistical comment on replicated cases. The rubber meets the road here via absolute inequalities that cannot and do not allow overlap, requiring sufficiently good technique ( basically best isolation of the system under study and high quality detectors ) to lay that bare eg. 1/4 is not greater than 1/3. Because it is not a comment about any individual particle, this rules out any faster-than-light signalling system. Both cruel and clever of Nature to be that way. See Catch-22 ..... :-)
Now clever punters might note that while we have phrased the situation as entangled particles, might it be equivalent to say entangled measuring equipment ? This idea is born from the fact that if not for the equipment we would have no understanding at all. What is thus being super-luminally 'transmitted', as it were, is the allowable measurable margin(s) of some quantity. As mentioned this is only for conjugate quantities that would otherwise be related by a Heisenberg inequality eg.
delta_px * delta_x > [some constant_expression_involving_Planck's_constant]
.... here x and px are the position and momentum along the same line axis, the deltas indicating the statistical spread ( standard deviation ; its square is the variance mentioned earlier ) over very many examples. The upshot here is that if, say, a momentum measuring device is so good as to really narrow down the value over repeated instances of measurement THEN the position measuring device is going to be inversely poorer in its spread for those same repeated instances*. So I could have also written :
delta_x > [some constant_expression_involving_Planck's_constant] / delta_px
or if you like :
delta_px > [some constant_expression_involving_Planck's_constant] / delta_x
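( Putting in the usual constant, with the deltas read as standard deviations, the standard textbook rendering of those three lines in LaTeX notation is

\sigma_x \, \sigma_{p_x} \ge \frac{\hbar}{2}, \qquad
\sigma_x \ge \frac{\hbar}{2\sigma_{p_x}}, \qquad
\sigma_{p_x} \ge \frac{\hbar}{2\sigma_x}

... nothing new, just the conventional notation. )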
All forms are both equivalent and allowable since none of the terms are ever zero, so specifically any of them can sit in the denominator. Can it be the case then that the entanglement 'conspiracy' is a function of all players : the particles, the gadgets and for that matter the humans running the tests ?
The last is the easiest to rule out. Just automate the apparatus. Go fishing for a weekend or three and read the summary on return. I guess it would be an awfully long, long throw to connect experimental results to hobbies. It might be true of course. But how far do you want your paranoia to go ? I say paranoia because if one reads the literature or discussion on this - amongst the relevant practitioners - the language is somewhat anthropomorphic. So you might read statements which imbue non-organic things with intent. This is not literally meant. What is being done is the design of tests to exclude criticism that the outcomes might have been 'subconsciously manipulated'. I know that sounds really odd when I put it like that, but I suppose that is the level of difficulty of the topic. Nuff said ...
That leaves the particle/gadget duo then. This is a more natural thought. What is a measuring device, in this context, but a way of magnifying quantum scale events to human scale evidence ? That can be seen objectively as a sequence of event multiplications. Suppose a photon hits a semiconductor crystal in a photodetector. That promotes one or a few electrons to a higher energy state in the lattice. Because of the way the lattice was prepared ( manufactured with controlled impurity fractions, bias voltages, high & low resistance pathways etc ) an avalanche is triggered. You did not directly notice the very last snowflake that fell on the mountainside, but by heck you can see tonnes of snow coming your way ! That sort of thing.
So the purest way to consider an entanglement scenario is the original particle production vertex plus two other distinct, spacetime-separated events that we deem as measurement instances. In the parlance of special relativity some of those events have a 'spacelike interval' b/w them. A 'timelike interval' is where one event lies in the light cone of the other, and so we allow a paradigm labelled as cause and effect. Hence a spacelike interval means that cause-and-effect does not apply. The word acausal, meaning 'without cause', is also used.
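A tiny sketch of that classification ( units with c = 1 ; the event separations are made-up numbers just to show the three cases ) :

def classify_interval(dt, dx, c=1.0):
    """Classify the separation b/w two events from their time and space
    differences : timelike allows cause-and-effect, spacelike does not."""
    s2 = (c * dt)**2 - dx**2
    if s2 > 0:
        return "timelike"
    if s2 < 0:
        return "spacelike"
    return "lightlike"

print(classify_interval(dt=5.0, dx=3.0))   # timelike  : eg. the vertex and one detection
print(classify_interval(dt=1.0, dx=8.0))   # spacelike : eg. the two detections relative to each other
print(classify_interval(dt=4.0, dx=4.0))   # lightlike : the photon's own flight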
I have an example in mind to share more with you. I need to make a few diagrams though.
Cheers, Mike.
* I explain here in such excruciating detail out of necessity. You may not have met the full horror of Heisenberg's thinking/analysis beyond a glib summary.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
Alas no simpler example suffices. That is the way of it. Hence the conundrum.
Try this visualisation. Take a flat piece of paper, the surface of which represents a physical 2D space in which we will discuss the matter. Its linear units are of length. In the centre is the vertex that produces an entangled photon pair ; the photons hence move away, each along a line from that point, and conservation laws dictate that those trajectories must be in opposite directions. Let the ( otherwise identical ) photons be detected some distance ( not necessarily equal ) from the vertex that generated them.
Now, at the vertex, make a linear fold in the paper perpendicular to the common line upon which the photons have travelled. We will fold the space upon itself, and in doing so the trajectories come to lie next to each other along their entire length. If the paper was translucent/transparent then after folding you could clearly see that ( sorry about the strobe from the fluorescent tube light box ).
If we started a clock at the instant of particle production then each photon, along each of its lines, would match the other distance for distance from that production vertex. In other words, in the folded format there is no intervening 'distance' between them as they move. So an event ( eg. what we denote as 'detection' ) that determines whatever feature we deem as entangled ( spin, polarity, whatever ) is resolved in a singular fashion ( here enacted by stabbing the paper with a needle ).
Hmmmmmm. So the cheat here is that I have got around the 'spooky action at a distance' issue by a geometric trick.
Now this is just a spot of origami. Not a theory. Not a breakthrough. Nothing to generalise without contradiction. Nothing to get excited about. Basically a visual trick to remember/highlight the key sense of the problem :
IF you want to keep cause & effect as a reliable descriptor of reality when we model it ( with light speed the maximum rate of transfer of that ) THEN entanglement implies that we must invoke other dimension(s).
In this case the other dimension - so we are now 3D - is the one I folded the paper within. Well, you could perhaps squirrel out of the concept by saying that you are identifying ( decreeing to be equal ) the corresponding points on the paper, but that is logically equivalent to the fold as described. Said extra 'dimension(s)' need not be length or time, nor even have the character of any currently known physical unit type. It just has to serve to give the appearance, in our reality, of the entanglement behaviour.
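For what it's worth, the fold can be stated without any paper at all : decree each point x on the line to be the same as its mirror image -x about the vertex. Then at equal flight times the two photons always occupy the same identified point. A throwaway sketch ( toy numbers only, nothing more ) :

# Photons leave the vertex ( x = 0 ) at speed c = 1 in opposite directions.
def identified(x):
    """The 'fold' : x and -x are decreed to be the same point."""
    return abs(x)

for t in (1.0, 2.5, 7.0):
    left, right = -t, +t            # photon positions at time t
    print(t, identified(left), identified(right), identified(left) == identified(right))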
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
This explanation of entanglement requires that there be at least one extra dimension. Oh dear! I perceive a slippery slope leading to strings.
Richard
Aarrghh ! Yes we don't want to do that ! We'll wind up mud wrestling at the bottom. :-)
Fortunately said extra dimension(s) don't have to be 'physical' like we are used to. Dimension can be another way of saying 'an extra degree of freedom'. A good example is quantum mechanical phase, which is just a number without known physical units. Or if you like it is measured in its own unique units, but with the cyclic property of modulo arithmetic. So to visualise it geometrically, phase is a dimension closed in upon itself : a wee circle*. You can't measure any QM phase value directly but phase differences may be disclosed by suitable apparatus.
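A phase difference is, for example, exactly what drives two-path interference. A minimal sketch ( generic textbook amplitude addition, not tied to any particular apparatus ) - note too that adding a full 2*pi to the phase changes nothing, which is that closed-circle character :

import numpy as np

def detection_probability(phase_difference):
    """Two equal-amplitude paths into one detector ; only the relative phase matters."""
    amp1 = 0.5
    amp2 = 0.5 * np.exp(1j * phase_difference)
    return abs(amp1 + amp2)**2          # = ( 1 + cos(phase_difference) ) / 2

for phi in (0.0, np.pi / 2, np.pi):
    print(f"phase difference {phi:.2f} rad -> detection probability {detection_probability(phi):.2f}")

print(np.isclose(detection_probability(1.0), detection_probability(1.0 + 2 * np.pi)))   # True : phase is modulo 2*pi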
The interesting stuff comes when different dimension types are related. Speed is a good example : it is distance divided by time. For light that ratio is a constant regardless, and so that fixed ratio determines much of the character of our experience of this universe. A speed value is an input to kinetic energy and thus enters a set of calculable quantities that, as a group ( in some physical scenario ), preserve their total amount ie. conservation of energy.
{ Of course you can link distance, time and phase all in the one. This is the bargain basement as it were, because all of the QM relations blossom from this once you include SR with mass/energy equivalence. }
It has been shown that beryllium ions exhibit correlation/entanglement too. This is significant as one has a negligible chance of non-detection by the instruments here ie. a beryllium ion will plough into and be stopped by a lattice no problem. Not like a photon polarimeter, say, where a non-trivial fraction will sail right through without a blip. So that really tightens up the statistics, and hence the gap b/w measurement and the prediction of Bell's inequality ( which assumes locality / hidden variables ) is wide. I think it was the NIST people in Boulder, Colorado that did that in the last decade.
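To see why that detection efficiency matters so much, a back-of-envelope sketch ( the efficiency values are purely illustrative, not from any particular experiment ) :

# If each side independently misses some fraction of particles, the fraction of
# pairs where BOTH sides respond falls off as eta squared.
for eta in (0.05, 0.5, 0.95, 1.0):
    coincidence_fraction = eta**2
    print(f"per-side efficiency {eta:.2f} -> both-sides-respond fraction {coincidence_fraction:.4f}")

With lossy photon detectors one is forced to argue about the pairs that were never counted ( the 'fair sampling' assumption ) ; with trapped ions the efficiency is essentially one and that wiggle room disappears.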
In explaining this I think we are deep down stuffed. We are within the thing we try to describe and can't pop out for an 'external' view.
Cheers, Mike.
* A neat twist ( pardon the pun ) on this type of dimension attribution to quantum properties is spin. A spin one-half particle has to be rotated twice to return to its original phase. If you take neutrons and handle them via a magnetic field then this can be demonstrated by the resulting interference of rotated vs. unrotated particles { to be exact the 'spin' and 'rotate' words here don't have the easier classical meaning }. Anyway a fermion's phase loop has more the characteristics of a Mobius strip than a simple circle.
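A quick matrix check of that rotate-twice claim ( the standard spin one-half rotation operator about the z axis, generic textbook QM rather than anything specific to the neutron experiments ) :

import numpy as np

def rotation_about_z(theta):
    """Spin-1/2 rotation operator R(theta) = exp(-i * theta * Sz / hbar)."""
    return np.array([[np.exp(-1j * theta / 2), 0.0],
                     [0.0, np.exp(+1j * theta / 2)]])

identity = np.eye(2)
print(np.allclose(rotation_about_z(2 * np.pi), -identity))   # True : one full turn flips the overall sign ( phase )
print(np.allclose(rotation_about_z(4 * np.pi), +identity))   # True : it takes two full turns to really get back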
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal