Will there be an S6?

Tom Philippart
Tom Philippart
Joined: 1 Oct 06
Posts: 11
Credit: 11,670,137
RAC: 0
Topic 192214

the title says it all.

Desti
Desti
Joined: 20 Aug 05
Posts: 117
Credit: 23,762,214
RAC: 0

Will there be an S6?

Quote:
the title says it all.

Yes!
Take a look at the Detector Watch thread.

Tom Philippart
Tom Philippart
Joined: 1 Oct 06
Posts: 11
Credit: 11,670,137
RAC: 0

thanks Will it only start

thanks

Will it only start by the end of 2008 or the beginning of 2009? What will Einstein@Home be doing till then?

Mike Hewson
Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6,579
Credit: 307,335,674
RAC: 169,045

RE: Will it only start by

Message 58166 in response to message 58165

Quote:
Will it only start by the end of 2008 or the beginning of 2009?


~ 2011 +

Quote:
What will Einstein@Home be doing till then?


There will be tons & tons to do in the meantime. By the completion of S5 data collection, a rather large data set will exist which can then be searched more deeply.

As Maria Alessandra Papa ( Bruce Allen's co-worker who develops the algorithms ) said recently at the AEI public talks ( I hope I have transcribed/punctuated/phrased correctly from the audio ):

Quote:
So as the computations are taking place we're actually always continuing to try and improve our algorithms. So that even with the huge computational power that we have with Einstein At Home we can actually do deeper searches which means really extending the reach of the search even further. And in fact I think it's interesting to comment after Brian's slide that pointed to a period when the instruments will be offline. I think we shouldn't worry about Einstein At Home running out of work. This in some respects will give us a chance to catch up with the data. It will be a nice sort of period of time when we can apply our most sensitive algorithm that we will have developed then to the full S5 run data.


The 'Brian's slide', referred to above, is shown earlier in this thread. The pink/purple bit at the far right of the timeline, marked 'Adv LIGO', is S6, which thus starts after 4Q '10, i.e. after the fourth quarter of 2010. As I think that refers to financial/accounting years, that makes it mid/late calendar year 2011. If not, then it will be the start of calendar year 2011.

Bruce also added the following points ( paraphrased from the audio ):
- currently we are working on about 40 - 50 GB of data ( about 1000 hours of instrument operation ).
- in the next six months starting to distribute a much larger data set of about 100 - 120 GB.
- when S5 is over using a complete data set of several hundred GB.
- the work will be structured so that any given host machine will not have to download more than about 20 - 30 MB of data ( as many users have telephone connections in parts of the world )

Cheers, Mike.

( edit ) It's not ~ 2011+ !! That's Adv LIGO. S6 is early 2009 calendar year. Sorry! Read the graphic wrong .... :-(

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

Gerry Rough
Gerry Rough
Joined: 1 Mar 05
Posts: 102
Credit: 1,847,066
RAC: 0

RE: Bruce also added the

Message 58167 in response to message 58166

Quote:

Bruce also added the following points ( paraphrased from the audio ):

- currently we are working on about 40 - 50 GB of data (about 1000 hours of instrument operation ).
- in the next six months starting to distribute a much larger data set of about 100 - 120 GB.
- when S5 is over using a complete data set of several hundred GB.
- the work will be structured so that any given host machine will not have to download more than about 20 - 30 MB of data (as many users have telephone connections in parts of the world )

Cheers, Mike

So let me get this right: 40 - 50 GB of data now, 100 - 120 GB later! Yikes!! When S5 is over, is the last data set you mention above a separate set for us to crunch, or just all of the data put together? Second, how long will it take to crunch all of the data you mention? It sounds like we could be talking probably 2012 or even 2013, just from what you mention.

Not that that is a bad thing, by the way, I will just have to buy more computers and get fat from all the credits. Science heaven!! ;-)



Mike Hewson
Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6,579
Credit: 307,335,674
RAC: 169,045

RE: So let me get this

Message 58168 in response to message 58167

Quote:
So let me get this right: 40 - 50 GB of data now, 100 - 120 GB later! Yikes!! When S5 is over, is the last data set you mention above a separate set for us to crunch, or just all of the data put together?


The whole lot together I think.

Quote:

Second, how long will it take to crunch all of the data you mention? It sounds like we could be talking probably 2012 or even 2013, just from what you mention.

Not that that is a bad thing, by the way, I will just have to buy more computers and get fat from all the credits. Science heaven!! ;-)


Well, it's an endless trail really. :-)

The method used to search for signals is to designate a template - roughly speaking, the desired signal's shape - and 'multiply' it by the received signal at various offsets. With longer time intervals the 'random' noise averages out to some level, while the sought-for pattern emerges above it. This is basically what our computers are doing - crunch, crunch, crunch and crunch.....

What ultimately emerges is an estimate of the likelihood that we haven't been fooled by some combination of non-astronomical effects that yield the same template profile. This is qualified as a 'signal to noise ratio'. I think better than 8 to 1 is desired, meaning that it is 8 times more likely to be a real detection than some spurious combo ...
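
To give a feel for that, here is a toy sketch in Python - my own made-up numbers, nothing like the actual E@H code - of a weak sine wave, ten times smaller than the noise, being pulled out by correlating against a template and forming a rough signal to noise ratio:

```
import numpy as np

rng = np.random.default_rng(42)

fs = 1024.0                          # sample rate in Hz (made up for the toy)
t = np.arange(0.0, 64.0, 1.0 / fs)   # 64 "seconds" of data

f0 = 123.4                           # frequency of the buried signal (assumed known)
template = np.sin(2.0 * np.pi * f0 * t)

amplitude = 0.1                      # signal ten times weaker than the noise
data = amplitude * template + rng.normal(0.0, 1.0, size=t.size)

# 'Multiply' the data by the template and sum: the noise terms average
# towards zero while the matching part adds up coherently.
correlation = np.dot(data, template)

# What would pure unit-variance noise give for this statistic? Its standard
# deviation is sqrt(sum(template**2)), which sets the scale for an SNR.
noise_scale = np.sqrt(np.dot(template, template))
snr = correlation / noise_scale

print(f"correlation = {correlation:.1f}, rough SNR = {snr:.1f}")
# With these made-up numbers the SNR lands well above the ~8 level mentioned
# above; shrink the amplitude or the data span and it sinks back into the noise.
```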

Verification also involves near coincidence with other IFOs and with other sources of knowledge, like gamma ray bursts.

The desired signal strength is a fraction of the noise. Normally you may have dealt with situations where the noise is a fraction of the signal. But the noise does not have a preference in direction or sign with respect to what we are seeking. Think of it like trying to detect a loaded pair of dice out of a whole box full of them. If you keep throwing them, then after a while you will note a trend with those rigged ones producing a skewed crop of results compared to the majority.
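
Putting the dice analogy in concrete terms - a little Monte Carlo sketch of my own, not project code - with enough throws the loaded die's average drifts away from the fair value of 3.5 and it stands out from the rest of the box:

```
import numpy as np

rng = np.random.default_rng(7)

n_dice = 100        # one die in the box is rigged
n_throws = 10_000   # keep throwing...

# 99 fair dice: faces 1..6, equally likely.
fair_rolls = rng.integers(1, 7, size=(n_dice - 1, n_throws))

# The loaded die favours sixes a little.
loaded_faces = np.arange(1, 7)
loaded_probs = np.array([0.14, 0.14, 0.14, 0.14, 0.14, 0.30])
loaded_rolls = rng.choice(loaded_faces, size=n_throws, p=loaded_probs)

# A fair die averages 3.5; the die whose mean drifts furthest from that
# after many throws is the suspect.
means = np.append(fair_rolls.mean(axis=1), loaded_rolls.mean())
suspect = int(np.argmax(np.abs(means - 3.5)))

print(f"flagged die #{suspect} as loaded (the rigged one is #{n_dice - 1})")
```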

What I'm ( finally ) getting at is that you can repeat the crunching for a whole raft of different templates, corresponding to various events in the sky, for as long as you like really!!

Cheers, Mike.

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

Gerry Rough
Gerry Rough
Joined: 1 Mar 05
Posts: 102
Credit: 1,847,066
RAC: 0

RE: The desired signal

Message 58170 in response to message 58168

Quote:

The desired signal strength is a fraction of the noise. Normally you may have dealt with situations where the noise is a fraction of the signal. But the noise does not have a preference in direction or sign with respect to what we are seeking. Think of it like trying to detect a loaded pair of dice out of a whole box full of them. If you keep throwing them, then after a while you will note a trend with those rigged ones producing a skewed crop of results compared to the majority.

What I'm ( finally ) getting at is that you can repeat the crunching for a whole raft of different templates, corresponding to various events in the sky, for as long as you like really!!

Cheers, Mike.

Houston we have a [cough]!

Let's assume we have done maybe several crunches of the data we have, as you suggest. Since we know what the noise looks like, if we do yet another crunch of the data we would then be able to pick out the not-so-noise elements pretty quickly, right? Sort of like having the dice in your example as the stuff we know we can skip over, because we already know what those dice are going to give us. In other words, we should be able to pick out the skewed dice pretty quickly because we already know the averages of the others; and once the dice start turning out the skewed numbers, those dice are a safe bet to keep rolling without rolling all the rest as well. Is this how we can make things go quicker in our search?



Mike Hewson
Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6,579
Credit: 307,335,674
RAC: 169,045

RE: Houston we have a

Message 58171 in response to message 58170

Quote:

Houston we have a [cough]!

Let's assume we have done maybe several crunches of the data we have, as you suggest. Since we know what the noise looks like, if we do yet another crunch of the data we would then be able to pick out the not-so-noise elements pretty quickly, right? Sort of like having the dice in your example as the stuff we know we can skip over, because we already know what those dice are going to give us. In other words, we should be able to pick out the skewed dice pretty quickly because we already know the averages of the others; and once the dice start turning out the skewed numbers, those dice are a safe bet to keep rolling without rolling all the rest as well. Is this how we can make things go quicker in our search?


Well the dice analogy doesn't really hold up that far...... :-(
In fact, as analogies go, that was a short street! :-)

What I left out ( for shame! ) is that due to the high non-linearity of the general relativity equations, they are an actual bugger to solve exactly. Out here in the boondocks, a long way from really, really fast changing + strong gravity wells, it all approximates beautifully to 'flat' space plus some little wiggles. The design of the IFOs ( and our lives ) relies on that.

However, in the maelstrom of a black hole whacking a black hole ( or neutron stars .. ), or a supernova cracking off ( -/+ a kick to flick some newly born object along some 'tangent' ), the maths is a lot harder. I think Einstein himself was quoted about the correlation between maths and reality once...

Regular orbits, say a double neutron star system, are fine. Inspirals are not bad: a gradually increasing frequency with a 'chirp' at the end. But the actual expected waveforms of mergers and bangs are a lot harder. There are groups who specialise in 'numerical relativity' who attempt to find solutions for this type of purpose. So there is a blizzard of possible template types. Pick your ( approximate ) solution to Einstein's equations, make a template(s), throw it into the crunching machinery ( us at E@H ), and hope for the best. Then iterate....
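
For a feel of what an inspiral template looks like at the very simplest level, here is a leading-order ( Newtonian quadrupole ) sketch of my own - just the frequency sweep for an assumed pair of 1.4 solar mass neutron stars, far cruder than anything the numerical relativity groups produce:

```
import numpy as np

G = 6.674e-11      # gravitational constant, SI units
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # one solar mass, kg

def chirp_frequency(t_to_merger, m1=1.4 * M_SUN, m2=1.4 * M_SUN):
    """Leading-order gravitational-wave frequency of a circular binary
    a time t_to_merger (seconds) before coalescence."""
    m_chirp = (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2
    return ((5.0 / (256.0 * t_to_merger)) ** 0.375
            * (G * m_chirp / C**3) ** -0.625 / np.pi)

# The frequency sweeps upwards as the stars spiral in - the 'chirp'.
for tau in (100.0, 10.0, 1.0, 0.1):
    print(f"{tau:6.1f} s before merger: f ~ {chirp_frequency(tau):6.1f} Hz")
```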

Cheers, Mike.

( edit ) For those who may be feeling uncomfortable about this whole approach, I say .... cheer up!! A lot of this has never been done before! This is cutting/bleeding edge basic physics research, and you [ & your computer(s) ] are intimately in the thick of it! In 1986, say, could any of us have been involved in any capacity in such an enterprise? Why buy a supercomputer when you have us?! :-)

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

Chipper Q
Chipper Q
Joined: 20 Feb 05
Posts: 1,540
Credit: 708,571
RAC: 0

RE: I think Einstein

Quote:
I think Einstein himself was quoted about the correlation between maths and reality once...


Yes, he said, 'To the extent math refers to reality, we are not certain; to the extent we are certain, math does not refer to reality.'

But would Einstein have rephrased that, upon learning of a proof of the Poincaré conjecture? Here is a quote from the Science magazine article BREAKTHROUGH OF THE YEAR: The Poincaré Conjecture—Proved

Quote:
Perelman's proof has fundamentally altered two distinct branches of mathematics. First, it solved a problem that for more than a century was the indigestible seed at the core of topology, the mathematical study of abstract shape. Most mathematicians expect that the work will lead to a much broader result, a proof of the geometrization conjecture: essentially, a "periodic table" that brings clarity to the study of three-dimensional spaces, much as Mendeleev's table did for chemistry.

The article goes on to say...

Quote:
"This is the first time that mathematicians have been able to understand the structure of singularities and the development of such a complicated system," said Shing-Tung Yau of Harvard University at a lecture in Beijing this summer. "The methods developed … should shed light on many natural systems, such as the Navier-Stokes equation [of fluid dynamics] and the Einstein equation [of general relativity]."

I'm guessing that it will provide valuable constraints to limit the 'blizzard of possible template types' to those types that are mathematically realistic... :)

Mike Hewson
Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6,579
Credit: 307,335,674
RAC: 169,045

RE: Yes, he said, 'To the

Message 58173 in response to message 58172

Quote:
Yes, he said, 'To the extent math refers to reality, we are not certain; to the extent we are certain, math does not refer to reality.'


That's the one!! Thanks :-)

Quote:
But would Einstein have rephrased that.... and the Einstein equation [of general relativity]."

I didn't know what to think when I read that!! It's one of those simply phrased ideas, sort of 'obvious', but much harder to get to a mathematical-proof level of certainty. Mr Yau is one half of the 'Calabi-Yau' space(s), oft mentioned by string theorists. Many of these spaces are thought to be 'equivalent' when transformed via methods similar to that Poincaré procedure. String theory suffers from having too many possible solutions, and without some other guiding principle or experimental feedback they can't be distinguished. For all their valiant and brilliant efforts, not one actual numerical prediction ( or post-diction ) has emerged. Maybe this will help the logjam; after all, we live in a 3D ( subset? ) world.

Quote:
I'm guessing that it will provide valuable constraints to limit the 'blizzard of possible template types' to those types that are mathematically realistic... :)


An embarrassment of riches!

Cheers, Mike.

I have made this letter longer than usual because I lack the time to make it shorter ...

... and my other CPU is a Ryzen 5950X :-) Blaise Pascal

zombie67 [MM]
Joined: 10 Oct 06
Posts: 121
Credit: 456,820,609
RAC: 17,794

After reading all this, I'm

After reading all this, I'm still not clear on the future. We're soon to complete S5, right? Are we soon to also run out of crunching work for E@H? Will there be any crunching down-time for us? And if so, when and for how long?

Reno, NV Team: SETI.USA
