aLIGO & TACC

JoeB
Joined: 24 Feb 05
Posts: 124
Credit: 90266387
RAC: 30584
Topic 197855

Hi,
I just saw this article about Advanced LIGO (aLIGO): http://phys.org/news/2014-12-science-powerhouses-gravitational.html
It seems to say that the data processing for aLIGO will be done by the Stampede supercomputer at TACC, which suggested to me that there will be no role for Einstein@Home in gravitational-wave processing.
Is this correct?

Thanks

Joe B

Bikeman (Heinz-Bernd Eggenstein)
Moderator
Joined: 28 Aug 06
Posts: 3522
Credit: 753688932
RAC: 1179793

aLIGO & TACC

Quote:

Hi,
I just saw this article about Advanced LIGO (aLIGO): http://phys.org/news/2014-12-science-powerhouses-gravitational.html
It seems to say that the data processing for aLIGO will be done by the Stampede supercomputer at TACC, which suggested to me that there will be no role for Einstein@Home in gravitational-wave processing.
Is this correct?

Thanks

I see absolutely no change coming for E@H as a result of the XSEDE initiative for aLIGO, for several reasons (these are my personal opinions):

1. Almost all of the Einstein@Home servers and the ATLAS cluster (which is used for things like pre- and post-processing of E@H data) are assets run and paid for by the Max-Planck-Gesellschaft (MPG) in Germany, through its Albert Einstein Institute. There are NSF grants supporting E@H (see the footer of this webpage), but the vast majority of the hardware and manpower behind E@H's infrastructure is paid for by the MPG.

2. I believe that even if you were forced to compare whether running E@H jobs on XSEDE or on donated CPU time from volunteer PCs is cheaper (from the perspective of the scientists and the NSF), the volunteer network would surely win (that's where your donation of hardware and electricity comes in).

So, in summary, there is absolutely no reason to shift computing loads that are currently 'crowd-funded' and managed through the MPG's efforts onto XSEDE assets, because that would cost US science budgets more, not less.

When it comes to the big-data challenges of the type of science we are doing here, you can never have enough computing resources. Rejecting the vast computing power so generously offered by the volunteer BOINC community would be foolish in many ways (not just economically!), and it will not happen, trust me! Tighter budgets will, if anything, mean more crowd-sourcing and crowd-funding, not less.

Cheers
HB

JoeB
Joined: 24 Feb 05
Posts: 124
Credit: 90266387
RAC: 30584

Thanks for the encouraging reply.

Joe B

Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6591
Credit: 323635036
RAC: 231886

I think another point of note is: same data, many analyses, i.e. what is one looking for in a given stretch of IFO record? This is because the interferometers are basically 'omnidirectional microphones': any segment of the time series (the differential arm response) contains a spacetime strain contribution from literally anywhere and anything, provided the source can generate a strain to which LIGO is sufficiently sensitive.
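
In textbook notation, nothing E@H-specific: the calibrated output is a single scalar time series, the sum of instrument noise and the antenna-pattern-weighted strain from every source on the sky,

\[ h(t) = n(t) + F_{+}(t;\alpha,\delta,\psi)\,h_{+}(t) + F_{\times}(t;\alpha,\delta,\psi)\,h_{\times}(t), \]

where \(\alpha, \delta\) give the sky position and \(\psi\) the polarisation angle. Nothing in h(t) by itself says which source a given contribution came from.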

We at E@H are essentially a wing of, or attachment to, the LIGO 'Continuous Wave Group' and as such have certain target signal types in our sights: the persistent, repetitive stuff with a low frequency derivative over the time intervals of interest*. This does not preclude anyone else, or us, from twiddling/re-inventing the algorithms to 'steer' the focus of discovery.
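
For the curious, a minimal sketch of the phase model such a search typically assumes (my notation, purely illustrative): the expected signal is nearly sinusoidal, with a slowly drifting phase,

\[ h(t) \propto \cos\Phi(t), \qquad \Phi(t) = \Phi_{0} + 2\pi\left[ f_{0}(t - t_{0}) + \tfrac{1}{2}\dot{f}\,(t - t_{0})^{2} \right], \]

where \(f_{0}\) is the frequency and \(\dot{f}\) the small spin-down at reference time \(t_{0}\). Gridding over these parameters, plus sky position, is what makes the search so computationally expensive.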

There will be plenty of glory to go around when the time comes, and a phrase like 'too many computing resources' will never mean anything. :-)

Cheers, Mike.

* This essentially equates to spinning neutron stars, but as ever we await surprise findings !

( edit ) There will be Rosetta Stone-like objects for which we will have a record from several detection modes, e.g. a radio pulsar with a GW signature. That group will do a tremendous service in calibrating the physical models used to describe such beasties.

Bernd Machenschalk
Moderator
Administrator
Joined: 15 Oct 04
Posts: 4330
Credit: 251390019
RAC: 36735

Mike is right. Sensitivity for continuous gravitational-wave searches is almost exclusively limited by computing power. There are other sources of gravitational waves whose searches require less computing power, or much more data volume per unit of computing time (using Einstein@Home doesn't make sense for searches where clients would spend more time downloading data than processing it). Those searches are what large computing clusters like Atlas and TACC (and a few more, like the one at Syracuse) will be used for.
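
To make that download-versus-compute trade-off concrete, here is a back-of-envelope sketch in Python; every number in it is invented for illustration, and none are real Einstein@Home workunit parameters:

def suits_volunteer_computing(data_mb, crunch_hours, downlink_mbit_s=10.0):
    # Hours a client spends downloading the workunit's input data.
    download_hours = (data_mb * 8.0) / (downlink_mbit_s * 3600.0)
    # Volunteer computing pays off only when crunching dominates transfer;
    # the factor of 10 is an arbitrary illustrative margin.
    return crunch_hours > 10.0 * download_hours

# A continuous-wave-style workunit: little data, a huge parameter space.
print(suits_volunteer_computing(data_mb=20, crunch_hours=8.0))    # True
# A hypothetical data-heavy search: gigabytes in, minutes of crunching.
print(suits_volunteer_computing(data_mb=4000, crunch_hours=0.2))  # False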

Einstein@Home will not lose its primary purpose in the foreseeable future, and the searches for radio and gamma-ray pulsars have added "secondary" purposes that are scientifically successful in their own right.

BM

Mike Hewson
Moderator
Joined: 1 Dec 05
Posts: 6591
Credit: 323635036
RAC: 231886

And those 'secondary' successes have expertly validated the processing pipeline(s) that E@H is integral to. By that I mean that, apart from qualities like polarisation patterns (photon spin 1, graviton spin 2), the various wave modes (radio, gamma, GW) are subject to much the same algorithmic approach.
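
As a toy illustration of that shared approach (my own sketch, not E@H code): whether the input is radio samples, gamma-ray photon arrival times, or a strain series, the core step is testing trial phase models for an excess of coherent power. A minimal Rayleigh-style frequency scan over a list of event times:

import numpy as np

def coherent_power(event_times, trial_freqs):
    # Sum unit phasors at the phase each trial frequency predicts;
    # the sum is large only when the events line up coherently.
    t = np.asarray(event_times, dtype=float)
    powers = np.empty(len(trial_freqs))
    for i, f in enumerate(trial_freqs):
        z = np.exp(2j * np.pi * f * t).sum()
        powers[i] = abs(z) ** 2 / len(t)
    return powers

# Toy data: events from a 2 Hz 'pulsar' plus random background.
rng = np.random.default_rng(0)
pulses = np.arange(0.0, 100.0, 0.5) + 0.01 * rng.standard_normal(200)
background = rng.uniform(0.0, 100.0, 100)
events = np.concatenate([pulses, background])

freqs = np.linspace(1.5, 2.5, 2001)
best = freqs[np.argmax(coherent_power(events, freqs))]
print(f"strongest trial frequency: {best:.3f} Hz")  # close to 2.000 Hz

A real pipeline adds Doppler demodulation for the detector's motion, spin-down terms, and proper statistics, but the 'align phases, sum, look for excess power' core is common to all three messengers.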

Cheers, Mike.
