Gamma-ray pulsar binary search #1 on GPUs

Mad_Max
Joined: 2 Jan 10
Posts: 154
Credit: 2213404725
RAC: 386222

Did we run out of work or

Did we run out of work, or is something broken?

For the last day I have been getting this all the time:

24/02/2019 03:30:54 | Einstein@Home | Requesting new tasks for AMD/ATI GPU

24/02/2019 03:30:55 | Einstein@Home | Scheduler request completed: got 0 new tasks

24/02/2019 03:30:55 | Einstein@Home | No work sent

24/02/2019 03:30:55 | Einstein@Home | No work is available for Binary Radio Pulsar Search (Arecibo, GPU)

24/02/2019 03:30:55 | Einstein@Home | No work is available for Gamma-ray pulsar binary search #1 on GPUs

24/02/2019 03:30:55 | Einstein@Home | No work is available for Gravitational Wave All-sky search on LIGO O1 Open Data

But the status page shows:

GRPB1G search progress

                   Total needed    Already done    Work still remaining
Progress           100.00 %        99.19 %         0.81 %
Workunits          54,708,172      54,265,653      442,519
Time (estimated)   805.1 days      795.6 days      9.5 days

while "Tasks to send" is ~0.

Richie
Joined: 7 Mar 14
Posts: 656
Credit: 1702989778
RAC: 0

New tasks for FGRPB1G became

New tasks for FGRPB1G became available just a moment ago. Try clicking update...

gch
Joined: 19 Jul 13
Posts: 2
Credit: 241741996
RAC: 71189

Yes, there are work units

Yes, there are work units available again.

Regarding Mad_Max's observation: from what I understand, the project has an overall amount of work units available, which is shown in "Work still remaining". These are fed into the queue in portions, which is reflected by "Tasks to send".
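As a loose illustration of that relationship (this is NOT the actual Einstein@Home server code; the names and the batch size are invented):

# Loose sketch, NOT actual Einstein@Home server code: a large pool of
# remaining work is fed into a small "Tasks to send" queue in portions.

BATCH_SIZE = 10_000  # hypothetical size of one portion

def top_up_queue(send_queue, remaining_units):
    """Move one portion from the remaining pool into the send queue."""
    batch = min(BATCH_SIZE, remaining_units)
    send_queue.extend(f"workunit_{i}" for i in range(batch))
    return remaining_units - batch

queue = []            # what "Tasks to send" counts
remaining = 442_519   # "Work still remaining" from the status page

remaining = top_up_queue(queue, remaining)
print(f"Tasks to send: {len(queue):,} / still remaining: {remaining:,}")
# If the top-up stops (e.g. the work generator is down), "Tasks to send"
# drains to 0 even though plenty of work is "still remaining".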

Mad_Max
Joined: 2 Jan 10
Posts: 154
Credit: 2213404725
RAC: 386222

Yeah, flow of new GPU tasks

Yeah, the flow of new GPU tasks resumed right after I wrote the previous post.
Now the work queue is full again.

But anyway: it looks like GRPB1G is ending very soon. Is there a new data set for GR coming, or will we switch GPUs to another sub-project, like resuming BRP4G or FGRP5G for example?

GW on GPU will not be ready yet, I guess...

DanNeely
Joined: 4 Sep 05
Posts: 1364
Credit: 3562358667
RAC: 0

Mad_Max wrote: Yeah, flow of

Mad_Max wrote:

Yeah, the flow of new GPU tasks resumed right after I wrote the previous post.
Now the work queue is full again.

But anyway: it looks like GRPB1G is ending very soon. Is there a new data set for GR coming, or will we switch GPUs to another sub-project, like resuming BRP4G or FGRP5G for example?

GW on GPU will not be ready yet, I guess...

It's business as usual on the GPU side. The total amount of work gets incremented every week or so when a new data file is imported into the system, with the result that it always looks like we're about to run out in the near future. Actual availability is a black box on the user end, because we have zero insight into the amount of data ready to be fed into the system; however, my understanding is that Fermi is generating a continuous stream of new data to be looked over, so we shouldn't be in danger of a long-term outage.
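A toy simulation (all numbers invented or roughly read off the status-page table above) shows why the remaining-work estimate hovers near zero under this rolling-import scheme:

# Toy simulation, all numbers invented or roughly read off the status
# page: weekly data-file imports keep "days remaining" permanently small.

DAILY_RATE = 46_600      # units crunched per day (~442,519 / 9.5 days)
WEEKLY_IMPORT = 330_000  # hypothetical size of one imported data file

remaining = 442_519
for day in range(1, 29):                 # four weeks
    remaining -= DAILY_RATE              # volunteers crunch through work
    if day % 7 == 0:
        remaining += WEEKLY_IMPORT       # a new data file is imported
    print(f"day {day:2d}: ~{remaining / DAILY_RATE:.1f} days remaining")

The estimate swings between roughly 2.5 and 10 days but never grows, so the progress page perpetually looks like the search is about to finish.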

Mad_Max
Joined: 2 Jan 10
Posts: 154
Credit: 2213404725
RAC: 386222

The task queue is empty again

The task queue has been empty again for about a day.
"Tasks to send" = 0, and the FGRPB1G work generator shows "Not Running".

Gavin
Joined: 21 Sep 10
Posts: 191
Credit: 40644337738
RAC: 1

This issue has already been

This issue has already been raised in more appropriate parts of the forum.

Occasionally things happen that are beyond anybody's control (usually at the weekend, when staff levels are short or non-existent); we are all entitled to at least one day off, after all!

Whatever the current problem is, it will be sorted tomorrow, and if not, an announcement will be made as to the cause of the issue and the anticipated time of a fix.

We are all in the same boat, so you are not losing out ;-)

Shadak
Joined: 3 Oct 09
Posts: 20
Credit: 20966427
RAC: 0

There seems to be a problem.

There seems to be a problem. I got only erroring WUs today; until yesterday everything worked fine.

computer and its jobs

For the last failing job I captured the error message:

29.04.2019 09:48:19 | Einstein@Home | Starting task LATeah1049U_172.0_0_0.0_18544470_0
29.04.2019 09:48:23 | Einstein@Home | Computation for task LATeah1049U_172.0_0_0.0_18544470_0 finished
29.04.2019 09:48:23 | Einstein@Home | Output file LATeah1049U_172.0_0_0.0_18544470_0_0 for task LATeah1049U_172.0_0_0.0_18544470_0 absent
29.04.2019 09:48:23 | Einstein@Home | Output file LATeah1049U_172.0_0_0.0_18544470_0_1 for task LATeah1049U_172.0_0_0.0_18544470_0 absent
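(For context on those messages: the client reports an output file as "absent" when the science app exits without the expected result file on disk, and the task then errors out. A minimal sketch of that kind of check, with invented names, not actual BOINC client code:)

# Minimal sketch (invented names, not actual BOINC client code) of the
# kind of check behind the "Output file ... absent" messages: after the
# app exits, verify that each expected output file actually exists.

import os

def check_outputs(task_name, expected_files):
    missing = [f for f in expected_files if not os.path.exists(f)]
    for f in missing:
        print(f"Output file {f} for task {task_name} absent")
    return not missing  # False -> the task is reported as an error

task = "LATeah1049U_172.0_0_0.0_18544470_0"
ok = check_outputs(task, [task + "_0", task + "_1"])
print("task ok" if ok else "task errored: the app produced no output")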

Bernd Machenschalk
Moderator
Administrator
Joined: 15 Oct 04
Posts: 4312
Credit: 250507337
RAC: 34395

Sorry. The app version 1.21

Sorry. The app version 1.21 was meant to fix the "thread priority" issue as described here, but apparently has another problem. It is deprecated for now.

BM

Shadak
Joined: 3 Oct 09
Posts: 20
Credit: 20966427
RAC: 0

nice try ^^

nice try ^^
