Upload problems?

pieface
Joined: 3 May 05
Posts: 2
Credit: 15,304,612
RAC: 0
Topic 192297

Anyone else having problems? Looks like I have been since maybe 10:30pm last night: uploads are going through, but results aren't 'reporting', maybe because the scheduler is off-line?

ca_grufti
Joined: 9 Feb 05
Posts: 53
Credit: 4,309,237
RAC: 0

Upload problems?

It seems that there is a lot going on. This E@H run seems to be limping towards the finish line, but it's still making decent progress of a little more than 1% of the WUs per day. The end is near.

pieface
Joined: 3 May 05
Posts: 2
Credit: 15,304,612
RAC: 0

Seems to have fixed itself

Seems to have fixed itself right after I started this thread. Maybe someone kick-started it again!

Annika
Joined: 8 Aug 06
Posts: 720
Credit: 494,410
RAC: 0

Well I have noticed scheduler

Well I have noticed scheduler probs, too, but they never lasted more than a few hours... I'm glad I got myself a somewhat larger cache again, though.

DanNeely
Joined: 4 Sep 05
Posts: 1,341
Credit: 1,947,317,114
RAC: 1,614,094

Part of the problem is

Part of the problem is probably that the current all short WU mix is hammering the server ~9x harder than the mostly long WUs were earlier in the run.
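DanNeely's point can be checked with back-of-the-envelope arithmetic: if a host contacts the scheduler roughly once per finished task, the request rate scales inversely with task length. A minimal sketch, where the runtimes are hypothetical examples chosen only to illustrate how a ~9x figure arises, not actual E@H task lengths:

```python
# Back-of-the-envelope: scheduler request rate vs. work-unit length.
# The runtimes below are hypothetical, for illustration only.

def requests_per_day(wu_hours):
    """Scheduler contacts per host per day, assuming one request
    per completed work unit on a machine crunching 24h/day."""
    return 24.0 / wu_hours

long_wu = requests_per_day(18.0)   # e.g. long WUs earlier in the run
short_wu = requests_per_day(2.0)   # e.g. short WUs now

print(short_wu / long_wu)  # ratio of scheduler load between the two mixes
```

With these example numbers the short-WU mix generates nine times as many scheduler requests per host per day, which is the kind of load increase the post describes.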

Udo
Joined: 19 May 05
Posts: 203
Credit: 8,945,570
RAC: 0

RE: Part of the problem is

Message 59175 in response to message 59174

Quote:
Part of the problem is probably that the current all short WU mix is hammering the server ~9x harder than the mostly long WUs were earlier in the run.

I also think so!
On the 'Server Status page' you can see:
'oldest unsent result': 0 d 0 h 0 m
'BOINC scheduler': Not running

The scheduler is unable to create WUs fast enough...

Udo

Gary Roberts
Moderator
Joined: 9 Feb 05
Posts: 5,405
Credit: 53,434,701,920
RAC: 73,352,468

RE: 'BOINC scheduler': Not

Message 59176 in response to message 59175

Quote:


'BOINC scheduler': Not running

The scheduler is unable to create WUs fast enough...

The scheduler doesn't create the work - it just dishes it out :).

The work generator process is running, so work is being created, it would seem. All the download mirror sites are running. My boxes seem to be getting work as required without too much difficulty right now. However, things could change at a moment's notice!!

Cheers,
Gary.

Tobie
Joined: 4 Sep 06
Posts: 6
Credit: 79,955
RAC: 0

Do the WUs that have been

Do the WUs that have been sent out, but are not being crunched because the machines are in a one-week coma (some machines have 300+ WUs waiting in their cache), have an effect on the servers/database?

Martin Johnson
Joined: 20 Feb 05
Posts: 6
Credit: 133,188
RAC: 0

My machine went into a 7 day

My machine went into a 7 day coma, but as soon as I saw that the project was running again, I pressed the Update button and it came out of the coma.

roadrunner_gs
Joined: 7 Mar 06
Posts: 94
Credit: 3,369,656
RAC: 0

RE: My machine went into a

Message 59179 in response to message 59178

Quote:
My machine went into a 7 day coma, but as soon as I saw that the project was running again, I pressed the Update button and it came out of the coma.

Install BOINC 5.8 and you won't have the problem with the one-week deferred scheduler anymore.

Annika
Joined: 8 Aug 06
Posts: 720
Credit: 494,410
RAC: 0

Not worth the bother in my

Not worth the bother in my case, as I have both my boxes right here with me and look at BOINC from time to time anyway, so I can do just what Martin did. I don't want to trade an app that has been running stable for months for one with a feature I don't really need (yeah, I also do server administration ^^ maybe that influences the way I run my private boxes as well).
But, Martin, you should consider that quite a few people have five or more boxes attached, or use hosts they can only administer at certain times or via remote access. In those cases, the new feature must be a blessing.
