7+ Megs....!!!

Collin
Collin
Joined: 4 Dec 05
Posts: 19
Credit: 71274
RAC: 0
Topic 190989

Ok, so what's the difference between the small, under 500kb and the large 7+ meg WU's???

I was rather surprised to see the first one come through a day or so ago, and I had to wonder.

Michael Roycraft
Michael Roycraft
Joined: 10 Mar 05
Posts: 846
Credit: 157718
RAC: 0

7+ Megs....!!!

Quote:

Ok, so what's the difference between the small, under 500kb and the large 7+ meg WU's???

I was rather surprised to see the first one come through a day or so ago, and I had to wonder.

Collin,

I've never seen one of them that small (500 KB), but we need to clear up some terminology here. The 7+ MB file you download and see in your einstein.phys.uwm.edu folder is not a WU. It is a data file, and the project sends your computer instructions for "slicing" a WU from it for processing. Several hundred WUs (depending upon how fast you process and return them) can be sliced from the one large data file, and it can typically reside on your hard drive, being sliced, for about 2 months before being exhausted of work. I recently processed about 200-300 WUs from the r1_1077.0 data file, from Jan 26 to March 26, and that number could have been much greater, except that I decided to do a ton of work for Rosetta, as the only way (until I finally started receiving treatment) of fighting back against my newly-diagnosed cancer.
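The slicing idea described above can be sketched as a toy model. The function and naming scheme below are purely illustrative (they mimic file names like r1_1077.0 seen in the project folder), not the project's actual scheduler code:

```python
# Toy illustration of one data file yielding many work units (WUs).
# The naming scheme mimics what appears in the einstein.phys.uwm.edu
# folder (e.g. r1_1077.0), but this is NOT the real scheduler logic.

def slice_workunits(datafile, indices):
    """Build WU names for the slices of `datafile` this host is assigned."""
    return [f"{datafile}_{i}" for i in indices]

# A host might be handed three consecutive slices of one data file:
print(slice_workunits("r1_1077.0", [473, 472, 471]))
# -> ['r1_1077.0_473', 'r1_1077.0_472', 'r1_1077.0_471']
```

The point of the sketch: one download (the data file) supports many small jobs (the WUs), so download size and work-unit count are different things.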

Michael R.

edit - You'll see a few of these data files in your folder, one named "z1_0968.5", another "r1_0217.0", and a third "r1_1061.5".

microcraft
"The arc of history is long, but it bends toward justice" - MLK

Collin
Collin
Joined: 4 Dec 05
Posts: 19
Credit: 71274
RAC: 0

RE: RE: Ok, so what's the

Message 26835 in response to message 26834

Quote:
Quote:

Ok, so what's the difference between the small, under 500kb and the large 7+ meg WU's???

I was rather surprised to see the first one come through a day or so ago, and I had to wonder.

Collin,

I've never seen one of them that small (500 KB), but we need to clear up some terminology here. The 7+ MB file you download and see in your einstein.phys.uwm.edu folder is not a WU. It is a data file, and the project sends your computer instructions for "slicing" a WU from it for processing. Several hundred WUs (depending upon how fast you process and return them) can be sliced from the one large data file, and it can typically reside on your hard drive, being sliced, for about 2 months before being exhausted of work. I recently processed about 200-300 WUs from the r1_1077.0 data file, from Jan 26 to March 26, and that number could have been much greater, except that I decided to do a ton of work for Rosetta, as the only way (until I finally started receiving treatment) of fighting back against my newly-diagnosed cancer.

Michael R.

edit - You'll see a few of these data files in your folder, one named "z1_0968.5", another "r1_0217.0", and a third "r1_1061.5".

Ahh, well that makes more sense. I'll pick stuff up as I go along, being a relative newbie to the project.

Collin
Collin
Joined: 4 Dec 05
Posts: 19
Credit: 71274
RAC: 0

Okay then, according to what

Okay then, according to what you stated, that 7+ MB loaf of bread that BOINC is slicing should've lasted me longer than it has; today I see a 4.92 MB file downloading.

Perhaps the terminology is still a bit murky. Is it possible I'm doing that many WUs with my dual core in such a short time? I don't see how...

I guess my question would be how your system compares with mine, so I have a baseline for comparison. Otherwise, I'd say I'm missing something you take for granted, but I may not necessarily be.

Erik
Erik
Joined: 14 Feb 06
Posts: 2815
Credit: 2645600
RAC: 0

RE: Okay then, according to

Message 26837 in response to message 26836

Quote:
Okay then, according to what you stated, that 7+ MB loaf of bread that BOINC is slicing should've lasted me longer than it has; today I see a 4.92 MB file downloading.

You'll not necessarily process every WU from a particular data file. On my faster comp I'll usually see a few in a row, with a skip here and there, while the slower one jumps around and may only process four or five from a data file.

-Comp 1 - r1_1184_473,(472),(471),...,(466),[skip],(464)-
-Comp 2 - z1_2001_317,(314),(309),[load new data file]-

-some1 correct me if I'm wrong please

Keck_Komputers
Keck_Komputers
Joined: 18 Jan 05
Posts: 376
Credit: 5744955
RAC: 0

RE: RE: Okay then,

Message 26838 in response to message 26837

Quote:
Quote:
Okay then, according to what you stated, that 7+ MB loaf of bread that BOINC is slicing should've lasted me longer than it has; today I see a 4.92 MB file downloading.

You'll not necessarily process every WU from a particular data file. On my faster comp I'll usually see a few in a row, with a skip here and there, while the slower one jumps around and may only process four or five from a data file.

-Comp 1 - r1_1184_473,(472),(471),...,(466),[skip],(464)-
-Comp 2 - z1_2001_317,(314),(309),[load new data file]-

-some1 correct me if I'm wrong please


It is also more of a time thing than a number thing. After a while, all the workunits in a data file are done (or at least assigned), and then you need a new data file. If you have a fast computer, you will usually get more workunits per data file than you would with a slower computer.
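That time-limited behavior — the data file's lifetime, not a fixed slice count, usually caps how many WUs one host gets — could be modeled roughly like this. All numbers here are invented for illustration:

```python
# Toy model: a data file stays on disk for a fixed window (~2 months)
# and holds a fixed number of WU slices; a faster host simply finishes
# more of them before the window closes. Numbers are made up.

def wus_completed(hours_per_wu, window_hours, slices_in_file):
    """How many WUs one host could finish from a single data file."""
    return min(window_hours // hours_per_wu, slices_in_file)

fast_host = wus_completed(hours_per_wu=6, window_hours=1440, slices_in_file=500)
slow_host = wus_completed(hours_per_wu=48, window_hours=1440, slices_in_file=500)
print(fast_host, slow_host)  # -> 240 30
```

So a fast dual core really can chew through a data file well before its two months are up, at which point a new one downloads.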

BOINC WIKI

BOINCing since 2002/12/8

Collin
Collin
Joined: 4 Dec 05
Posts: 19
Credit: 71274
RAC: 0

So essentially, the .dat

So essentially, the .dat files are sliced loaves of bread that are handed to group "x", and whoever finishes their individual slices sooner dictates how the remaining slices get assigned.

I would presume this is also coordinated by the past track record of the individual computers.

Ok, now it's becoming intelligible.
