I wonder if I could talk gravywavy into asking me for Wiki access so he can help *ME* out ...
He would make an excellent choice as he always writes with clarity and perception. It's a pretty thankless task I'm dobbing him in for, but he seems to have the ability to produce valuable insights into many areas of knowledge, and it would be very useful to have him contributing. I'll go seek him out and ask him nicely :).
Cheers,
Gary.
Paul suggested in another thread that the T27 in the WU name would identify the maximum number of WUs per 'Work Unit Data File' (WUDF for short).
As I have been crunching on a T28, the maximum would be 28 (or more).
But if I look at the WUs spawning from an S4 WUDF, then I see the digit after the 2nd dot increasing (so far the Txx is constant at T01). Looks like either there are more WUs in a WUDF, or reading the maximum Txx value as the limit is incorrect.
NB: WUDF file sizes have been reduced from 14+ MB for S3 to 6+ MB for S4 l1_ (don't know for w1_). In the BOINC .pdf documentation found in Paul's Wiki, I read that the WUDF has been 40 MB, which would have been very modem-user unfriendly!
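For what it's worth, the bookkeeping being discussed — pulling the Txx field and the digit after the 2nd dot out of a WU name — could be sketched like this. The example name and the exact field layout are assumptions for illustration only; the real Einstein@Home naming scheme may differ:

```python
import re

def parse_wu_name(name):
    """Extract the Txx field and the digit after the 2nd dot from a
    WU name. The name layout assumed here is hypothetical, not the
    documented Einstein@Home format."""
    # Txx field, e.g. "T28" -> 28
    t_match = re.search(r"T(\d+)", name)
    t_value = int(t_match.group(1)) if t_match else None
    # "digit after the 2nd dot": look at whatever follows the second dot
    parts = name.split(".")
    seq = None
    if len(parts) > 2:
        seq_match = re.match(r"(\d+)", parts[2])
        if seq_match:
            seq = int(seq_match.group(1))
    return t_value, seq

# Hypothetical WU name: detector/frequency part, two dots, then the
# increasing sequence digit, then the Txx field and the run tag
print(parse_wu_name("w1_0950.0.3_T28_S4"))  # -> (28, 3)
```

If the Txx value really were the per-WUDF maximum, the second number should never exceed the first; watching both fields across spawned WUs would settle the question either way.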
Um, where did you see that? When I look at the Work Unit Data File it says 14.1 MB ...
grid_paper_04.pdf
BOINC: A System for Public-Resource Computing and Storage
David P. Anderson
Space Sciences Laboratory
University of California at Berkeley

4 Conclusion

We have described the public-resource computing paradigm, contrasted it with Grid computing, and presented the design of a software system, BOINC, that facilitates it. BOINC is being used by several existing projects (SETI@home, Predictor@home, climateprediction.net) and by several other projects in development.

Many areas of the BOINC design are incomplete. For example, some projects require efficient data replication: Einstein@home uses large (40 MB) input files, and a given input file may be sent to a large number of hosts (in contrast with projects like SETI@home, where each input file is different). In its initial form, Einstein@home will simply send the files separately to each host, using a system of replicated data servers. Eventually we plan to use a mechanism such as BitTorrent [3] for efficiently replicating files using peer-to-peer communication.
Ah, that paper. Well, it is nothing I wrote or researched. At the moment we are only using 14 MB files, or the newer 6 MB files, though I have not looked at those yet.
Hmm, w1 files are 8 MB, and the l1 ("L1") files are 6 MB ... so maybe there are varying sizes ... UP TO 40 MB ...
I wonder, if Bruce wanders by, whether he will enlighten us ...
Yes, it would be nice to have some insight into the data being crunched, how it is crunched and validated, etc. The lack of good E@H-specific documentation to find answers to these questions is not making it easier. Good that you are working on that, but I don't expect you to cover every possible question.
I think Bruce has his priorities.
If need be he can be damned quick, as I found out in yesterday's 'validation errors'.
In relation to the WUs-per-WUDF question, I must say, however, that I'm not disappointed about the quantity. Now that we are running on fresh WUDFs, I did not have to download a new WUDF for days. Actually, I had to set 'No new work' and finish outstanding WUs in order to take a few days off.
Well, I still have hopes I can convince the projects to ask for a Wiki account so they can add project-specific information. Chris Randles has done a superb job on the CPDN material we have now. As good as or better than the project's FAQ ... :)
Especially in that we can tie the pages to the other documentation in the Wiki, so that you have one-stop shopping, as it were ... It also would save each project from writing up the same type of troubleshooting guide ... For example, we now have several "How-To"s STARTED, and the more people read them and suggest additions, the better they will get. We have one, for example, on doing some network troubleshooting. I grant that it is sketchy, in part because I don't have all the material moved over from the old documentation site (my refrain for the morning, once in e-mail and now on the second message board) ...
But, as of now, I have almost all of the Application Owner's Manual ported, all of the Glossary, most of the UCB developers' site (3-4 pages to go, I think), and a good part of the BOINC FAQ ...
Still to go: the BOINC Web Site Owner's Manual (not started yet, but usage on the old site says it is not used much) ...
Just peeked: nearly 30-40 changes made by my loyal assistants ... :)
Go Team! Go! Go! Go! ... :)
I am still focused on migration, and they are doing clean-up of the material I miss updating (I do try to update the material as much as possible as I migrate it) ...
You are most welcome...