RE: ...So if you check what
...So if you check what application is _really_ running at run time, for example, using Windows Task Manager, you'll see einstein_S5R2_4.40_windows_intelx86.exe.
...Lastly, if you actually look for executables in the project directory, you'll find the 4.40 actually there, and the 4.38 gone, so that makes it really clear which one is running your work.
Thanks! The switchover worked perfectly on my home machine! I did the two checks you mention above, and 4.40 is indeed doing the work. Thank you very much for the great explanation!
RE: I started a new result
I started a new result with v4.40 about an hour ago and it seems to be running fine. But it is "way too early" for me to tell anything about "speed".
The above result has now run about 13.4 hours and is 16.7% complete. That extrapolates to about 80 hours. If it actually takes that long, it will be about 3 hours slower than my last v4.38 result on this P4 Prescott. The WUs appear to be similar, but I guess this new one could turn out to be a bigger "Monster" than the first.
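The ~80-hour figure is just elapsed time divided by the fraction done. A minimal sketch of that arithmetic in Python, using only the numbers quoted above:

# Extrapolate total and remaining run time from elapsed time and fraction done.
# The inputs are the figures quoted in the post above, nothing measured here.
elapsed_hours = 13.4
fraction_done = 0.167

estimated_total = elapsed_hours / fraction_done      # about 80.2 hours
remaining_hours = estimated_total - elapsed_hours    # about 66.8 hours to go
print(f"estimated total: {estimated_total:.1f} h, remaining: {remaining_hours:.1f} h")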
http://einsteinathome.org/task/86772907
http://einsteinathome.org/task/86771835
http://einsteinathome.org/task/86762576
http://einsteinathome.org/task/86751445
These 4 monster WUs were completed with 4.40, with anywhere from 10-70% of each WU crunched with the official 4.38 app.
2 WUs have validated while the other 2 are waiting on wingmen.
Crunch time appears to be a bit shorter, but I'll need more results to complete to be certain. The first 100% 4.40 WU will be complete in an hour and a half and I'll report back then.
There are 10^11 stars in the galaxy. That used to be a huge number. But it's only a hundred billion. It's less than the national deficit! We used to call them astronomical numbers. Now we should call them economical numbers. - Richard Feynman
http://einsteinathome.org/task/86774658
100% crunched with 4.40, still pending.
Still looks to be a bit faster than 4.38, but slower than 4.33.
There are 10^11 stars in the galaxy. That used to be a huge number. But it's only a hundred billion. It's less than the national deficit! We used to call them astronomical numbers. Now we should call them economical numbers. - Richard Feynman
I have "pure results" from a Core 2 Duo E6600
and from a Core 2 Quad Q6600
The Quad result appears to come right in sequence from ones for which the 4.33 CPU time was about 87,000 seconds and the 4.38 time about 98,900. So the 4.40 time of 97,688 seconds is a step back toward faster, but a pretty small step.
The Duo result is slightly less closely related to preceding work, but the credit claim is nearly identical. Assuming that work is well correlated with credit claim, then we have ones for which the 4.33 CPU time was about 57,000 seconds and the 4.38 time about 65,180. So the 4.40 time of 64,259 seconds is also a small step in the good direction.
I've also reported in a few mixed-application results, with no self-reported trouble. No validations yet, either.
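To put those CPU times in relative terms, here is a small Python sketch (the numbers are simply copied from the figures above, so this is illustrative arithmetic only):

# Relative run-time changes between app versions, using the CPU seconds
# quoted above for the Quad and the Duo hosts.
times = {
    "Quad": {"4.33": 87000, "4.38": 98900, "4.40": 97688},
    "Duo":  {"4.33": 57000, "4.38": 65180, "4.40": 64259},
}
for host, t in times.items():
    faster_than_438 = (t["4.38"] - t["4.40"]) / t["4.38"] * 100
    slower_than_433 = (t["4.40"] - t["4.33"]) / t["4.33"] * 100
    print(f"{host}: 4.40 is about {faster_than_438:.1f}% faster than 4.38, "
          f"but still about {slower_than_433:.1f}% slower than 4.33")

On both hosts that works out to only a one to two percent gain over 4.38, which matches the "pretty small step" reading above.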
RE: I have "pure results"
Nice report!
Mine are running smoothly as well, but none have yet completed.
And I assume these "monster" workunits are all approximately the same size? So we are comparing apples to apples, so to speak, in terms of run times?
RE: And I assume these
In comparing 4.33|4.38|4.40, yes, but not in comparing the Quad to the Duo.
The Duo has had a credit range from 440.01 to 440.58 over the past three weeks, while the Quad has chewed on big ones since the day I turned it on in late July, covering a range from 655.53 to 655.56 in recent weeks.
That was especially annoying for a while, as the Quad apparently represented more than half the computing power of hosts getting assigned results from that series, so for days at a time it would get the _0 result for each sequential WU. You can imagine how long I waited for validation. As we near the end of S5R2, more and more hosts are getting assigned work from this series. At least half the time, I don't get the _0 result now.
My WU has been processing for about 30 hrs, the last 8 hrs with 4.40. I'd have thought that 8 hrs was time enough for the time estimate to have settled down, but it seems to be going two steps forward and one back. The seconds part of the "to completion" column on the Tasks tab typically goes something like 24, 25, 26, 18, 19, 20, 13, 14, 15, 10, 11, ...
Weird.
Mike
RE: My wu has been
This is quite normal. The GUI is updated once per second, but the app typically only reports progress every 2-4 seconds, depending on your PC's speed. When a GUI update happens without new progress being detected, the time to completion goes up; after the next progress "click" it goes back down.
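A toy model of that behaviour, not the manager's actual formula, just illustrative Python with invented numbers:

# The manager redraws "to completion" once per second, but the app only
# reports a new fraction done every few seconds, so the estimate drifts up
# between progress "clicks" and drops back when one arrives.
elapsed = 60000.0        # CPU seconds used so far (invented)
fraction = 0.5           # last fraction done reported by the app (invented)
report_every = 3         # assume the app reports progress every 3 seconds
true_length = 120000.0   # assumed total length of the task in CPU seconds

for second in range(1, 10):
    elapsed += 1
    if second % report_every == 0:
        fraction = elapsed / true_length          # a progress "click"
    remaining = elapsed * (1 - fraction) / fraction
    print(f"after {second}s: to completion ~ {remaining:,.0f} s")

The printed estimate creeps up on each redraw and then drops when the next click lands, which gives the same up-up-down pattern described above.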
CU
H-BE
RE: This is quite normal.
Thanks for the info, Bikeman. I hadn't noticed that before.
Mike