Consider a GPU, though, for anything we can use one for.
I'm doing about 20k RAC / 65 watts (TDP, ignoring the PSU and the CPU driving it). That's about 300 RAC/watt. The RAC/watt/core is pretty bad though! (384 cores on mine).
Well there you go! :-)
I was thinking CPU cores, but yes GPU compute units are yet another benchmark. That's the hazard of unwary apples and oranges. Mind you, did you pay ~ $200 AUD per shader ? :-)
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
About $110 USD for the whole thing, though I did need a new PSU.
It is a bit apples and oranges, but whenever we can run the same tasks on the GPU as on the CPU, like so many tasks here, we should be comparing them. From a project standpoint, I hope anything that can be run on the GPU will be, for both its speed and its energy efficiency.
I do wonder how much CPU wattage you need to run the GPU. One free core would be about another 25 W for me (95 W / 4 cores).
@Stranger7777: You're right. My curiosity was just to see the energy efficiency. It's clear that it's not worth switching from a PC to a tablet/smartphone for crunching.
Maybe you can get closer to "full efficiency" (money, RAC, power efficiency) with a cluster of mini PCs based on ARM. Not sure it will be worth it for crunchers like us, but these ARM processors seem to be really efficient, so much so that researchers are considering them for the next generation of supercomputers (the MontBlanc project).
@Mike & Robert: So Robert's i7 is more efficient than the Galaxy Note II?
Though the energy consumption of the Galaxy was an estimate...
(Wouldn't it be clearer to say "daily credits/watt/task" instead of using "core"?)
@Chris: 20k RAC / 65 watts?! Impressive! But TDP is not the actual consumption, right?
My poor smartphone is being made a fool of. :(
;)
EDIT: I'm getting a headache from all these numbers! :)
Wait a moment...
Mike used RAC for his Galaxy Note II calculation and he obtained 10 RAC/watt/core.
Apparently less efficient than Robert's i7 (17 daily credit/watt/core).
But if we use the same formula that Robert used (total daily credits = (86,400 / 2829) * 4 * 62.5 = 7635 credits / day), taking 25,500 s as the CPU time of the Note II, then the result differs.
Intel i7:
(86,400 / 2829) * 4 * 62.5 = 7635 credits / day
7635 / 110 watts = 69 total daily credits / watt
7635 / 110 watts / 4 cores = ~17 daily credits / watt / core
Note II:
(86,400 / 25,500) * 4 * 62.5 = 847 credits / day
847 / 3 watts = 282 total daily credits / watt
847 / 3 watts / 4 cores = ~70 daily credits / watt / core
And my Galaxy Ace II would be 26 daily credits / watt / core! Yeah!! :)
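To keep the arithmetic in one place, here is a quick Python sketch of the calculation above. It assumes, as in this thread, 62.5 credits per task and four cores crunching, with the quoted CPU times and wattages; nothing here comes from the BOINC client itself.

# Sketch of the daily-credits-per-watt arithmetic above (assumed figures:
# 62.5 credits per task, 4 cores, the CPU times and wattages quoted here).
def daily_credit_rates(cpu_time_s, cores, watts, credits_per_task=62.5):
    daily_credits = (86_400 / cpu_time_s) * cores * credits_per_task
    return daily_credits, daily_credits / watts, daily_credits / watts / cores

for name, cpu_time_s, watts in [("Intel i7", 2829, 110), ("Galaxy Note II", 25_500, 3)]:
    total, per_watt, per_core = daily_credit_rates(cpu_time_s, cores=4, watts=watts)
    print(f"{name}: {total:.0f} credits/day, {per_watt:.1f}/watt, {per_core:.1f}/watt/core")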
You're right that the Androids are more energy efficient for the given credit outcomes.
But beware of using 'daily credits' when there is a significant difference between run time and CPU time (100,349.50 s vs 25,548.05 s, for example). Your relative comparison is still valid, but as a new user you may not be aware of this subtlety: the run time is the clock on the wall, whereas the CPU time is the accumulated time slices used by that task on a given core. So for the task example above, the core is completing a task every ~100,000 / 86,400 = 1.16 days, and not every ~25,500 / 86,400 = 0.30 days .... :-)
[ you may note that 100,000 looks suspiciously like 4 * 25,000 so I'd interpret that as having only one NEON coprocessor to share amongst 4 actual CPU cores ]
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
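A minimal sketch of that run time vs CPU time point, using the task figures quoted above; it is just the two divisions spelled out, not anything pulled from BOINC.

# Throughput per core should be judged on wall-clock run time, not CPU time.
run_time_s = 100_349.50   # wall-clock ("clock on the wall") time per task
cpu_time_s = 25_548.05    # accumulated CPU time per task

print(f"days per task by run time: {run_time_s / 86_400:.2f}")  # ~1.16
print(f"days per task by CPU time: {cpu_time_s / 86_400:.2f}")  # ~0.30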
Understood! Thanks.
It ended up that things are more complicated than I thought at the beginning.
I'm giving up on the calculations. :)
Let us assume for the moment that all the essential items for processing a work unit are the same per RAC. A desktop does more, like keeping the hard drive spinning and checking a dozen things (is something plugged into a USB port?) that phones do not do. Therefore it comes down to a comparison of chip power per instruction, but only if CUDA cards are not considered. Ignoring CUDA, the last I read (and it is far from my specialty), power per instruction is proportional to the die line width used on the chip.
So no matter what, the desktop should always be the more energy-intensive option, because of the other things it does, not because of the number crunching.
So the next question is: does a CUDA card reduce the time enough to make up for the extra power consumption? A $30 card or a $400 card? A solid-state drive instead of a spinning one? And does the cellphone chip have floating point? That gets into the "too hard" category very quickly.
After saying that I was giving up on the calculations, I bought a watt meter! :)
The manual says that the measuring range is from approximately 0.2 W to 3600 W.
The watt meter does not show decimal places (so the minimum shown is 1 W).
On my Galaxy Ace 2 charger it is written that the output is 5.0 V @ 0.7 A (that's 3.5 W, right?).
During charging (battery at 60%) the watt meter shows 4 W of consumption (not 3.5 W, due to the rounding to whole watts, I believe). Around 98% charge the consumption drops to 2 W.
Now for the BOINC test. When the battery was at 100% charge I started BOINC with one core running (I don't want to stress the smartphone too much). I found that the consumption wasn't constant, but was switching between 0 W and 2 W approximately every 40 minutes. I believe that even though Android tells me the phone is fully charged, the battery charge is actually dropping a bit and then, below a certain threshold, the charger tops it up to 100% again.
Luckily the watt meter also shows the total consumption for the monitoring period.
I left the smartphone crunching overnight for 10 hours and the total consumption was 9 Wh.
So I think it is fair to say that my Galaxy Ace 2 consumes 0.9 W crunching Einstein@Home with one core. I should do a longer test to get a more precise number, somewhere between 0.85 W and 0.95 W.
I don't want to get into the credit/watt stuff again; I'm just reporting how much it consumes. Maybe I will give it a try with two cores running in the next few days.
(BTW, applying the same formula, the result would be 103 daily credits/watt using just one core.)
Criticisms? Suggestions? Ready to listen.
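For completeness, a similar sketch of the watt-meter arithmetic: average draw from the accumulated watt-hours, then daily credits per watt for a single core. The per-task CPU time below is only a placeholder (the Ace 2's CPU time is not stated in this thread); it is back-solved so the output lands near the quoted ~103 daily credits/watt.

# Average power from the overnight measurement, then credits/day/watt.
energy_wh = 9.0
hours = 10.0
avg_watts = energy_wh / hours   # 0.9 W

def daily_credits_per_watt(cpu_time_s, watts, cores=1, credits_per_task=62.5):
    return (86_400 / cpu_time_s) * cores * credits_per_task / watts

# Placeholder per-task CPU time (~58,000 s, not from the thread): at 0.9 W
# this gives roughly the ~103 daily credits/watt mentioned above.
print(f"{daily_credits_per_watt(58_000, avg_watts):.0f} daily credits/watt")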
Quote:
After saying that I was giving up on the calculations, I bought a watt meter! :)
That's the ticket! Measurement resolves questions. :-)
Quote:
The manual says that the measuring range is from approximately 0.2 W to 3600 W.
The watt meter does not show decimal places (so the minimum shown is 1 W).
On my Galaxy Ace 2 charger it is written that the output is 5.0 V @ 0.7 A (that's 3.5 W, right?).
Yup, Power = Volts times Amps. But if they're quoting 0.2 Watts at the low end, it might be that there is a 'range' option for the device, i.e. check the manual (always RTFM!) and see if there is a low/high range selection mechanism.
Quote:
During charging (battery at 60%) the watt meter shows 4 W of consumption (not 3.5 W, due to the rounding to whole watts, I believe). Around 98% charge the consumption drops to 2 W.
Yup, expected behaviour.
Quote:
Now for the BOINC test. When the battery was at 100% charge I started BOINC with one core running (I don't want to stress the smartphone too much). I found that the consumption wasn't constant, but was switching between 0 W and 2 W approximately every 40 minutes. I believe that even though Android tells me the phone is fully charged, the battery charge is actually dropping a bit and then, below a certain threshold, the charger tops it up to 100% again.
Ditto ...
Quote:
Luckily the watt meter also shows the total consumption for the monitoring period.
I left the smartphone crunching overnight for 10 hours and the total consumption was 9 Wh.
So I think it is fair to say that my Galaxy Ace 2 consumes 0.9 W crunching Einstein@Home with one core. I should do a longer test to get a more precise number, somewhere between 0.85 W and 0.95 W.
Quite right, 9 Watt-Hours / 10 Hours = 0.9 Watts
Quote:
I don't want to get into the credit/watt stuff again; I'm just reporting how much it consumes. Maybe I will give it a try with two cores running in the next few days.
(BTW, applying the same formula, the result would be 103 daily credits/watt using just one core.)
I had much the same idea in mind. On the assumption that the rate-limiting part of my Galaxy Note II is the NEON coprocessor (only one to service four cores), I would expect the same CPU time per core (~25,000 s) but less power if only one core is doing it. Meaning that the NEON is always on the go; it's the cores that wait in turn to access it. We'll soon see ...
Cheers, Mike.
I have made this letter longer than usual because I lack the time to make it shorter ...
... and my other CPU is a Ryzen 5950X :-) Blaise Pascal
Quote:
After saying that I was giving up on the calculations, I bought a watt meter! :)
I should get one of those.
Quote:
I had much the same idea in mind. On the assumption that the rate-limiting part of my Galaxy Note II is the NEON coprocessor (only one to service four cores), I would expect the same CPU time per core (~25,000 s) but less power if only one core is doing it. Meaning that the NEON is always on the go; it's the cores that wait in turn to access it. We'll soon see ...
Cheers, Mike.
Interesting to know. I also have a Note II; so far I've resisted putting BOINC on it. My previous phone had a lot of trouble with overheating and this one sometimes gets hot too.
David
Miserable old git
Patiently waiting for the asteroid with my name on it.