Selecting a GPU for Einstein@Home

belczyk
Joined: 3 Mar 17
Posts: 3
Credit: 57,046,390
RAC: 0
Topic 210598

I'm considering purchasing a GPU dedicated to Einstein@Home computing.

Is there a specific thing in the GPU specs that makes a card more suitable for Einstein@Home (or for BOINC computing in general)?

My gaming card is more than I need, so I'd like to buy a second GPU dedicated to computing.

Looking at crypto-mining rankings, I can see there is a huge difference depending on whether you rank cards by MH/s, gaming performance, or workstation performance.

I wonder if there is the same kind of difference between gaming/workstation and BOINC computing. Unfortunately, most of the rankings are for gaming or workstation applications.

Filipe
Joined: 10 Mar 05
Posts: 186
Credit: 402,453,261
RAC: 358,905

For a GPU dedicated to Einstein@Home, memory bandwidth is the main spec to look at.

AMD Radeon GPUs are generally better at Einstein than those from NVIDIA.

My old R9 280 does about 500,000 RAC a day.

If you can afford it, the way to go is the new AMD Vega 64, which manages about 1,000,000 RAC a day.

You can read the thread dedicated to the Radeon Vega to learn more.

belczyk
Joined: 3 Mar 17
Posts: 3
Credit: 57,046,390
RAC: 0

Wow, it's good I asked! My GTX 1060 gives me only 236K/day.

I'll read more, but it seems like the Vega 64 is a no-brainer. We should see non-reference Vegas this month, so maybe there will be something nice for crunchers.

archae86
Joined: 6 Dec 05
Posts: 3,157
Credit: 7,213,404,931
RAC: 968,746

When the Pascal generation was new, there was a pretty wide power consumption gap between the current Nvidia offerings and the current AMD offerings, and it was in favor of Nvidia.

Depending on your view of power costs, and on just what configuration you run, power can be a material component of the overall cost of supporting Einstein.

I suspect the new-generation AMD chips are much more power-competitive, but I'd still advise that any consideration of the "best" cards for Einstein support should take into account the lifetime cost of added power consumption, along with the more obvious costs.

Doing this properly is a bit tricky, as manufacturer numbers are aimed more at pushing people into providing adequate power supply units and system ventilation than at actually predicting consumption in a particular application.  Even the extremely detailed power reporting you can see from applications such as HWiNFO depends on reporting from the card, and in comparing similar reports from different cards, varying portions of the overall consumption may be included, ruining the comparison.  Lastly, the card can't know, and never reports, the added power lost as heat in the PSU, since PSUs always run at less than 100% conversion efficiency.

If you are lucky, someone with a power meter running your application of interest will have taken properly controlled measurements from which you can get a pretty good idea, but short of that it is pretty murky.

Jim1348
Joined: 19 Jan 06
Posts: 463
Credit: 257,957,147
RAC: 0

archae86 wrote:
If you are lucky, someone with a power meter running your application of interest will have taken properly controlled measurements from which you can get a pretty good idea, but short of that it is pretty murky.

I can take a stab at it for my GTX 1060 running under Windows 7 64-bit with the 373.06 driver (slightly faster than the later ones, I have found).  The card is not overclocked beyond the standard factory OC, running at 1923 MHz.  It is supported by an i7-4771 with Einstein CPU work on four cores and the other four cores free to support the GPU and for desktop use.

For the "FGRPopencl1K-nvidia" work units, GPU-Z shows that it averages about 67% of TDP, or 80 watts.  But, as is well known, that measures only a portion of the card, and the whole thing is probably around 10 watts more.  A more direct measure is the power meter on my UPS, which averages around 226 watts when the card is running and 129 watts when it is idle, or about 97 watts for the operation of the card.  That running number jumps around a lot, though, so it is only an estimate.  Also, the power supply is 90% efficient, so that says about 87 watts at the card.  Neither of those numbers accounts for the static power if you pull the card out entirely, which would be around 10 watts.

For that approximately 90 watts of dynamic power to the card on the Einstein GPU work, each work unit takes about 1060 seconds, or about 282,000 PPD.  That works out to 3,138 PPD/watt.
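
If you want to redo this arithmetic with your own readings, here is a minimal sketch of the calculation in Python, with my numbers plugged in.  Note the per-WU credit below is only inferred from my PPD and runtime, not an official project figure, and the 90% PSU efficiency is just my assumption from above.

# Sketch of the power/PPD arithmetic above; plug in your own readings.
ups_running_w  = 226.0   # wall power with the GPU crunching (UPS meter)
ups_idle_w     = 129.0   # wall power with the GPU idle
psu_efficiency = 0.90    # assumed PSU conversion efficiency

# Dynamic power the card itself draws, after PSU losses: ~87 W
card_power_w = (ups_running_w - ups_idle_w) * psu_efficiency

wu_seconds = 1060        # runtime of one FGRPopencl1K-nvidia work unit
ppd        = 282_000     # observed points per day

# Per-WU credit implied by those two numbers: ~3460 (inferred, not official)
credit_per_wu = ppd * wu_seconds / 86_400

# Using ~90 W of dynamic power, as above: ~3130
ppd_per_watt = ppd / 90

print(f"card power    ~ {card_power_w:.0f} W")
print(f"credit per WU ~ {credit_per_wu:.0f}")
print(f"PPD per watt  ~ {ppd_per_watt:.0f}")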

So you can compare on that basis if you wish.

belczyk
Joined: 3 Mar 17
Posts: 3
Credit: 57,046,390
RAC: 0

Jim1348 wrote:

It is supported by an i7-4771 with Einstein CPU work on four cores and the other four cores free to support the GPU and for desktop use.

On an MSI GTX 1060 GAMING+ (core clock is about 1930 MHz):

From my tests, more than 0.1-0.2 of a CPU core per GPU WU is not giving you any benefit.

So four cores to support the GPU is overkill.

I also run 4 WUs on one GPU.

This reduced the effective time per WU from about 1100 s to 880 s (a single unit now takes about 3500 s, but I run four in parallel, so in that time I get 4 units done).

Thanks to this tweak I increased PPD from 260,000 to about 340,000.
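
As a quick sanity check on those numbers, here is a small sketch in the same spirit as the one in Jim1348's post; the only inputs are my measured runtimes and PPD.

# Effective throughput gain from running 4 WUs in parallel on one GPU
single_wu_time = 1100    # seconds for one WU run alone
wall_time_4x   = 3500    # seconds for 4 WUs run together
effective_time = wall_time_4x / 4          # ~875 s per WU

speedup    = single_wu_time / effective_time   # ~1.26x
ppd_before = 260_000
ppd_after  = ppd_before * speedup              # ~327,000, close to the ~340,000 I see

print(f"speedup ~ {speedup:.2f}x, projected PPD ~ {ppd_after:,.0f}")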

My app_config looks like this (not sure about the app name):

<app_config>
   <app>
      <name>hsgamma_FGRPB1</name>
      <gpu_versions>
         <gpu_usage>.25</gpu_usage>
         <cpu_usage>.125</cpu_usage>
      </gpu_versions>
   </app>
</app_config>
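
In case it helps anyone trying this: the file is named app_config.xml and goes in the Einstein@Home project folder inside the BOINC data directory, and the client picks it up after Options -> Read config files in the BOINC Manager (or a client restart).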

Jim1348
Jim1348
Joined: 19 Jan 06
Posts: 463
Credit: 257,957,147
RAC: 0

sbelczyk wrote:

From my tests, more than 0.1-0.2 of a CPU core per GPU WU is not giving you any benefit.

So four cores to support the GPU is overkill.

Yes, but I didn't say I was reserving the 4 cores for that reason.  I said they were available, to show that the GPU was not limited by CPU power.  And I usually prefer not to run more than one WU per GPU.
