Standing up BOINC end-user "Farms"

Tom M
Joined: 2 Feb 06
Posts: 5644
Credit: 7725679502
RAC: 2350168

Locations Unknown wrote:

Thanks... I'm primarily crunching on E@H right now because discovering a pulsar and getting a framed discovery certificate is pretty cool.

How on earth are you driving 12 GPUs with a 4-core CPU?

Tom M

A Proud member of the O.F.A.  (Old Farts Association).  Be well, do good work, and keep in touch.® (Garrison Keillor)

Ian&Steve C.
Joined: 19 Jan 20
Posts: 3709
Credit: 34640176651
RAC: 42327359

Because mining didn’t care about the CPU. 
 

He'll get better performance on BOINC with a better CPU, though.

_________________________________________________________________________

Locations Unknown
Joined: 9 Jan 07
Posts: 8
Credit: 1167710397
RAC: 2569

Correct - I was using a special mining motherboard that has 13 PCIe slots. For crypto mining you just need the cheapest CPU money can buy. A year ago, I was running 13 RX 580s on an Intel Celeron 2-core CPU.

I'm in the process of splitting up the 12 GPU rig.

Tom M
Joined: 2 Feb 06
Posts: 5644
Credit: 7725679502
RAC: 2350168

Locations Unknown wrote:

Correct - I was using a special mining motherboard that has 13 PCIe slots. For crypto mining you just need the cheapest CPU money can buy. A year ago, I was running 13 RX 580s on an Intel Celeron 2-core CPU.

I'm in the process of splitting up the 12 GPU rig.

Ah. If that is an LGA 1151 (socket version 2) board, you can get a pretty respectable 8c/16t CPU that should come close to pushing all 12 GPUs at full speed.

Tom M

 

A Proud member of the O.F.A.  (Old Farts Association).  Be well, do good work, and keep in touch.® (Garrison Keillor)

Locations Unknown
Joined: 9 Jan 07
Posts: 8
Credit: 1167710397
RAC: 2569

I'm looking to upgrade to Intel 9th-gen Asus boards (6 PCIe slots). I can get the boards on eBay for about $60 used, and I have access to used CPUs at a pretty cheap price.

The theory of running 13 GPUs on one board sounds great, but the reality was constant headaches trying to keep everything running 24/7. I also had to get creative with my system power management. With the 13 RX 6600 XTs (13 6+8-pin) and 13 PCIe risers (PCIe 6-pin or Molex), I didn't have enough options to power everything without burning my house down. I ended up with 3 Corsair PSUs daisy-chained together (two 1200 W PSUs for the GPUs and a 750 W PSU for the risers). It was fun problem-solving all the headaches, but just too much effort to keep stable.

I picked up a bunch of 12-GPU double-deck mining frames on Amazon for cheap and am building out 6-GPU rigs and stacking them. The positive of mining for all these years is I literally have a PC repair shop in my basement - so many spare parts and components, I rarely need to buy anything to finish a build-out.
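The power headroom problem described above can be sketched with rough arithmetic. All of the per-card and per-riser wattages below are illustrative assumptions, not measured values; the PSU ratings are the ones mentioned in the post.

```python
# Rough PSU budget sketch for a 13-GPU mining-style rig.
# GPU_WATTS and RISER_WATTS are assumptions for illustration, not measurements.

GPU_COUNT = 13
GPU_WATTS = 160        # assumed board power per RX 6600 XT-class card
RISER_WATTS = 40       # assumed slot-power draw routed through each riser
PSU_DERATE = 0.80      # common rule of thumb: load a PSU to ~80% of its rating

def usable(psu_rating_watts: float) -> float:
    """Watts a PSU can comfortably supply at the derated load."""
    return psu_rating_watts * PSU_DERATE

gpu_load = GPU_COUNT * GPU_WATTS          # total GPU draw
riser_load = GPU_COUNT * RISER_WATTS      # total riser (slot power) draw

# The setup described above: two 1200 W PSUs for GPUs, one 750 W for risers.
gpu_capacity = usable(1200) + usable(1200)
riser_capacity = usable(750)

print(f"GPU load {gpu_load} W vs derated capacity {gpu_capacity:.0f} W")
print(f"Riser load {riser_load} W vs derated capacity {riser_capacity:.0f} W")
```

With these assumed numbers the GPU side alone comes out near or above the comfortable capacity of two 1200 W units, which is consistent with the "not enough options to power everything" experience.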

Tom M
Joined: 2 Feb 06
Posts: 5644
Credit: 7725679502
RAC: 2350168

Hear you on stability issues!

A Proud member of the O.F.A.  (Old Farts Association).  Be well, do good work, and keep in touch.® (Garrison Keillor)

Tigers_Dave
Joined: 25 Mar 09
Posts: 228
Credit: 9248626727
RAC: 0

Locations Unknown wrote:

13 RX 6600 XTs (13 6+8-pin)

What's your opinion of the RX 6600 XT?  My RX 6600 XTs (single 8-pin power connector) are less productive than my RX Vega 56s, RX Vega 64s, and my RX 5700 XTs.  Of course, my setup (Intel Macs + eGPUs) is considerably different from yours.  Moreover, my RX 6600 XTs do draw less power than those other cards.

"I was born in a small town, and I live in a small town." - John Mellencamp

mikey
Joined: 22 Jan 05
Posts: 11944
Credit: 1832538051
RAC: 218109

Tigers_Dave wrote:

Locations Unknown wrote:

13 RX 6600 XTs (13 6+8-pin)

What's your opinion of the RX 6600 XT?  My RX 6600 XTs (single 8-pin power connector) are less productive than my RX Vega 56s, RX Vega 64s, and my RX 5700 XTs.  Of course, my setup (Intel Macs + eGPUs) is considerably different from yours.  Moreover, my RX 6600 XTs do draw less power than those other cards. 

I have one AMD RX 6600, NOT an XT model though, and it runs tasks at:

Completed and validated: run time 498 s, CPU time 175 s, credit 3,465

Gamma-ray pulsar binary search #1 on GPUs v1.28 () windows_x86_64

It is in an i7-4770 (quad-core CPU with HT) that is using 4 CPU cores for a PrimeGrid task and 2 other CPU cores for another project. It runs in the mid-70s °C for the average task; I only run one task at a time.

Tigers_Dave
Joined: 25 Mar 09
Posts: 228
Credit: 9248626727
RAC: 0

mikey wrote:

Tigers_Dave wrote:

Locations Unknown wrote:

13 RX 6600 XTs (13 6+8-pin)

What's your opinion of the RX 6600 XT?  My RX 6600 XTs (single 8-pin power connector) are less productive than my RX Vega 56s, RX Vega 64s, and my RX 5700 XTs.  Of course, my setup (Intel Macs + eGPUs) is considerably different from yours.  Moreover, my RX 6600 XTs do draw less power than those other cards. 

I have one AMD RX 6600, NOT an XT model though, and it runs tasks at:

Completed and validated: run time 498 s, CPU time 175 s, credit 3,465

Gamma-ray pulsar binary search #1 on GPUs v1.28 () windows_x86_64

It is in an i7-4770 (quad-core CPU with HT) that is using 4 CPU cores for a PrimeGrid task and 2 other CPU cores for another project. It runs in the mid-70s °C for the average task; I only run one task at a time.

 

Mikey, thank you so much for sharing, as your data are very surprising (to me at least). My RX 6600 XTs have an average run time of 1028 seconds and an average CPU time of 181 seconds. My RX 6600s have an average run time of 1139 seconds and an average CPU time of 164 seconds. But I run two tasks at a time, suggesting that my RX 6600 XTs and RX 6600s are only a little bit less productive than your RX 6600. This surprises me, as I had thought that the TB3 connection to my eGPUs greatly throttled their E@H productivity, particularly in setups where I had three or four eGPUs connected to a Mac. Your data suggest that the loss of productivity is less than 10%. Hmmm. Again, thank you for sharing the data.
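The comparison above can be made concrete as tasks completed per hour, using the run times quoted in these two posts and assuming the two concurrent tasks overlap fully (a simplifying assumption; real overlap is rarely perfect):

```python
# Effective throughput: tasks completed per hour on one GPU.
# run_time_s is wall-clock seconds per task; concurrent is how many tasks
# run side by side (assumed here to overlap fully).

def tasks_per_hour(run_time_s: float, concurrent: int = 1) -> float:
    return concurrent * 3600.0 / run_time_s

rx6600_single = tasks_per_hour(498)        # RX 6600, one task at a time
rx6600xt_dual = tasks_per_hour(1028, 2)    # eGPU RX 6600 XT, two at a time
rx6600_dual = tasks_per_hour(1139, 2)      # eGPU RX 6600, two at a time

# Deficit of each eGPU setup relative to the directly attached RX 6600.
xt_deficit = 1 - rx6600xt_dual / rx6600_single
nonxt_deficit = 1 - rx6600_dual / rx6600_single

print(f"{rx6600_single:.2f} vs {rx6600xt_dual:.2f} vs {rx6600_dual:.2f} tasks/hr")
print(f"XT deficit ~{xt_deficit:.0%}, non-XT deficit ~{nonxt_deficit:.0%}")
```

Under this assumption the eGPU RX 6600 XT works out only a few percent behind the directly attached RX 6600, while the eGPU non-XT card trails by roughly an eighth.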

"I was born in a small town, and I live in a small town." - John Mellencamp

Ian&Steve C.
Joined: 19 Jan 20
Posts: 3709
Credit: 34640176651
RAC: 42327359

PCIe bandwidth does not matter to Einstein, up to a point. A TB3 connection is more than enough; I have tested this extensively.

When I first started crunching Einstein a few years ago, some people claimed that you'd see a slowdown with narrower PCIe links or lower bandwidth, but they didn't have any data to back it up, just word of mouth and anecdotes, and few people even know HOW to test PCIe bandwidth use.

So I tested it myself and never found more than a few percent of the PCIe bandwidth in use. That's not to say you'll never hit a limit - something crazy small like PCIe 1.0 x1 could be a bottleneck - but all reasonable modern PCIe Gen 2 and Gen 3 links are certainly fine. If PCIe was ever a limit here at Einstein, it must have been with older apps not in use today, or people didn't account for another variable in their own testing.

Your TB3 connection (PCIe Gen 3 x4) is more than enough to satisfy the bandwidth requirements of Einstein's current apps. It's possible that there's some latency impact with TB3, but that's just speculation on my part. That 10% could easily come from different power limits (and clocks) of different card models, or just the nature of the macOS app (v1.17) vs the Windows app (newer v1.28), or even improvements in the drivers.
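For reference, the theoretical one-direction bandwidth of a PCIe link follows from the per-lane transfer rate and the line encoding (Gen 1/2 use 8b/10b, Gen 3 uses 128b/130b). This is a sketch of the raw numbers; real-world throughput is somewhat lower due to protocol overhead:

```python
# Theoretical one-direction PCIe link bandwidth in GB/s.

GT_PER_LANE = {1: 2.5, 2: 5.0, 3: 8.0}       # transfer rate, GT/s per lane
ENCODING = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130}  # usable fraction after encoding

def link_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Raw one-direction bandwidth of a PCIe link, in GB/s."""
    return GT_PER_LANE[gen] * ENCODING[gen] * lanes / 8  # bits -> bytes

print(f"PCIe 1.0 x1:       {link_bandwidth_gbs(1, 1):.2f} GB/s")   # 0.25 GB/s
print(f"PCIe 3.0 x4 (TB3): {link_bandwidth_gbs(3, 4):.2f} GB/s")   # ~3.94 GB/s
print(f"PCIe 3.0 x16:      {link_bandwidth_gbs(3, 16):.2f} GB/s")  # ~15.75 GB/s
```

So a TB3 link carries roughly 16x the bandwidth of the PCIe 1.0 x1 worst case mentioned above, which is why an app that uses only a few percent of a full-width link never comes close to saturating it.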

_________________________________________________________________________
