AMD 5800X3D: magic cache or pricey disappointment?

Keith Myers
Joined: 11 Feb 11
Posts: 4704
Credit: 17549456229
RAC: 6433340

My teammate just picked one up for $150 on eBay, with only a few USB 2.0 header pins bent or damaged. That won't impact his ability to run 4 GPUs.

It depends on the project whether PCIe bus speed impacts a task. The only project I know of that suffers at x4 is GPUGrid. Everything else is kosher at x4 or x8.
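For a rough sense of what those link widths mean, here is a quick sketch of theoretical PCIe Gen3 bandwidth per link width. The ~985 MB/s per-lane figure is the usual Gen3 ballpark (8 GT/s with 128b/130b encoding), not something quoted in this thread:

```python
# Rough theoretical PCIe Gen3 bandwidth by link width.
# ~985 MB/s usable per lane: 8 GT/s with 128b/130b encoding.
GEN3_MB_PER_LANE = 985

for lanes in (1, 4, 8, 16):
    print(f"x{lanes}: ~{GEN3_MB_PER_LANE * lanes / 1000:.2f} GB/s")
```

So an x4 link still has roughly a quarter of the full x16 bandwidth, which is why most projects run fine on it.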

 

Peter van Kalleveen
Joined: 15 Jan 19
Posts: 45
Credit: 250329645
RAC: 0

Lucky team mate, that's a steal.

I once had the aspiration of building a dream GPU compute rig that could earn its way 50% of the time with crypto mining and spend the other 50% running compute for Einstein@Home, to get a place in the top 20 and do lots of science.

So I started with my old workstation's Threadripper 2950X CPU, to have enough lanes and plenty of CPU power for lots of GPUs, and a very deep 4U server/mining chassis that could fit 8 GPUs in a separate compartment.

But then the trouble started: with only 4 PCIe slots on the mobo, I threw in some bifurcation risers to split them into x4/x4/x4/x4 or x8/x8. Then, running x4 or x8 PCIe ribbon cables, it gets really messy fast.

Long ribbon PCIe extender cables after a bifurcation riser make it difficult to maintain PCIe Gen3 signal integrity, so in the end I just gave up on the idea.

It's a shame, because if all the crypto mining gear weren't PCIe Gen1 x1 it could really help build some amazing compute rigs.

 

Anyway, the final configuration for the moment has become:

5800X3D with 3600 MHz CL16 memory, running only GPU tasks for 3x RTX A4000 with 2 WUs per GPU.

It takes around 11 minutes per O3AS WU while running 6 in parallel.
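As a back-of-the-envelope check, those numbers (3 GPUs x 2 WUs, ~11 minutes each) work out to a daily throughput like this:

```python
# Daily throughput for the setup described above:
# 3 GPUs x 2 WUs each = 6 in parallel, ~11 minutes per WU.
parallel_wus = 6
minutes_per_wu = 11

wus_per_day = parallel_wus * (24 * 60 / minutes_per_wu)
print(f"~{wus_per_day:.0f} WUs/day")  # → ~785 WUs/day
```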

So I'm going to leave it like this for a week or two and see if it can steadily keep crunching out the tasks.

Maybe after that I will try a full AMD build, switching the GPUs for the Radeon Pro VII + 2 W5700 cards.

mikey
Joined: 22 Jan 05
Posts: 11889
Credit: 1828201331
RAC: 202315

Peter van Kalleveen wrote:

Lucky team mate, that's a steal.

[...]

It's a shame, because if all the crypto mining gear weren't PCIe Gen1 x1 it could really help build some amazing compute rigs.

If you had some older ASIC miners, would that offset the price, given the current price of crypto coins? No, they couldn't be run inside your PC case, well, the "Butterfly" ones could, but the question is whether standalone ASIC miners make enough crypto to offset a current GPU?

Ronald McNichol
Joined: 28 Feb 22
Posts: 27
Credit: 99853798
RAC: 0

Keith Myers wrote:

MSI X570 Godlike comes to mind, with 4 full-length PCIe slots spaced 2 card-slot widths apart. It would support 4 GPUs natively on the board without extenders.

 

I would worry about the air intake of each card being blocked by the card below it. That applies to all but the lowest one, which in turn could be blocked by the power supply.

I just have one 6800XT and have made a hole in the side panel to allow the exhaust from the cooler to escape outside rather than being bounced back into the low-pressure area of the intakes. I think my MB allows for 2 discrete GPUs, but with a goodly distance between them, and the power supply below.

 

Keith Myers
Joined: 11 Feb 11
Posts: 4704
Credit: 17549456229
RAC: 6433340

I have populated 4 double-wide cards in mobos before. The best scenario is to use hybrid cards, where the majority of the cooling is provided off the card in locations where it is easy to shed the heat; top and rear are the best.

 

ace_quaker
Joined: 21 May 06
Posts: 6
Credit: 353390997
RAC: 1525630

Back before video cards went apeshit with power use and the defaults came with blower-style coolers, I ran 4x GTX 780s in a single case in an X99 system. Granted, it had good airflow, with quite a few 120 mm fans, more than half of them as intakes.

Not many blower coolers on the consumer RTX line that I know of; you'd have to step up to the workstation RTX Axxxx line.

The A4000 is the sweet spot: 16 GB of ECC RAM. Performance sits around a 3070, but power use is about half. The higher initial price compared to the consumer RTX line will be made up in power savings after ~3 years, assuming you run it 24/7 and pay a decent amount for electricity and air conditioning. Performance per watt is outstanding.
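The ~3-year payback claim is easy to sanity-check. All the figures below (prices, board power, electricity rate) are illustrative assumptions, not numbers from this thread:

```python
# Illustrative payback estimate: A4000 vs a consumer card of similar
# performance. Every figure here is a hypothetical placeholder.
a4000_price, consumer_price = 1000.0, 600.0   # USD, assumed street prices
a4000_watts, consumer_watts = 140.0, 220.0    # assumed typical board power
kwh_price = 0.20                              # USD/kWh incl. cooling, assumed

extra_cost = a4000_price - consumer_price
watts_saved = consumer_watts - a4000_watts
savings_per_year = watts_saved / 1000 * 24 * 365 * kwh_price

print(f"payback in ~{extra_cost / savings_per_year:.1f} years")  # ≈ 2.9 years
```

With those assumed numbers the break-even lands right around the ~3 years mentioned above; a cheaper electricity rate stretches it out, a pricier one shortens it.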

Ian&Steve C.
Joined: 19 Jan 20
Posts: 3681
Credit: 33844345855
RAC: 36714155

The best performance per watt comes from the models with GDDR6X (3070 Ti to 3080 Ti) or HBM/HBM2 (Titan V, Radeon VII). Einstein Gamma-ray is fairly GPU-memory-bound; faster memory performs better. The Ampere cards with GDDR6 perform about on par with the Turing GDDR6 cards.


Peter van Kalleveen
Joined: 15 Jan 19
Posts: 45
Credit: 250329645
RAC: 0

No idea, I have never owned an ASIC miner.

From what I understand, the old second-hand ones often cost more in electricity than they make.

Only the expensive newer models turn a profit.
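The break-even logic behind that is simple: an old ASIC is only worth running if its daily revenue beats its daily electricity cost. The figures below are hypothetical placeholders, just to show the shape of the check:

```python
# Break-even check for an ASIC miner: daily revenue vs electricity cost.
# All numbers passed in below are hypothetical, not real market data.
def asic_daily_profit(hashrate_th, usd_per_th_day, watts, kwh_price):
    revenue = hashrate_th * usd_per_th_day       # USD earned per day
    power_cost = watts / 1000 * 24 * kwh_price   # USD of electricity per day
    return revenue - power_cost

# An older Antminer-class unit at assumed rates easily runs negative:
print(asic_daily_profit(hashrate_th=13, usd_per_th_day=0.05,
                        watts=1300, kwh_price=0.20))  # → negative
```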

That's the big benefit of GPUs: they're general-purpose compute.

Peter van Kalleveen
Joined: 15 Jan 19
Posts: 45
Credit: 250329645
RAC: 0

A small update on the 5800X3D.

The rig of course went through some different incarnations in the meantime.

But eventually I settled on a mini-ITX build with the Radeon Pro VII.

This does 3 WUs per 11-12 minutes, and only has a total system draw of around 280 W measured at the wall. But most notable is that the system has 0 invalid tasks out of around 1500 completed tasks. On previous systems I always got some invalid tasks now and then, but this seems very solid.
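Those wall-power and throughput figures make it easy to estimate the energy cost per work unit (taking ~11.5 minutes as the midpoint of the 11-12 minute range):

```python
# Energy per WU for the mini-ITX Pro VII build described above:
# 3 WUs per ~11.5 minutes at ~280 W measured at the wall.
wall_watts = 280
minutes_per_batch = 11.5
wus_per_batch = 3

wh_per_wu = wall_watts * (minutes_per_batch / 60) / wus_per_batch
wus_per_day = wus_per_batch * 24 * 60 / minutes_per_batch

print(f"~{wh_per_wu:.0f} Wh per WU, ~{wus_per_day:.0f} WUs/day")
# → ~18 Wh per WU, ~376 WUs/day
```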

I don't use ECC DRAM and have not turned on ECC for the HBM on the VII.

It only gets gravitational wave O3 tasks. For stacking up score it's probably better to do gamma-ray, but the gravitational waves are probably more interesting for making discoveries.

mikey
Joined: 22 Jan 05
Posts: 11889
Credit: 1828201331
RAC: 202315

Peter van Kalleveen wrote:

No idea, I have never owned an ASIC miner.

From what I understand, the old second-hand ones often cost more in electricity than they make.

Only the expensive newer models turn a profit.

That's the big benefit of GPUs: they're general-purpose compute.

Thanks for the reply. I have a box full of little ASIC miners I got for the cost of shipping them; they are a mixture of new and used: a couple of Butterfly ones that look like radiator cooling supplies, and a couple of standalone ones, I think they are model #3 ASICs, that take 1200 W PSUs each to power.
