13-year-old ATI HD4670... Worth crunching...?

Elphidieus
Joined: 20 Feb 05
Posts: 245
Credit: 20603702
RAC: 0
Topic 225646

Hey guys... I've dug out a dinosaur PC with an ATI HD4670... Do you guys think it is still able to crunch WUs...?

San-Fernando-Valley
Joined: 16 Mar 16
Posts: 260
Credit: 6910151637
RAC: 21910420

This might

Matt White
Joined: 9 Jul 19
Posts: 120
Credit: 280798376
RAC: 0

I'm running some old GPU cards which still work nicely on the Gamma Ray tasks. Granted, they are not as fast as a modern card, but some tasks crunched is better than zero tasks crunched. My NVIDIA card is too old to crunch the newer gravitational wave tasks as it only has 1 GB of memory. If you want to crunch those, you need a card with more memory; as to how much, I want to say at least 2 GB, but it might be higher. Last time I tried to crunch GW tasks, the NVIDIA card (GT-1030) failed, but the AMD (RX-560) still worked.
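If you want to see what a given card actually reports before throwing tasks at it, a quick OpenCL query will tell you. Here is a rough sketch using the third-party pyopencl package; the 2 GB cutoff is only my guess from above, not an official project number.

# Report how much global memory each OpenCL GPU device advertises.
# Needs the third-party pyopencl package (pip install pyopencl) plus a
# working OpenCL driver for the card.  The 2 GB floor below is only a
# rough guess for the GW tasks, not an official project figure.
import pyopencl as cl

GW_MIN_BYTES = 2 * 1024**3  # assumed ~2 GB minimum for GW tasks

for platform in cl.get_platforms():
    for device in platform.get_devices():
        if not (device.type & cl.device_type.GPU):
            continue
        mem = device.global_mem_size  # total global memory in bytes
        verdict = "probably enough" if mem >= GW_MIN_BYTES else "too small for GW"
        print(f"{device.name.strip()}: {mem / 1024**2:.0f} MiB ({verdict})")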

The information Gary provided in the previous link is helpful. There are also web sites which list GPU cards by crunching power. Those sites might also provide some insight.

Clear skies,
Matt
archae86
Joined: 6 Dec 05
Posts: 3145
Credit: 7023014931
RAC: 1833090

In most configurations, electric power consumption is a material cost of contributing work to Einstein. This is true even if the system in question would still be running all the time were Einstein work not being done (as is true for my three), since the GPU, CPU, and memory all burn more power than they would otherwise.

It is yet more true if the system would otherwise be turned off, or not even exist.

It is especially true if the GPU is of a very old generation, as those produce far less useful Einstein output per watt-hour consumed than do the modern ones.

The details vary a lot by location and person. For example, some of us don't see a bill for our power consumption, and think of it as free.

Still, I think the main "is it worth it" objection to using very old GPUs for Einstein is severe power inefficiency. Investing the money you save by not running the old configuration for a while into modern hardware (especially non-bleeding-edge models bought used) could often give higher Einstein output for less money and less energy consumption in the long run.
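To put rough numbers on that argument, here is a back-of-the-envelope sketch. Every figure in it (wattages, task times, electricity price) is an assumption made up for illustration, so substitute your own measurements.

# Back-of-the-envelope cost comparison of an old vs. a modern GPU.
# Every number here is an assumption for illustration only -- plug in your
# own card's measured wattage, task times and local electricity price.
OLD_WATTS, OLD_SECS_PER_TASK = 120, 7200   # assumed: old card, ~2 h per task
NEW_WATTS, NEW_SECS_PER_TASK = 180, 600    # assumed: modern card, ~10 min per task
PRICE_PER_KWH = 0.15                       # assumed electricity price in $/kWh

def cost_per_task(watts, seconds):
    """Electricity cost of completing one task at the assumed price."""
    kwh = watts * seconds / 3600 / 1000
    return kwh * PRICE_PER_KWH

old_cost = cost_per_task(OLD_WATTS, OLD_SECS_PER_TASK)
new_cost = cost_per_task(NEW_WATTS, NEW_SECS_PER_TASK)
print(f"Old card:    ${old_cost:.4f} per task")
print(f"Modern card: ${new_cost:.4f} per task")
print(f"The old card burns {old_cost / new_cost:.0f}x the electricity per task")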

Tom M
Joined: 2 Feb 06
Posts: 5585
Credit: 7672759570
RAC: 1735029

https://www.techpowerup.com/gpu-specs/radeon-hd-4670.c234

That is a really small amount of video RAM.

Maybe running the CPU would be worth it?

Tom M

A Proud member of the O.F.A.  (Old Farts Association).  Be well, do good work, and keep in touch.® (Garrison Keillor)

Wedge009
Joined: 5 Mar 05
Posts: 117
Credit: 15648580446
RAC: 7420149

I still have an HD 4670, albeit the AGP version (one of the last GPUs released for AGP). That aside, it uses the TeraScale architecture, which predates even GCN. Even assuming you can still obtain drivers with OpenCL support for it (I only recall using it on Windows XP), don't expect much: GCN was where GPGPU on Radeon GPUs really took off. TeraScale was much more focused on graphics than GPGPU, which I think is part of why those cards didn't consume nearly as much energy as their contemporary counterparts from Nvidia (and also why today's RDNA-based GPUs generally consume less power than their GCN predecessors, with correspondingly less emphasis on GPGPU).

Which is not to say that it isn't possible to run OpenCL applications on TeraScale-based GPUs; I distinctly remember running AMD's Cayman GPUs (HD 6000 series) with BOINC. But I also remember the substantial performance gains when migrating to GCN-based GPUs (Tahiti and later).

As for the question "is it worth it?", as others have mentioned, there's a power consumption consideration involved. I haven't used my HD 4670 in years, and I suspect it wouldn't be worth the effort to dust it off just for GPGPU.
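If anyone wants to check whether their current driver stack even exposes such a card to OpenCL, enumerating the platforms is a quick test. A minimal sketch with the third-party pyopencl package follows (the clinfo command-line tool reports the same information).

# List every OpenCL platform and device the installed drivers expose, along
# with the driver and OpenCL versions they report.  If a TeraScale card does
# not appear here at all, no OpenCL-based BOINC application can use it.
# Needs the third-party pyopencl package; clinfo shows the same data.
import pyopencl as cl

for platform in cl.get_platforms():
    print(f"Platform: {platform.name.strip()} -- {platform.version.strip()}")
    for device in platform.get_devices():
        print(f"  Device: {device.name.strip()}")
        print(f"    Driver version: {device.driver_version}")
        print(f"    OpenCL version: {device.version}")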

Soli Deo Gloria

mikey
Joined: 22 Jan 05
Posts: 11888
Credit: 1828035366
RAC: 207716

Tom M wrote:

https://www.techpowerup.com/gpu-specs/radeon-hd-4670.c234

That is a really small amount of video RAM.

Maybe running the CPU would be worth it?

Tom M

With only 512 MB of onboard RAM, that card isn't even worth trying at most projects: most BOINC projects have set a 1 GB memory 'floor'.

Wedge009
Joined: 5 Mar 05
Posts: 117
Credit: 15648580446
RAC: 7420149

For what it's worth, I found iGPUs with at least 512 MiB of RAM allocated can still receive work, including from E@H.

Soli Deo Gloria
